0 votes
in 01 - Bases, concepts et histoire
Hi,

how do I avoid being trapped in a local minimum with the regression?

Thanks

1 Answer

0 votes
by Vétéran du GPU 🐋 (48.7k points)
selected by
 
Best answer
Linear regression does not have a local-minimum problem. It is a convex optimization problem.
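For example, here is a minimal NumPy sketch (synthetic data, purely illustrative) showing that the least-squares loss for y = m*x + b has a single global minimum, found directly by ordinary least squares:

```python
# Minimal sketch (synthetic data): the least-squares loss for y = m*x + b
# is a convex quadratic in (m, b), so it has exactly one global minimum --
# there is no local minimum to get trapped in.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, size=100)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=100)

# Design matrix with a column of ones for the intercept b
X = np.column_stack([x, np.ones_like(x)])

# Unique minimizer of ||X @ [m, b] - y||^2
m_hat, b_hat = np.linalg.lstsq(X, y, rcond=None)[0]
print(m_hat, b_hat)  # close to 2.0 and 1.0
```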
by
Sorry, I don't understand. Are you saying the loss landscape cannot have several different convex regions?
by Vétéran du GPU 🐋 (48.7k points)
I am saying that if you use linear regression, you will never have any issue with local minima, because it is a convex optimization problem. That does not mean that your data landscape is convex.
by
Let me ask differently then. If I have multiple parameters and I try to optimise them, what do I do?
by Vétéran du GPU 🐋 (48.7k points)
Well, you can try a linear regression if your data is simple enough, but it may not work properly. Then you're going to need more powerful algorithms (like neural networks). But there is no magical recipe, if that's what you're looking for (otherwise things would be much easier).
by
But aren't we talking about neural networks here? I am confused now.
by Vétéran du GPU 🐋 (48.7k points)
I misunderstood your question then; I assumed you were talking about linear regression. Yes, with neural networks, local minima are a big issue. But in future sessions we will see a few methods that are used to somewhat avoid them (still not a 100% guarantee).
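Just as an illustration of one such mitigation (not necessarily what the course will cover): restarting gradient descent from several random initializations and keeping the best run is a common, if imperfect, way to reduce the impact of local minima.

```python
# Illustration only: random restarts on a toy non-convex 1D loss.
# Each restart may fall into a different local minimum; we keep the best.
import numpy as np

def f(w):          # toy non-convex loss with several local minima
    return np.sin(3 * w) + 0.1 * w**2

def grad_f(w):
    return 3 * np.cos(3 * w) + 0.2 * w

rng = np.random.default_rng(3)
best_w, best_loss = None, np.inf
for _ in range(10):                    # 10 random restarts
    w = rng.uniform(-5, 5)
    for _ in range(500):               # plain gradient descent
        w -= 0.05 * grad_f(w)
    if f(w) < best_loss:
        best_w, best_loss = w, f(w)

print(best_w, best_loss)
```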
by
Let me try again, sorry.
With y = m*x + b there is only one solution, but if I have y(x1, x2, x3) depending on multiple parameters, I don't think there is only one solution to minimise. Correct?
by Vétéran du GPU 🐋 (48.7k points)
In that case it is still a linear regression, because the model is linear with respect to its parameters. So the problem is convex (you are optimizing with respect to the parameters, not with respect to your inputs) and therefore there is only one solution.
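As an illustration (made-up data): with several inputs x1, x2, x3 the loss is still convex in the parameters, so gradient descent started from a random point ends up at the same unique solution as the closed-form fit.

```python
# Sketch (synthetic data): y = w1*x1 + w2*x2 + w3*x3 + b is linear in its
# parameters, so gradient descent and the closed-form least-squares fit
# agree -- there is only one minimum.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))                  # columns are x1, x2, x3
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 3.0 + rng.normal(scale=0.1, size=200)

Xb = np.column_stack([X, np.ones(len(X))])     # add intercept column

# Closed-form solution (unique)
w_closed = np.linalg.lstsq(Xb, y, rcond=None)[0]

# Gradient descent from a random start converges to the same point
w = rng.normal(size=4)
for _ in range(5000):
    grad = 2 / len(y) * Xb.T @ (Xb @ w - y)
    w -= 0.1 * grad

print(np.allclose(w, w_closed, atol=1e-3))     # True
```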
by
Sorry again. Is that true even if it is a polynomial of higher order? x1, x2, x3 are my different observed variables. Maybe my naming was bad here.
by Vétéran du GPU 🐋 (48.7k points)
If it's polynomial, it's polynomial with respect to the inputs. But Y = M*X + B is always linear with respect to the parameters M and B.
by
Yes, I can follow, but can a polynomial have more than one solution? Or still not? Do things become complicated in the multidimensional case?
by Vétéran du GPU 🐋 (48.7k points)
Well, if your optimization objective is polynomial (non-linear) with respect to the parameters, then yes, you can have multiple solutions. But in your example you are polynomial with respect to the inputs and linear with respect to the parameters, so there is only one solution with this algorithm.
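A small sketch of that last point (synthetic data): fitting a quadratic in x is polynomial in the input but linear in the coefficients, so it is still an ordinary least-squares problem with a single solution.

```python
# Sketch: y ≈ a*x^2 + b*x + c is polynomial in the input x but linear in
# the parameters (a, b, c), so the fit is still a convex least-squares
# problem with one solution.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-3, 3, size=150)
y = 0.5 * x**2 - 1.0 * x + 2.0 + rng.normal(scale=0.2, size=150)

# Polynomial features of the input; the unknowns enter linearly
Phi = np.column_stack([x**2, x, np.ones_like(x)])
a, b, c = np.linalg.lstsq(Phi, y, rcond=None)[0]
print(a, b, c)  # close to 0.5, -1.0, 2.0
```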
by
What does wrt mean, please?
by Vétéran du GPU 🐋 (48.7k points)
"With respect to".
by
To clarify, how should I have written my function to indicate it is multi-dimensional?
If I write f(para1, para2, para3), IMHO you cannot tell whether it is non-linear.
by Vétéran du GPU 🐋 (48.7k points)
No, I don't know whether it is non-linear, but I assumed it was, because 99.9999% of the time in deep learning it is non-linear. Otherwise linear regression will do just fine.
...