In the gradient descent technique, we choose an alpha value (the learning rate)
used in the computation of the parameters (theta 0 and theta 1). What will happen if we assign a very small value to alpha?
1) The model computations may take a long time to converge
2) The model may never converge
3) There will be no need to iterate
4) The speed of the computations will be very high
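For illustration, here is a minimal Python sketch of the gradient descent update for theta 0 and theta 1 in simple linear regression with a mean-squared-error cost. The dataset, the function name, and the specific alpha values are hypothetical, chosen only to show how a very small alpha affects convergence.

def gradient_descent(xs, ys, alpha, iterations):
    """Fit y ~ theta_0 + theta_1 * x by batch gradient descent."""
    theta_0, theta_1 = 0.0, 0.0
    m = len(xs)
    for _ in range(iterations):
        # Prediction errors under the current parameters.
        errors = [(theta_0 + theta_1 * x) - y for x, y in zip(xs, ys)]
        # Partial derivatives of the mean-squared-error cost.
        grad_0 = sum(errors) / m
        grad_1 = sum(e * x for e, x in zip(errors, xs)) / m
        # Simultaneous update, scaled by the learning rate alpha.
        theta_0 -= alpha * grad_0
        theta_1 -= alpha * grad_1
    return theta_0, theta_1

xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]  # exactly y = 1 + 2x

# With the same iteration budget, a tiny alpha barely moves the
# parameters each step, so far more iterations are needed to converge.
print(gradient_descent(xs, ys, alpha=0.1, iterations=1000))
print(gradient_descent(xs, ys, alpha=0.0001, iterations=1000))

With alpha = 0.1 the parameters end up close to the true values (1.0, 2.0) within 1000 iterations, while alpha = 0.0001 should leave them visibly far from the solution after the same budget, which is the slow-convergence behavior described in option 1.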
 