Which of the following are true statements? Select all that apply.

- To make gradient descent converge, we must slowly decrease α over time.
- Gradient descent is guaranteed to find the global minimum for any function J(θ₀, θ₁).
- Gradient descent can converge even if α is kept fixed. (But α cannot be too large, or else it may fail to converge.)
- For the specific choice of cost function J(θ₀, θ₁) used in linear regression, there are no local optima (other than the global optimum).
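The last two options can be checked numerically. Below is a minimal sketch (the data, the 1-D model y = θ·x, and the helper name `gradient_descent` are all assumptions, not from the quiz): with a fixed, moderate α, gradient descent on the convex linear-regression cost converges to the global optimum with no decay schedule, while an overly large α overshoots and diverges.

```python
import numpy as np

def gradient_descent(alpha, theta=0.0, steps=50):
    """Run fixed-step gradient descent on the 1-D least-squares cost
    J(theta) = (1/2m) * sum((theta*x - y)^2)."""
    x = np.array([1.0, 2.0, 3.0, 4.0])
    y = 2.0 * x                              # data generated with true theta = 2
    m = len(x)
    for _ in range(steps):
        grad = (1.0 / m) * np.sum((theta * x - y) * x)  # dJ/dtheta
        theta -= alpha * grad                # alpha is held fixed; no decay needed
    return theta

# Fixed, moderate alpha: converges to the global optimum (the cost is convex).
print(gradient_descent(alpha=0.1))   # → 2.0

# Alpha too large: each step overshoots the minimum and the iterates diverge.
print(gradient_descent(alpha=0.5))
```

Here the error after each step is multiplied by a constant factor (1 − 7.5α for this data), which is why a fixed α below the critical value converges geometrically while a larger one blows up.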