Own-Archer7158 t1_iy3mec9 wrote
Reply to comment by Own-Archer7158 in Can someone pls help me with this question and explain to me why you chose your answers? Many thanks! by CustardSignificant24
Note that the minimal loss is reached when the parameters make the neural network's predictions closest to the true labels
Before that point, the gradient is generally non-zero (except at a very, very unlucky local minimum)
You could look at linear regression with least-squares error as the loss to better understand the underlying optimization problem (in one dimension it is a quadratic function to minimize, so there is no local minimum other than the global one)
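A minimal sketch of that linear regression example (variable names and data are my own, made up for illustration): the least-squares loss is a convex quadratic in the weight, so plain gradient descent reaches the single global minimum.

```python
import numpy as np

# 1-D linear regression (y = w * x) trained with gradient descent
# on the least-squares loss. The loss is quadratic (convex) in w,
# so there is one global minimum and no other local minima.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
w_true = 3.0
y = w_true * x          # noiseless labels, to keep the sketch simple

w = 0.0                 # initial parameter
learning_rate = 0.1
for _ in range(100):
    pred = w * x
    grad = 2 * np.mean((pred - y) * x)   # d(loss)/dw for mean squared error
    w = w - learning_rate * grad         # gradient descent step

# w converges to w_true
```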
Own-Archer7158 t1_iy3m6j0 wrote
Reply to comment by nutpeabutter in Can someone pls help me with this question and explain to me why you chose your answers? Many thanks! by CustardSignificant24
If all weights are the same (assume 0 to keep it simple), then the output of the function/neural network is far from the objective/label
The gradient is therefore non zero
And finally the parameters are updated by gradient descent: theta = theta - learning_rate * grad_theta(loss)
And when the parameters are updated, the loss changes
Usually, the parameters are initialized randomly
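A sketch of that update rule on a generic parameter vector (the loss and names here are made up for illustration): random initialization, then repeated steps of theta minus the learning rate times the gradient.

```python
import numpy as np

# Gradient descent on the toy loss L(theta) = ||theta - target||^2.
# grad_theta below is the gradient of that loss, written by hand.
target = np.array([1.0, -2.0])

def grad_theta(theta):
    return 2 * (theta - target)

rng = np.random.default_rng(42)
theta = rng.normal(size=2)       # parameters are initialized randomly
learning_rate = 0.1

for _ in range(200):
    # each update changes the parameters, and therefore the loss
    theta = theta - learning_rate * grad_theta(theta)

# theta converges toward target, where the gradient is zero
```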
Own-Archer7158 t1_iy3h8pp wrote
Reply to comment by Own-Archer7158 in Can someone pls help me with this question and explain to me why you chose your answers? Many thanks! by CustardSignificant24
If the learning rate is zero, the update rule leaves the parameters unchanged
Data balancing does not change the loss (it mainly affects overfitting), and the same goes for a regularization strength that is too low
Bad initialization is rarely a problem (with bad luck you could land directly in a local minimum, but that is a rare event)
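The learning-rate-zero case can be seen directly from the update rule (values here are arbitrary, for illustration): subtracting zero times the gradient returns the same parameters, so the loss can never decrease.

```python
import numpy as np

# With learning_rate = 0, theta - learning_rate * grad == theta,
# regardless of the gradient: training is stuck from the start.
theta = np.array([0.5, -1.0])
grad = np.array([3.0, 2.0])      # any non-zero gradient
learning_rate = 0.0

theta_new = theta - learning_rate * grad
# theta_new is identical to theta
```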
Own-Archer7158 t1_iy3h1oa wrote
Reply to Can someone pls help me with this question and explain to me why you chose your answers? Many thanks! by CustardSignificant24
3 is the only possible solution
Own-Archer7158 t1_iy3q6b6 wrote
Reply to comment by canbooo in Can someone pls help me with this question and explain to me why you chose your answers? Many thanks! by CustardSignificant24
You are right, thank you