Thijs-vW OP t1_iw6rmvh wrote
Reply to comment by scitech_boom in Update an already trained neural network on new data by Thijs-vW
>Anyhow, you cannot do this:
I do not understand why I cannot train my already trained model on new data. Could you elaborate?
Thijs-vW OP t1_itbjlyh wrote
Reply to comment by Thijs-vW in [Discussion] Categorical Encoding In Deep Learning by Thijs-vW
Apparently yes: https://stackoverflow.com/a/57807971/15589661
Thijs-vW OP t1_itbb6xe wrote
Reply to comment by TheCloudTamer in [Discussion] Categorical Encoding In Deep Learning by Thijs-vW
Thanks for the clarification. So if I one-hot encode my categorical variable and feed it to a dense layer, I would achieve the same as with an embedding layer?
Thijs-vW OP t1_it82bgp wrote
Reply to comment by Travolta1984 in [Discussion] Categorical Encoding In Deep Learning by Thijs-vW
I looked into the embedding layer in Keras, but I was not impressed. It is merely a fancy lookup table. That is useful when you want to encode sentences or the like, but my variable has only 51 categories. In that case, a dense layer applied to the one-hot encoded variable would achieve the same, if I am not mistaken.
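For what it's worth, the equivalence holds for a bias-free dense layer with linear activation: looking up row `i` of an embedding matrix gives the same result as multiplying a one-hot vector for `i` by that matrix. A minimal NumPy sketch (shapes and names are illustrative, not from any specific model):

```python
import numpy as np

rng = np.random.default_rng(0)
n_categories, embed_dim = 51, 8

# Shared weight matrix: row i is the learned vector for category i.
W = rng.normal(size=(n_categories, embed_dim))

category_ids = np.array([3, 17, 50])

# Embedding layer: a lookup of rows by index.
embedded = W[category_ids]

# Equivalent: one-hot encode, then multiply by W
# (i.e. a dense layer with no bias and linear activation).
one_hot = np.eye(n_categories)[category_ids]
dense_out = one_hot @ W

assert np.allclose(embedded, dense_out)
```

The practical difference is efficiency: the lookup skips materializing the one-hot vectors and the matrix multiply, which matters for large vocabularies but hardly at all for 51 categories. Note that a default Keras `Dense` layer adds a bias term, so you would set `use_bias=False` for an exact match.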
Thijs-vW OP t1_iw6ryeq wrote
Reply to comment by BugSlayerJohn in Update an already trained neural network on new data by Thijs-vW
Thanks for the advice. Unfortunately I do not think transfer learning is the best thing for me to do, considering:
>if you train only on the new data, that's all it will know how to predict.
Anyhow,
>If retraining the entire model on the complete data set is possible with nominal cost in less than a few days, do that.
This is indeed the case. However, if I retrain my entire model from scratch, the new model will very likely make different predictions, because its weights will not be initialized identically. That is the problem I would like to avoid. Do you have any advice on that?
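One way to keep a full retrain reproducible is to fix the random seed before initializing the weights (in Keras, `tf.keras.utils.set_random_seed` covers this; the alternative is warm-starting the new model from the old model's saved weights). A NumPy sketch of the seeding idea, with an illustrative helper name:

```python
import numpy as np

def init_weights(seed, shape=(4, 3)):
    # Fixing the seed makes the random initialization reproducible,
    # so two retrains start from the same point.
    rng = np.random.default_rng(seed)
    return rng.normal(size=shape)

w_a = init_weights(seed=42)
w_b = init_weights(seed=42)
assert np.array_equal(w_a, w_b)  # same seed -> identical starting weights

w_c = init_weights(seed=7)
assert not np.array_equal(w_a, w_c)  # different seed -> different weights
```

Seeding alone does not guarantee bit-identical training on a GPU (some ops are nondeterministic), but combined with identical data ordering it removes the initialization as a source of divergence.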