Equivalent-Way3 t1_j4wjuxe wrote
Reply to [D] Is it possible to update random forest parameters with new data instead of retraining on all data? by monkeysingmonkeynew
XGBoost can do this, and you can set its hyperparameters so that it behaves like a random forest.
Equivalent-Way3 t1_izm9vlw wrote
Reply to comment by mgostIH in [D] Making a regression NN estimate its own regression error by Alex-S-S
Jax only?
Equivalent-Way3 t1_izkuzgt wrote
Perhaps a Bayesian neural net would be what you're looking for
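One cheap approximation to a Bayesian neural net is Monte Carlo dropout: keep dropout active at prediction time and average many stochastic forward passes, using the spread as an error estimate. A toy numpy sketch (the "trained" weights and inputs here are made up purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(1, 64))   # pretend these are trained weights
W2 = rng.normal(size=(64, 1))

def mc_forward(x, n_samples=200, p_drop=0.5):
    h = np.maximum(x @ W1, 0.0)                  # ReLU hidden layer
    outs = []
    for _ in range(n_samples):
        mask = rng.random(h.shape) > p_drop      # dropout stays ON at test time
        outs.append((h * mask / (1 - p_drop)) @ W2)
    outs = np.stack(outs)                        # (n_samples, n, 1)
    return outs.mean(axis=0), outs.std(axis=0)   # prediction, uncertainty

x = np.array([[0.3], [1.5]])
mean, std = mc_forward(x)
```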
Equivalent-Way3 t1_izkudc7 wrote
Reply to comment by Unlikely-Video-663 in [D] Making a regression NN estimate its own regression error by Alex-S-S
> In practice, use for example a Gaussian likelihood, learn wicht GauddianNLL Loss also the variance. Ax long ad you stay eithin distri yadaya this can work okish ..
You ok?
Equivalent-Way3 t1_iyaj9rw wrote
Reply to comment by -horses in [r] The Singular Value Decompositions of Transformer Weight Matrices are Highly Interpretable - LessWrong by visarga
Welp, first sentence lmao
> I have several times failed to write up a well-organized list of reasons why AGI will kill you.
Immediately closes tab
Equivalent-Way3 t1_iy996pf wrote
Reply to comment by mrconter1 in [r] The Singular Value Decompositions of Transformer Weight Matrices are Highly Interpretable - LessWrong by visarga
> It's literary a doomsday cult.
Can you explain what you're referring to?
Equivalent-Way3 t1_j50y33r wrote
Reply to comment by monkeysingmonkeynew in [D] Is it possible to update random forest parameters with new data instead of retraining on all data? by monkeysingmonkeynew
Yep, very simple. Say you have `model1` that you trained already; then you just use the `xgb_model` argument in your next training call. In R (Python should be the same or close to it)