-xylon
-xylon t1_jcokzje wrote
Reply to comment by fullstackai in [D] Unit and Integration Testing for ML Pipelines by Fender6969
Having a schema and generating random or synthetic data based on that schema is my go-to approach for testing.
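A minimal sketch of that idea in plain Python (the schema format, column names, and generator functions here are made up for illustration, not from any particular library):

```python
import random

# Hypothetical schema: column name -> (type, constraint)
SCHEMA = {
    "user_id": ("int", (1, 10_000)),        # integer range
    "score": ("float", (0.0, 1.0)),         # float range
    "label": ("category", ["spam", "ham"]), # allowed values
}

def generate_row(schema):
    """Generate one synthetic row that conforms to the schema."""
    row = {}
    for col, (kind, constraint) in schema.items():
        if kind == "int":
            lo, hi = constraint
            row[col] = random.randint(lo, hi)
        elif kind == "float":
            lo, hi = constraint
            row[col] = random.uniform(lo, hi)
        elif kind == "category":
            row[col] = random.choice(constraint)
    return row

def generate_dataset(schema, n_rows):
    """Generate n_rows of schema-conforming synthetic data."""
    return [generate_row(schema) for _ in range(n_rows)]
```

You'd feed the generated rows through the pipeline under test and assert on output shapes, dtypes, and invariants rather than exact values.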
-xylon t1_jabxwa6 wrote
Reply to comment by nativedutch in [P] [R] Neural Network in Fortran! by Etterererererer
I did an applied math master's in 2016 and they taught us Fortran (Matlab too), along with the usual commercial software such as Ansys + ofc all the PDE theory necessary.
Point being: it's niche, but it's still there. Classmates who stuck closely to the master's career path now write Fortran for a living.
And don't try to sell me "but C++ does the same and it's better/more modern". I've written Fortran, I've written C++, and Fortran is neither arcane nor hard, especially when you use it for its intended purpose (FORmula TRANslation, i.e. physics sims). In fact it blows C++ out of the water in usability if you're not a computer scientist... which is why physicists and mathematicians keep using it.
-xylon t1_jabmlj2 wrote
Reply to [P] [R] Neural Network in Fortran! by Etterererererer
I'm not going to jump on the "Fortran is dead" bandwagon: it's a language that continues to be used in simulation code, and I feel it's never really going to be replaced. As you mentioned, it has good properties for writing that kind of code, good compilers, etc.
That said, it is a very niche language, used mostly in the physics simulation & supercomputing world. And idk how popular NNs are in that sector, but it seems to me it's a niche inside a niche... so you will need to dig a bit to get answers. Maybe the people at r/Fortran will know more.
-xylon t1_izx7jbc wrote
Reply to comment by IntelArtiGen in [D] G. Hinton proposes FF – an alternative to Backprop by mrx-ai
Training in that manner tends to make the net "forget" previously learned knowledge.
-xylon t1_izbsuy6 wrote
Reply to [D] If you had to pick 10-20 significant papers that summarize the research trajectory of AI from the past 100 years what would they be by versaceblues
Lemme ask chatgpt real quick
-xylon t1_ixtxq82 wrote
It's in the spirit of open source to contribute to the packages you care about. Just saying.
-xylon t1_ixgrarj wrote
Reply to [D] Schmidhuber: LeCun's "5 best ideas 2012-22” are mostly from my lab, and older by RobbinDeBank
lmaoing @ the 2 blatant bot accounts that posted there 1h ago
-xylon t1_ixgpklv wrote
...wtf?
-xylon t1_itycq7h wrote
Reply to comment by SAint7579 in [D] Python function that changed your life in regard to Machine Learning by popcornn1
.pipe() is another game changer, along with .assign().
I recently discovered you can pass callables to almost ANYTHING in pandas. Things like
df.assign(newcolumn=lambda df: ...), or better yet df[lambda x: ...] and df.loc[lambda x: ...].
-xylon t1_irjih2c wrote
Reply to comment by nomadiclizard in [D] AlphaTensor Explained (Video Walkthrough) by ykilcher
Relax, this is just better compilation for algorithms that already exist, not new algos.
-xylon t1_jduwnkm wrote
Reply to My ChatGPT Chrome Extension that saves conversations in .md files is finally approved by the Chrome Web Store. It's still and so will continue to be Open Source. [P] by ThePogromist
Any chance to have this for Firefox?