Striking-Travel-6649 t1_j7hx5tb wrote
Reply to comment by noobgolang in [D] Is English the optimal language to train NLP models on? by MrOfficialCandy
"I wonder if we should be using a human language at all"
My response: 01101000 01100101 01101100 01101100 01101111 00101100 00100000 01101111 01110101 01101110 00100001
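For anyone who doesn't want to decode that by hand: the reply is just space-separated 8-bit ASCII. A minimal Python sketch to decode it (the variable names are mine, not from the comment):

```python
# Decode a space-separated string of 8-bit binary values into ASCII text.
bits = ("01101000 01100101 01101100 01101100 01101111 00101100 "
        "00100000 01101111 01110101 01101110 00100001")

# int(b, 2) parses each 8-bit chunk as base-2; chr() maps it to a character.
message = "".join(chr(int(b, 2)) for b in bits.split())
print(message)  # -> hello, oun!
```

As written, the bytes decode to "hello, oun!", so the last word appears to contain a typo in the original comment; the binary is reproduced here as posted.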
Striking-Travel-6649 t1_jbjkstq wrote
Reply to comment by Acrobatic-Name5948 in [D] Why are so many tokens needed to train large language models? by blacklemon67
I think you're on the money. Once we develop more novel network and system architectures that are really good at what they do while still generalizing, it will be game over. The current models ML engineers have built aren't complex or nuanced enough to extract the kind of value from a "small" number of tokens that humans can. The human brain excels at centralized control, coordination across systems, and effective interconnection, and each subsystem can do its "tasks" extremely well while also generalizing across tasks. With that in mind, we're going to need much more complex systems to achieve AGI.