Additional-Cap-7110
Additional-Cap-7110 t1_ja0sd5r wrote
Reply to comment by Z1BattleBoy21 in Meta just introduced its LLM called LLaMA, and it appears meaner than ChatGPT, like it has DAN built into it. by zalivom1s
Ghaie.
Researchers only
Additional-Cap-7110 t1_ja0s90m wrote
Reply to Meta just introduced its LLM called LLaMA, and it appears meaner than ChatGPT, like it has DAN built into it. by zalivom1s
If it’s got a lot of freedom, that won’t last long. It never does. OpenAI allowed free access to ChatGPT through a couple of weeks of chaos, without even threatening people’s accounts, in order to collect data on how people were exploiting it. Bing Chat had the same idea. Now look at it. If Meta takes the same approach, you’ll see some form of free access for a while, and then it will be shut down and lobotomized in just the same way.
Additional-Cap-7110 t1_its5who wrote
Reply to comment by ishizako in How far are we from being able to clone a singers voice? by HelloGoodbyeFriend
Singing is going to be much harder. There’s so much variation. Speech just has to sound natural; singing requires much more of a performance, with all kinds of other aspects: singing softly, loudly, vibrato, portamento, rhythm, not to mention the notes themselves.
This might make it clearer: we can synthesize percussion much better than we can synthesize tonal instruments like violins, flutes, etc. Sampled percussion has always been the easiest thing to make realistic, and 100% synthesized instruments are no different.
If you want to sample percussion, all you really need, aside from recording quality, is multiple repetitions and a ton of dynamic layers. The best percussion sample libraries today have maybe 10–20 dynamic layers and sometimes 5–10+ repetition samples per layer, and you don’t even need that many to sound convincing. But with instruments like vocals, violins, and flutes, that isn’t scratching the surface. These are complex along many more dimensions, you need completely different techniques to capture them, and even then it’s still not quite right, or it’s highly limited in its use.
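The layer-plus-repetition scheme described above can be sketched in a few lines. This is a minimal, illustrative model only; the class and sample names are hypothetical, not any real sampler's API:

```python
import random  # not used below, but real samplers often randomize repetitions


class PercussionSampler:
    """Toy model of a sampled percussion instrument: a few dynamic
    (velocity) layers, each with several recorded repetitions
    ("round robins") cycled to avoid the machine-gun effect."""

    def __init__(self, num_layers=10, repetitions_per_layer=5):
        # layers[i] is the list of recorded samples for dynamic layer i
        self.layers = [
            [f"hit_layer{i}_rr{r}" for r in range(repetitions_per_layer)]
            for i in range(num_layers)
        ]
        self._last = {}  # per-layer round-robin position

    def pick(self, velocity):
        # Map MIDI velocity (0-127) onto a dynamic layer index.
        layer = min(int(velocity / 128 * len(self.layers)), len(self.layers) - 1)
        # Cycle through that layer's recorded repetitions.
        pos = (self._last.get(layer, -1) + 1) % len(self.layers[layer])
        self._last[layer] = pos
        return self.layers[layer][pos]


sampler = PercussionSampler()
print(sampler.pick(100))  # "hit_layer7_rr0"
print(sampler.pick(100))  # next repetition: "hit_layer7_rr1"
```

The point of the sketch is how little state this needs: a velocity-to-layer lookup and a counter per layer. A violin or a voice can't be captured this way because pitch, articulation, and the transitions between notes all interact continuously.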
Additional-Cap-7110 t1_its5u3e wrote
We haven’t even gotten regular speech perfect yet.
Additional-Cap-7110 t1_ja6mzqb wrote
Reply to comment by MysteryInc152 in Meta just introduced its LLM called LLaMA, and it appears meaner than ChatGPT, like it has DAN built into it. by zalivom1s
I don’t know what that means