Educational-Net303 t1_jegta0z wrote
Reply to comment by LoaderD in [News] Twitter algorithm now open source by John-The-Bomb-2
Get rid of the if statement and you've just recreated Twitter's recommendation algorithm
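[For context: the joke riffs on the fact that the open-sourced repo contains author flags such as `author_is_elon`, officially used only for metrics stratification. A purely hypothetical caricature of the gag, with an invented boost factor, might look like this:]

```python
from dataclasses import dataclass

@dataclass
class Tweet:
    author_is_elon: bool

def rank_tweet(tweet: Tweet, base_score: float) -> float:
    # The "if statement" the joke says to remove -- not Twitter's actual code.
    if tweet.author_is_elon:
        return base_score * 1000.0  # made-up boost factor
    return base_score

print(rank_tweet(Tweet(author_is_elon=True), 0.5))   # 500.0
print(rank_tweet(Tweet(author_is_elon=False), 0.5))  # 0.5
```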
Educational-Net303 t1_jeggs0s wrote
Reply to comment by ZestyData in [News] Twitter algorithm now open source by John-The-Bomb-2
Yeah, like Elon or not, the push for open source is always going to benefit the community. Ironic how Twitter is more open than ____AI.
Educational-Net303 t1_jd5a9fy wrote
Reply to comment by KGL-DIRECT in Microsoft’s Bing chatbot now lets you create images via OpenAI’s DALL-E by SnoozeDoggyDog
They are still miles away from ChatGPT quality. With more money, OpenAI will just be able to accelerate its development while open source falls further behind.
Educational-Net303 t1_jd4urzv wrote
I'm scared of the monopoly OpenAI/Microsoft will become
Educational-Net303 t1_jd2rsax wrote
Reply to comment by 42gether in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
My whole point is that it will take years before consumer GPUs reach 48 GB of VRAM. You just proved my point again without even reading it.
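[For context: the reason 48 GB keeps coming up in this thread is weight memory. A rough back-of-envelope sketch of the VRAM needed just to hold a 30B model's weights, ignoring activations, KV cache, and framework overhead:]

```python
# Weights-only VRAM estimate for a 30B-parameter model (e.g., LLaMA-30B).
PARAMS = 30e9

for name, bytes_per_param in [("fp32", 4), ("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    gb = PARAMS * bytes_per_param / 1024**3
    print(f"{name}: ~{gb:.0f} GB")

# fp32: ~112 GB, fp16: ~56 GB, int8: ~28 GB, int4: ~14 GB
# -> only 4-bit quantization squeezes a 30B model under a 24 GB card.
```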
Educational-Net303 t1_jd0k6p6 wrote
Reply to comment by 42gether in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
Which takes years
Educational-Net303 t1_jd05hmc wrote
Reply to comment by I_will_delete_myself in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
Are we still talking consumer-grade hardware, or specialized GPUs made for a niche crowd?
Educational-Net303 t1_jd051kh wrote
Reply to comment by I_will_delete_myself in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
Cyberpunk 2077 maxed out with Psycho ray tracing takes ~16 GB at most. It's gonna be a few years before we actually see games demanding more than 24 GB.
Educational-Net303 t1_jd03se1 wrote
Reply to comment by I_will_delete_myself in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
What game is limited by VRAM? I haven't heard of any game using over 24 GB unless it's Skyrim with a bunch of 8K texture mods.
Educational-Net303 t1_jc3zudu wrote
Reply to comment by icedrift in [R] Stanford-Alpaca 7B model (an instruction tuned version of LLaMA) performs as well as text-davinci-003 by dojoteef
I mean, it's probably running on a single 4090 from one of the PhD students' personal setups. Just wait for someone to replicate it and release an actual model.
Educational-Net303 t1_jair4wf wrote
Reply to [D] OpenAI introduces ChatGPT and Whisper APIs (ChatGPT API is 1/10th the cost of GPT-3 API) by minimaxir
Definitely a loss leader to cut off Claude/Bard; electricity alone would cost more than that. Expect a price rise in a month or two.
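[For context: the "1/10th the cost" in the post title refers to launch pricing of $0.002 per 1K tokens for gpt-3.5-turbo versus $0.02 per 1K for text-davinci-003. A quick worked comparison at those rates:]

```python
# Launch pricing (March 2023): gpt-3.5-turbo vs. text-davinci-003.
TOKENS = 1_000_000  # one million tokens of traffic

chatgpt_cost = TOKENS / 1000 * 0.002
davinci_cost = TOKENS / 1000 * 0.02
print(f"gpt-3.5-turbo:    ${chatgpt_cost:.2f}")   # $2.00
print(f"text-davinci-003: ${davinci_cost:.2f}")   # $20.00
```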
Educational-Net303 t1_j20r9rd wrote
Looks like the generations are very generic and limited; cool idea though.
Educational-Net303 t1_jegxm3h wrote
Reply to HuggingGPT - Solving AI Tasks with ChatGPT and its Friends in HuggingFace by visarga
This is just connecting GPT to Hugging Face models. OpenAI probably experimented with this years ago, considering GPT-4's vision abilities.
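[For context: the "connecting" pattern the comment describes is an LLM acting as a controller that picks a Hugging Face model and hands the task off to it. A minimal sketch of that idea, assuming the pre-1.0 `openai` client; the prompt, model choices, and routing are illustrative, not HuggingGPT's actual implementation:]

```python
import openai  # assumes OPENAI_API_KEY is set in the environment
from transformers import pipeline

# A tiny "model zoo" of Hugging Face pipelines the LLM can dispatch to.
TOOLS = {
    "sentiment-analysis": pipeline("sentiment-analysis"),
    "summarization": pipeline("summarization"),
}

def route(user_request: str) -> str:
    # 1) Ask the LLM which tool fits the request (a real system would
    #    validate this answer instead of trusting it blindly).
    plan = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{
            "role": "user",
            "content": f"Pick one tool from {list(TOOLS)} for this request; "
                       f"reply with the tool name only: {user_request}",
        }],
    )
    task = plan.choices[0].message.content.strip()
    # 2) Run the chosen Hugging Face model on the request.
    return str(TOOLS[task](user_request))

print(route("Summarize: open models are catching up to closed ones fast."))
```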