Quick-Hovercraft-997 t1_jbx9gcj wrote on March 12, 2023 at 12:52 PM
Reply to [D] Best way to run LLMs in the cloud? by QTQRQD
If latency is not a critical requirement, you can try a serverless GPU cloud like banana.dev or pipeline.ai. These platforms provide easy-to-use templates for deploying LLMs.
Permalink 1
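A minimal sketch of what calling such a serverless deployment might look like, assuming a generic HTTPS inference endpoint. The URL, API key, and JSON field names below are placeholders for illustration, not the actual API of banana.dev or pipeline.ai:

```python
import json
from urllib import request

# Placeholder endpoint and key -- serverless GPU platforms typically
# expose a deployed model behind an authenticated HTTPS endpoint.
ENDPOINT = "https://api.example-serverless-gpu.dev/v1/run"  # hypothetical
API_KEY = "YOUR_API_KEY"  # hypothetical

def build_payload(prompt: str, max_new_tokens: int = 128) -> bytes:
    """Assemble a JSON inference request; field names are illustrative."""
    return json.dumps({
        "prompt": prompt,
        "max_new_tokens": max_new_tokens,
    }).encode("utf-8")

def call_endpoint(prompt: str) -> str:
    """POST the prompt and return the generated text.

    Note: a cold start on a serverless GPU can add seconds of latency,
    which is why these platforms suit latency-tolerant workloads.
    """
    req = request.Request(
        ENDPOINT,
        data=build_payload(prompt),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    with request.urlopen(req) as resp:  # blocking call
        return json.loads(resp.read())["output"]
```

The trade-off the comment points at is baked into `call_endpoint`: the request blocks while the platform spins up a GPU worker, so throughput-oriented batch jobs fit better than interactive use.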