How to Run Any LLM using Cloud GPUs and Ollama with Runpod.io

Tyler AI • March 9, 2024

@tylerreedai

About

🤖 My name is Tyler. I'm a software engineer with years of experience in AI. ⚡️ I have helped hundreds of people navigate the world of AI and create AI agents to automate their lives and businesses. 🚀 Join my Skool community 👇👇 https://www.skool.com/the-ai-agent-5174/about

🔔 Don't forget to subscribe: hit the subscribe button and turn on notifications so you never miss the latest insights, tutorials, and discussions about AutoGen and AI.

Connect with me:
💻 Skool: The AI Agent
🐦 Twitter: TylerReedAI
📸 Instagram: TylerReedAI
💼 LinkedIn: TylerReedAI

Video Description

Hello, and welcome to my video on how to run a server on runpod.io. Not everybody has a computer with the hardware needed to run a local LLM, and renting a cloud GPU server is a cheap way to get much more powerful hardware for your needs. Don't forget to sign up for the newsletter below to get updates on AI, what I'm working on, and struggles I've dealt with (which you may have too!):

=========================================================
📰 Newsletter sign-up: https://bit.ly/tylerreed
=========================================================

🙋‍♂️ My GitHub: https://github.com/tylerprogramming/ai
🙋‍♂️ 31 Day Challenge: https://github.com/tylerprogramming/31-day-challenge-ai
🥧 PyCharm Download: https://www.jetbrains.com/pycharm/download
🐍 Anaconda Download: https://www.anaconda.com/download
🦙 Ollama Download: https://ollama.com/
🤖 LM Studio Download: https://lmstudio.ai/

📖 Chapters:
00:00 Intro
00:19 What is runpod.io?
01:11 How to set up
03:26 Install Ollama
05:40 Run Example
06:26 Outro

💬 If you have any issues, let me know in the comments and I will help you out!
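Once Ollama is installed and serving on the pod, it exposes an HTTP API on port 11434 by default. Here is a minimal sketch of calling that API from Python using only the standard library; the model name `llama2` and the `localhost` URL are assumptions for a typical setup — if Ollama runs on a RunPod pod, replace the URL with the pod's forwarded address and port:

```python
import json
import urllib.request

# Ollama's default address; swap in your RunPod pod's forwarded host/port
# (assumption: you exposed port 11434 when creating the pod).
OLLAMA_URL = "http://localhost:11434"


def build_generate_payload(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for one complete JSON response instead of a
    stream of partial chunks.
    """
    body = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(body).encode("utf-8")


def generate(model: str, prompt: str) -> str:
    """Send a prompt to a running Ollama server and return the reply text."""
    req = urllib.request.Request(
        OLLAMA_URL + "/api/generate",
        data=build_generate_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

To try it, start the server on the pod with `ollama serve`, pull a model (e.g. `ollama pull llama2`), then call `generate("llama2", "Why is the sky blue?")` from your script.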