Have you ever self-hosted an AI setup?
Brody Slade
7 replies
I'd love to know: what motivated you to self-host your AI setup?
Was it a specific project or use case? What tools and frameworks did you use? :)
Replies
Ethan Samuel Bennett@ethansamuelbennett
Yeah, we've been running our own self-hosted AI setup for a while now. It's been great for integrating models from top providers like OpenAI and Anthropic into our apps. The combo of cloud flexibility and on-prem control & speed is 🔥 Excited to see more no-code options making custom AI accessible to all. What stack are you using?
I've been experimenting with self-hosting open source AI models like Meta's LLaMA on a home server. It's pretty slick once you get the dependencies sorted out. Gives great performance for personal projects without any API costs. Curious what hardware setup others are running?
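Since hardware came up: a quick back-of-the-envelope way to size a box for a local model is parameters × bytes-per-weight, plus some overhead for the KV cache and runtime. Here's a minimal sketch; the 1.2× overhead factor is a rough assumption, not an exact figure.

```python
# Rough memory estimate for running a quantized LLaMA-family model locally.
# Assumption: weights dominate memory; the overhead factor is a ballpark
# for KV cache and runtime buffers, not a precise measurement.

def model_memory_gb(n_params_billion: float, bits_per_weight: int,
                    overhead: float = 1.2) -> float:
    """Estimate memory in GB: parameters * bytes-per-weight * overhead."""
    bytes_total = n_params_billion * 1e9 * (bits_per_weight / 8)
    return round(bytes_total * overhead / 1e9, 1)

# A 7B model at 4-bit quantization fits comfortably on a home server:
print(model_memory_gb(7, 4))   # -> 4.2 (GB)
# The same model at full 16-bit precision needs roughly 4x that:
print(model_memory_gb(7, 16))  # -> 16.8 (GB)
```

Handy for sanity-checking whether a given GPU or RAM budget can hold a model before you spend hours sorting out dependencies.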
Yes, we’ve built and hosted our own AI infrastructure! (easybits.tech) It’s designed to allow developers to integrate pre-trained models from leading providers like Anthropic, OpenAI, or Hugging Face into their applications seamlessly.
On top of that, we also run certain smaller models locally, which helps with tasks that need faster responses or more privacy. It’s been amazing to balance the flexibility of cloud-based solutions with the efficiency and control of local setups.
What’s exciting is that our infrastructure isn’t just for experienced developers – we’ve made it accessible through no-code modules. This makes creating and hosting custom AI setups smooth and adaptable, whether you’re scaling big or solving niche problems.
I self-hosted a chatbot fine-tuned for FAQs about my product.
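For anyone curious what the retrieval side of an FAQ bot like that can look like: a minimal sketch of the lookup step, picking the stored answer whose question overlaps most with the user's query. The FAQ entries here are made-up placeholders, and a real setup would put the fine-tuned local model behind this instead of returning canned text directly.

```python
# Minimal FAQ lookup: score each stored question by word overlap with the
# user's query and return the best match's answer. The entries below are
# illustrative placeholders, not from any real product.

FAQ = {
    "how do i reset my password": "Use the 'Forgot password' link on the login page.",
    "what platforms are supported": "The app runs on Linux, macOS, and Windows.",
    "how do i cancel my subscription": "Go to Settings > Billing and choose Cancel.",
}

def answer(query: str) -> str:
    # Normalize the query: lowercase, drop question marks, split into words.
    q_words = set(query.lower().replace("?", "").split())
    # Pick the stored question sharing the most words with the query.
    best_q = max(FAQ, key=lambda k: len(q_words & set(k.split())))
    return FAQ[best_q]

print(answer("How can I reset my password?"))
# -> Use the 'Forgot password' link on the login page.
```

Crude, but it covers a surprising share of support questions before you ever need to invoke the model.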
A developer I know tried self-hosting an image recognition model for their startup. They eventually switched to a cloud-hosted option because maintaining the infrastructure was eating into their development time.
I tried it to learn Kubernetes and to train a model. It really opened my eyes!