Llama 2 is free... Will you build with it?

Cyril Gupta
10 replies
Meta just made its large language model, Llama 2, free for commercial use. You can use it in your projects without paying any licensing fees or API costs; just host it and use it. What would you like to build with it?

Replies

Vincent Lonij
Thread - Wireframe Generator
For online apps, the trade-off here sounds similar to companies that offer an open-source version of their product next to a paid cloud-hosted version: either you pay with cash or you pay with time. Where I see this really making a difference is for apps that cannot rely on a permanent internet connection. Does anyone know what kind of hardware is needed to run this model?
Vincent Lonij
Thread - Wireframe Generator
@cyriljeet Hmm that does not exactly sound easy or cheap to set up. Don't get me wrong, I still think it's really cool that they released this, but I'm not yet seeing the commercial benefit.
Cyril Gupta
CloudFunnels AI
@vincentropy I think you'd need a capable server cluster with GPUs.
André J
@vincentropy A maxed-out MacBook Pro. $5-10k.
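As a rough answer to the hardware question above: the smaller Llama 2 variants can run on a single consumer GPU or a high-end laptop once quantized. Here is a minimal, hypothetical sketch using Hugging Face transformers with 4-bit quantization via bitsandbytes; the model ID is real but gated (you have to accept Meta's license on the Hub), and the memory figures in the comments are approximate assumptions.

```python
# Hypothetical sketch: running Llama 2 7B chat locally with 4-bit quantization.
# Assumes a CUDA GPU with roughly 6-8 GB of free memory and accepted model access.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-2-7b-chat-hf"  # smallest chat variant

# 4-bit quantization shrinks the 7B weights enough for a consumer GPU.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # place layers on available GPU(s), spill to CPU if needed
)

prompt = "[INST] Suggest three app ideas that could use a locally hosted LLM. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The 13B and 70B variants need considerably more memory, which is where the "server cluster with GPUs" estimate comes in.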
Kunal Mehta
Absolutely! Llama 2 being free makes it an excellent choice for building our edtech application. Its cost-effectiveness allows us to focus on delivering high-quality education solutions to a broader audience without worrying about financial barriers. Let's harness Llama 2's potential to create a revolutionary and accessible learning platform for all.
Cyril Gupta
CloudFunnels AI
@realkunalmehta Awesome! Looking forward to checking out what you build
Stas Prohetamine
I may be wrong, but from my experience with Llama, a cloud-hosted version where you pay for API access will probably be more cost-effective than renting your own servers and maintaining a language model yourself. Perhaps that's not always the case, though.
Cyril Gupta
CloudFunnels AI
@prohetamine Thanks for some practical insight!
Stas Prohetamine
The thing is, I was running Llama on one of my own servers, which is quite powerful, and I still ran into performance problems.
Cyril Gupta
CloudFunnels AI
@prohetamine Thank you! This makes things clearer!
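For comparison with the self-hosting route discussed above, here is a minimal sketch of the pay-per-call approach: sending a prompt to a hosted Llama 2 endpoint instead of maintaining your own GPUs. This example assumes the Hugging Face Inference API via the huggingface_hub client; the token placeholder and prompt are illustrative only, and other hosted providers work similarly.

```python
# Hypothetical sketch: calling a hosted Llama 2 endpoint (pay for API access,
# no servers to maintain). Assumes a valid Hugging Face access token.
from huggingface_hub import InferenceClient

client = InferenceClient(token="hf_...")  # your Hugging Face access token

response = client.text_generation(
    "[INST] Write a one-line tagline for an edtech app. [/INST]",
    model="meta-llama/Llama-2-7b-chat-hf",
    max_new_tokens=60,
)
print(response)
```

Whether this beats self-hosting comes down to request volume: at low traffic the hosted route avoids idle GPU costs, while sustained heavy usage can tip the balance back toward running your own hardware.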