LLaMA is a collection of foundation language models ranging from 7B to 65B parameters, showing that it is possible to train state-of-the-art models using only publicly available datasets, without resorting to proprietary and inaccessible datasets.
![LLaMA](https://ph-files.imgix.net/52195bb8-3308-464b-ad4b-2d4a7f41977d.png?auto=compress&codec=mozjpeg&cs=strip&auto=format&w=48&h=48&fit=crop&frame=1)
LLaMA
A foundational, 65-billion-parameter large language model