A grassroots collective of researchers working to open-source AI research. GPT-Neo is the name of the codebase for our transformer-based language models, loosely styled around the GPT architecture.
GPT-J-6B, a 6-billion-parameter model trained on the Pile, is now available for use with our new codebase, Mesh Transformer JAX.
EleutherAI GPT-Neo