ChattyUI

Run open-source LLMs locally in the browser using WebGPU
What is ChattyUI?
An open-source, feature-rich, Gemini/ChatGPT-like interface for running open-source models (Gemma, Mistral, Llama 3, etc.) locally in the browser using WebGPU. No server-side processing: your data never leaves your PC.
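The listing does not say which runtime ChattyUI is built on, so the following is only a minimal sketch of how in-browser, WebGPU-backed inference typically works, using the open-source WebLLM library (@mlc-ai/web-llm); the model identifier is one of WebLLM's prebuilt IDs and is an assumption, not a statement about ChattyUI's internals.

```ts
// Sketch: client-side LLM inference over WebGPU with @mlc-ai/web-llm.
// Assumption: ChattyUI may use a different runtime; the model ID below
// is simply one of WebLLM's prebuilt identifiers.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function main() {
  // Downloads the quantized weights once, caches them in the browser,
  // and compiles WebGPU kernels; inference runs on the local GPU.
  const engine = await CreateMLCEngine("Llama-3-8B-Instruct-q4f16_1-MLC", {
    initProgressCallback: (p) => console.log(p.text),
  });

  // OpenAI-style chat completion, entirely client-side: no request
  // leaves the machine.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Explain WebGPU in one sentence." }],
  });
  console.log(reply.choices[0].message.content);
}

main();
```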
