local.ai

Experiment with AI models locally without needing to set up a full-blown ML stack. Powered by a native app written in Rust and designed to simplify the whole process, from downloading a model to starting an inference server. No GPU required!