Department No. 01
Model Store
Download and manage local AI models. All models run entirely in your browser using WebGPU — no data ever leaves your device.
WebGPU not available
Your browser does not support WebGPU. Try the latest version of Chrome or Edge. Alternatively, install Ollama to run local models through its HTTP API.
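The availability check behind this notice can be sketched as a plain feature test on `navigator.gpu`; the helper name below is illustrative, not the app's actual code.

```typescript
// Minimal sketch of WebGPU feature detection (hypothetical helper,
// assuming only the standard `navigator.gpu` entry point).
function hasWebGPU(nav: { gpu?: unknown }): boolean {
  // Browsers that implement WebGPU expose a `gpu` object on `navigator`;
  // browsers without support leave it undefined.
  return nav.gpu !== undefined;
}

// In the browser, the banner above would be shown when this is false:
//   if (!hasWebGPU(navigator)) { /* render the WebGPU warning */ }
```

Note that `navigator.gpu` existing does not guarantee a usable device; a stricter check would also await `navigator.gpu.requestAdapter()` and verify it returns a non-null adapter.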
Models are powered by WebLLM and run fully on-device via WebGPU. Downloaded models are cached in IndexedDB for offline use; no data is sent to any server.