We made a ChatGPT/Gemini-like interface for running open-source LLMs in the browser via WebGPU. Chat with Gemma, Llama 3, Mistral, and more with no server-side processing: everything happens locally in your browser, using WebGPU and WASM.
Try it out: https://chattyui.com/
Repository: https://github.com/addyosmani/chatty
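Since the app runs entirely client-side, it has to check for WebGPU support before loading a model. A minimal sketch of that pattern (hypothetical helper, not Chatty's actual code; the `pickBackend` name and the WASM fallback are illustrative assumptions):

```javascript
// Decide which inference backend to use based on browser capabilities.
// `nav` stands in for the browser's `navigator` object (parameterized
// here so the function can be exercised outside a browser).
function pickBackend(nav) {
  if (nav && typeof nav.gpu !== "undefined") {
    return "webgpu"; // hardware-accelerated path
  }
  return "wasm"; // CPU fallback when WebGPU is unavailable
}

// In a real page you would call pickBackend(navigator):
console.log(pickBackend({ gpu: {} })); // "webgpu"
console.log(pickBackend({}));          // "wasm"
```

Browsers without `navigator.gpu` (e.g. older Firefox or Safari releases) would take the slower WASM path rather than failing outright.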