React App to interface with locally hosted LLMs
Figured I would share this app I have been making, mainly to learn React. I host large language models locally using Ollama. With Ollama, you can download and run open-source models for chat completions and text generation, as well as open-source embedding models. Ollama exposes an API endpoint on your local network that can be called with a payload structure similar to OpenAI's.
Payload: {"model":"llama3.1:latest","messages":[{"role":"user","content":"Hello there!"}],"stream":false}
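To make that concrete, here is a minimal sketch of calling the Ollama chat endpoint from Python, assuming Ollama is running on its default local address (http://localhost:11434). The `build_payload` and `chat` helper names are my own, not part of Ollama:

```python
import json
import urllib.request

# Ollama's default local chat endpoint
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_payload(model, messages, stream=False):
    """Build a chat-completion payload in the shape Ollama expects."""
    return {"model": model, "messages": messages, "stream": stream}

def chat(model, messages):
    """POST the payload to the local Ollama server and return the reply text."""
    payload = build_payload(model, messages)
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["message"]["content"]

# Example (requires a running Ollama server with the model pulled):
# print(chat("llama3.1:latest", [{"role": "user", "content": "Hello there!"}]))
```

With `stream=True`, Ollama instead returns the response incrementally, which is what a chat UI would typically use for a typing effect.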
This app is just a Chatbot UI using React, Tailwind CSS, Radix UI components, and a Python Django rest API to handle storing conversations, messages, available models, user data, etc.
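As a rough sketch of the data the backend stores, the conversation and message records might look like the following. These are illustrative dataclass shapes only, not the app's actual Django models or field names:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Message:
    role: str          # "user" or "assistant"
    content: str
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class Conversation:
    title: str
    model: str                          # e.g. "llama3.1:latest"
    messages: list = field(default_factory=list)

    def to_ollama_messages(self):
        """Flatten stored messages into the list Ollama's chat API expects."""
        return [{"role": m.role, "content": m.content} for m in self.messages]
```

Storing the full message history server-side lets the frontend replay a conversation and lets the backend resend prior turns as context on each new request.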
Aside from powering the chatbot itself, the LLMs are also used for other features, like generating meaningful titles for your conversations based on their messages.
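Title generation can be done with one extra LLM call: send the first few messages back to the model and ask it for a short title. A hedged sketch (the function name, prompt wording, and message limit are my own choices, not the app's actual implementation):

```python
def make_title_prompt(messages, max_messages=4):
    """Build a one-shot Ollama chat payload asking the model to title a
    conversation. Only the first few messages are included to keep it short."""
    excerpt = "\n".join(
        f'{m["role"]}: {m["content"]}' for m in messages[:max_messages]
    )
    instruction = (
        "Summarize the following conversation in a short title "
        "of five words or fewer. Reply with the title only.\n\n" + excerpt
    )
    return {
        "model": "llama3.1:latest",  # any chat model works here
        "messages": [{"role": "user", "content": instruction}],
        "stream": False,
    }
```

The returned payload is then posted to the same chat endpoint as a normal message, and the reply is saved as the conversation title.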
There are open-source multimodal models, like llava (ollama.com), which I could use for analyzing images in the future. I also run image generation models locally, like Stable Diffusion, which could be used to generate images for users.
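For the image-analysis idea, Ollama's chat API accepts base64-encoded images in a message's "images" field when using a multimodal model like llava. A sketch of building such a payload (the helper name is mine):

```python
import base64

def build_image_payload(image_bytes, question, model="llava"):
    """Build an Ollama chat payload that attaches an image for a multimodal
    model such as llava. Images are sent as base64-encoded strings."""
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": model,
        "messages": [
            {"role": "user", "content": question, "images": [encoded]}
        ],
        "stream": False,
    }

# Example (requires a running Ollama server with llava pulled):
# payload = build_image_payload(open("photo.png", "rb").read(), "What is in this image?")
# ...then POST it to http://localhost:11434/api/chat as with a text-only chat.
```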
Ollama: ollama.com
Ethan Christensen
AI Think Tank
skool.com/rk-software-services-2370