A Vapi custom LLM solution
In case anybody was looking to do this...
I am in love with Vapi (for the most part). They've taken care of a lot of the heavy lifting on the telephony side of building agents. But I've had a lot of trouble getting LLMs to behave when feeding queries through their integrations. Talking to some people a lot smarter than me, the consensus is that my prompting is being mixed in with some Vapi black-box prompts and may be suffering from "lost in the middle."

So I built this solution, which gives you a lot more control over your LLMs. It's not low-code, mainly because low-code tools don't have the ability to stream events (server-sent events, SSE) back to the client (in this case, Vapi). At least I don't think they do. It would be cool, but for now I think you actually need a real server. I've included both a Flask and a Quart version of the app in the repo.

This first iteration is not chattable (no memory). I did that to keep things as basic as possible so you could see how we are LangChaining back to Vapi. No explainer video yet, but if there's enough interest I'd be happy to do a walkthrough of the code. I also have a chatbot version (with conversation history) and an agent version talking to Vapi, but those are a bit more intricate. If the interest is there, I can make those available. Enjoy!
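To give a feel for the streaming part: a custom-LLM endpoint generally has to send Vapi an OpenAI-compatible stream of `chat.completion.chunk` events over SSE. Here's a minimal sketch of that wire format in plain Python (the function name `sse_chunks`, the `chatcmpl-demo` id, and the word-by-word chunking are my own illustrative choices, not from the repo):

```python
import json
import time

def sse_chunks(text, model="gpt-4o"):
    """Yield OpenAI-style chat.completion.chunk events as SSE lines.

    Hypothetical sketch: Vapi's custom-llm integration expects an
    OpenAI-compatible streaming response, so each event is a JSON chunk
    prefixed with "data: " and terminated by a blank line.
    """
    created = int(time.time())
    for token in text.split():
        chunk = {
            "id": "chatcmpl-demo",            # placeholder id for the demo
            "object": "chat.completion.chunk",
            "created": created,
            "model": model,
            "choices": [{
                "index": 0,
                "delta": {"content": token + " "},  # one token per event
                "finish_reason": None,
            }],
        }
        yield f"data: {json.dumps(chunk)}\n\n"
    # SSE stream terminator, same sentinel the OpenAI API uses
    yield "data: [DONE]\n\n"
```

In the Flask version of the app, you would hand a generator like this to `Response(sse_chunks(...), mimetype="text/event-stream")`; Quart does the same with an async generator.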