Local LLM api error
Good morning guys, I'm trying to use Ollama locally by adding this code:
    def __init__(self):
        self.llm = ChatOpenAI(
            open_api_base="http://localhost:/1234/v1",
            open_api_key="",
            model_name="crewai-mistral"
        )
and I get the following error: "Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass `openai_api_key` as a named parameter. (type=value_error)". But I don't think I need a key, because the model is running locally.
Can someone help?
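A likely fix, sketched below under a few assumptions: the keyword arguments are spelled `openai_api_base` and `openai_api_key` (your snippet has `open_api_base` / `open_api_key`, so the key is never seen and the validator complains), the URL has a stray `:/` before the port, and the client requires a non-empty key even when the server is local, so any dummy string works. Note that `1234` is the default port of LM Studio's server; Ollama's OpenAI-compatible endpoint is usually `http://localhost:11434/v1`, so use whichever matches your setup.

```python
# Corrected settings (assumptions: local server on port 1234 as in the
# original snippet; the local server ignores the API key's value, but
# the client-side validator still requires a non-empty key).
config = {
    "openai_api_base": "http://localhost:1234/v1",  # no ":/" before the port
    "openai_api_key": "not-needed",                 # dummy value, must be non-empty
    "model_name": "crewai-mistral",
}

# With langchain installed, the initializer would then look like:
#
#   from langchain.chat_models import ChatOpenAI
#
#   def __init__(self):
#       self.llm = ChatOpenAI(**config)
```

Alternatively, setting the environment variable `OPENAI_API_KEY` to any dummy value before constructing `ChatOpenAI` should silence the same validation error.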
Stefan Padron