Prompt API Token Quota in Edge and Chrome
I previously got this wrong because I relied on ChatGPT (or maybe it was Perplexity?): Edge actually supports only about 4k tokens of input context, while Chrome allows around 9k — nearly double.
The one real advantage Edge retains is that its underlying model is freely available, so it can also be run in tools like Ollama or Jan AI.
You can check a session's true quota like this — the output shows how much of the input limit has already been used:
const session = await LanguageModel.create();
console.log(`${session.inputUsage}/${session.inputQuota}`);
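Since the `LanguageModel` global only exists in browsers that ship the Prompt API, it is worth feature-detecting before creating a session. Here is a minimal sketch; the helper name `reportQuota` is my own, and the fallback string is just illustrative:

```javascript
// Sketch: feature-detect the Prompt API before reading the quota.
// Assumes the `LanguageModel` global exposed by Chrome/Edge.
async function reportQuota() {
  if (typeof LanguageModel === "undefined") {
    return "Prompt API not available in this browser";
  }
  const session = await LanguageModel.create();
  // inputUsage = tokens already consumed in this session,
  // inputQuota = the total input budget (≈4k in Edge, ≈9k in Chrome).
  return `${session.inputUsage}/${session.inputQuota}`;
}
```

On a fresh session the usage is 0, so in Chrome you would typically see something like `0/9216`, and in a browser without the API you get the fallback string instead of a `ReferenceError`.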
Stephan Haewß