Few shot prompting for long examples
I'm working on a project where the inputs and outputs are supposed to be quite long, and it's a very complicated use case. I want to give the LLM examples, but I don't think it's a good idea to increase token usage too much, since that also causes problems: what I'm seeing is that the more tokens there are, the more mistakes it makes. Currently I'm giving it just one example, which doesn't seem to be enough; the total token usage is around 25k.
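One way to think about the trade-off described above is to treat the examples as a budget problem: pick as many few-shot examples as fit under a token ceiling, in priority order. Here's a minimal sketch in Python. It's hypothetical, not from the original post: the ~4-characters-per-token heuristic is a rough assumption (a real tokenizer like tiktoken would be more accurate), and the example strings are placeholders.

```python
# Hedged sketch: greedily select few-shot examples under a token budget.
# Assumption: ~4 characters per token, a rough English-text heuristic.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate (~4 chars per token)."""
    return max(1, len(text) // 4)

def select_examples(examples: list[str], budget_tokens: int) -> tuple[list[str], int]:
    """Pick examples in the given priority order until the budget is spent."""
    chosen, used = [], 0
    for ex in examples:
        cost = estimate_tokens(ex)
        if used + cost <= budget_tokens:
            chosen.append(ex)
            used += cost
    return chosen, used

# Hypothetical placeholder examples, best-first.
examples = [
    "Input: ...long example 1...\nOutput: ...",
    "Input: ...long example 2...\nOutput: ...",
    "Input: ...long example 3...\nOutput: ...",
]
picked, used = select_examples(examples, budget_tokens=20)
print(len(picked), used)
```

Ordering the candidate examples best-first matters here: the greedy loop keeps the highest-value examples and drops the rest when the budget runs out, which is one way to add a second or third example without blowing up a 25k-token prompt.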
Muhammad Zafar