Pre-prompting your LLM increases performance
Research done at UoW shows that pre-prompting your LLM, i.e. providing context before asking your question, leads to better results, even when the context is self-generated.
For example, asking
"What should I do while in Rome?"
is less effective than a series of prompts:
"What are the top restaurants in Rome?"
"What are the top sightseeing locations in Rome?"
"Best things to do in Rome"
"What should I do in Rome?"
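The chained approach above can be sketched in a few lines: run the warm-up prompts first, keep every turn in the conversation history, then ask the real question so the model answers with all that self-generated context in view. This is only a minimal sketch; `ask` is a stand-in for whatever chat-completion call your provider offers (e.g. OpenAI's `client.chat.completions.create`), and the stubbed model below just lets the example run without an API key.

```python
# Pre-prompting sketch: warm-up questions build context before the real one.
# `ask` is a placeholder for any chat-completion call; the model sees the
# full history (all prior turns) on every request.

def build_history(warmups, final_question, ask):
    """Run each warm-up prompt, append replies, then ask the real question."""
    history = []
    for prompt in warmups + [final_question]:
        history.append({"role": "user", "content": prompt})
        reply = ask(history)  # model receives all earlier turns as context
        history.append({"role": "assistant", "content": reply})
    return history

# Stubbed model so the sketch runs offline (swap in a real API call here):
fake_ask = lambda msgs: f"(answer to: {msgs[-1]['content']})"
warmups = [
    "What are the top restaurants in Rome?",
    "What are the top sightseeing locations in Rome?",
    "Best things to do in Rome",
]
history = build_history(warmups, "What should I do in Rome?", fake_ask)
print(history[-1]["content"])  # final answer, given all the warm-up context
```

The point is just that the final question rides on the accumulated history; with a real model, each warm-up reply becomes context the model can draw on for the last answer.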
I always figured this was the case from anecdotal evidence, but it's good to see people who are way smarter than me explain it in this paper.
And while chain prompting is a little more time-consuming, there are Chrome extensions like ChatGPT Queue that ease up the process.
Are there any other "hacks" to squeeze out better performance?
Joel Cedano