Using combined models
How to use GPT-4o/4.5, Deep Research, and o1/o3 in combination for better results.
Using multiple LLMs as an "LLM council", Consult multiple LLMs by asking them the same question and synthesizing the responses. For example, when seeking travel recommendations, ask Gemini, Claude, and Grok for suggestions.
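For readers who prefer to script the council idea, here is a minimal sketch using the OpenAI and Anthropic Python SDKs. The model names, the question, and the synthesis prompt are illustrative placeholders, not a prescribed setup.

```python
# Minimal "LLM council" sketch: send the same question to two providers,
# then have one model synthesize the answers. Model names are placeholders.
from openai import OpenAI
from anthropic import Anthropic

question = "What are the best neighborhoods to stay in for a first trip to Tokyo?"

openai_client = OpenAI()        # reads OPENAI_API_KEY from the environment
anthropic_client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment

gpt_answer = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": question}],
).choices[0].message.content

claude_answer = anthropic_client.messages.create(
    model="claude-3-5-sonnet-latest",
    max_tokens=1024,
    messages=[{"role": "user", "content": question}],
).content[0].text

# Synthesis step: feed both answers back to one model and ask it to reconcile them.
synthesis = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": "Two assistants answered the same question.\n\n"
                   f"Answer A:\n{gpt_answer}\n\nAnswer B:\n{claude_answer}\n\n"
                   "Combine them into one recommendation, noting any disagreements.",
    }],
).choices[0].message.content

print(synthesis)
```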
Starting a new chat for each topic, To keep the context window clear and focused, start a new chat when switching topics. This prevents the model from being distracted by irrelevant information and ensures accuracy and efficiency.
Combining system-wide transcription with LLMs, On desktop, use a system-wide transcription app (like Super Whisper) to convert speech to text, which is then fed into the LLM. This enables quick, hands-free interaction without the need to type.
Reading books with LLMs, Upload chapters from books into an LLM and ask it to summarize and clarify sections. This approach helps with understanding and retention, especially for complex or older texts.
Vibe coding with Cursor and Composer, Instead of using web-based interfaces for coding, use the Cursor app with its Composer feature—described as "vibe coding." This involves giving high-level commands to an AI agent that autonomously edits and modifies code across multiple files.
Using custom GPTs for language learning, Create custom GPTs tailored for specific language learning tasks, such as vocabulary extraction and detailed translation. These custom GPTs save prompting time and often provide more accurate translations than other online tools.
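Custom GPTs are configured in the ChatGPT interface, but the underlying idea, a fixed set of reusable instructions, can be approximated in a script with a system prompt. The language pair and the instructions below are purely illustrative assumptions, not the original poster's setup.

```python
# Sketch of "custom GPT"-style reusable instructions via a system prompt.
# The language pair and instructions are illustrative placeholders.
from openai import OpenAI

client = OpenAI()

VOCAB_EXTRACTOR_INSTRUCTIONS = (
    "You are a Korean vocabulary extractor. Given a Korean sentence, list each "
    "word in dictionary form with its English translation, one per line."
)

def extract_vocab(sentence: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": VOCAB_EXTRACTOR_INSTRUCTIONS},
            {"role": "user", "content": sentence},
        ],
    )
    return response.choices[0].message.content

print(extract_vocab("오늘 날씨가 정말 좋네요."))
```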
Generating custom podcasts, Utilize Google's NotebookLM to generate custom podcasts from uploaded documents or web pages on niche topics of personal interest. This allows for passive learning while walking or driving.
Applying deep research for product comparisons, Leverage deep research capabilities to generate thorough reports comparing different products. For example, research various browsers to determine which one offers better privacy.
Checking and scrutinizing the output, especially from Advanced Data Analysis, Even though Advanced Data Analysis can create impressive figures, it's important to understand what the code is doing, scrutinize it, and monitor the outputs closely, as it can sometimes be slightly off.
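One practical way to scrutinize such output is to recompute the headline numbers yourself and compare them with the generated figure. A minimal sketch, with a hypothetical file name and columns:

```python
# Sanity-check a figure from Advanced Data Analysis by recomputing the key
# numbers yourself. The file name and columns below are hypothetical.
import pandas as pd

df = pd.read_csv("sales_2024.csv")  # the same data you uploaded to the model

# Recompute the statistic the generated chart is supposed to show.
monthly_totals = df.groupby("month")["revenue"].sum()
print(monthly_totals)
print("Grand total:", monthly_totals.sum())

# If these numbers don't match the figure, the generated code may have dropped
# rows, misparsed dates, or silently filled missing values -- read it line by line.
```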
Double checking answers with citations, After an LLM provides an answer, use the provided citations to verify that the information is accurate and not a hallucination.
Switching to a reasoning model, If the model struggles with solving problems, especially in math, code, or reasoning, consider switching to a reasoning-focused model.
Using a Python interpreter, For generating figures or plots and displaying them, use a Python interpreter or similar tool, such as Advanced Data Analysis.
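Behind the scenes, such a tool simply writes and runs ordinary plotting code. A minimal matplotlib sketch with made-up data:

```python
# The kind of code a Python-interpreter tool runs for you: plain matplotlib.
# The data here is made up for illustration.
import matplotlib.pyplot as plt

years = [2020, 2021, 2022, 2023, 2024]
values = [12, 18, 25, 31, 40]

plt.plot(years, values, marker="o")
plt.xlabel("Year")
plt.ylabel("Value")
plt.title("Example trend plotted with a Python interpreter tool")
plt.show()
```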
Being aware of multimodality, Stay mindful of different modalities—audio, images, and video—and whether these are handled natively within the language model.
Using memory features, Utilize memory features to allow the LLM to learn preferences over time, making its responses more relevant to your needs.
Using custom instructions, Customize your LLM by adding instructions to have it communicate in your preferred manner.