Context length issue with LM Studio local models.
I'm asking LM Studio models to summarize long YouTube transcripts, and it seems they can't ingest that much text at once.
Does anyone have a solution to this?
I also need the output in Korean.
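One common workaround for context-length limits is map-reduce summarization: split the transcript into overlapping chunks that fit the model's context window, summarize each chunk, then summarize the combined partial summaries. Below is a minimal sketch assuming LM Studio's local OpenAI-compatible server at its default address (`http://localhost:1234/v1`); the model name is a placeholder, and the Korean system prompt asks for the summary in Korean directly.

```python
import json
import urllib.request

# Assumption: LM Studio's local server is running at its default address.
LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"

def chunk_text(text, max_chars=6000, overlap=200):
    """Split a long transcript into overlapping character chunks
    small enough to fit in the model's context window."""
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + max_chars, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        start = end - overlap  # overlap preserves context across boundaries
    return chunks

def summarize_chunk(chunk):
    """Send one chunk to the local model and return its summary."""
    payload = {
        "model": "local-model",  # placeholder; LM Studio uses the loaded model
        "messages": [
            # Korean system prompt: "Summarize the following text in Korean."
            {"role": "system", "content": "다음 텍스트를 한국어로 요약하세요."},
            {"role": "user", "content": chunk},
        ],
        "temperature": 0.3,
    }
    req = urllib.request.Request(
        LM_STUDIO_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

def summarize_transcript(text):
    """Map: summarize each chunk. Reduce: summarize the joined summaries."""
    partials = [summarize_chunk(c) for c in chunk_text(text)]
    if len(partials) == 1:
        return partials[0]
    return summarize_chunk("\n".join(partials))
```

Chunk size is in characters here for simplicity; tuning it against the model's actual token limit (and raising the context length in LM Studio's model settings) is left to the reader.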