Hey All - I have a workflow where I download PDF catering menus, parse and format them with GPT-4o, and output the result as JSON. Some of the menus run 50+ pages, and for those the response gets cut off because it exceeds GPT-4o's output token limit.
Any ideas how I can process these large files? Is there a better model to use? Is there an easy way to chunk/batch the large menus, process them, and then put them back together as one JSON?
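For the chunking idea, here's roughly what I have in mind - just a rough sketch, not working code. It assumes pypdf for text extraction and the openai Python SDK, and the item schema / page-chunk size are placeholders I made up:

```python
import json
from openai import OpenAI
from pypdf import PdfReader

client = OpenAI()

PAGES_PER_CHUNK = 5  # placeholder: small enough that each response stays under the output limit


def extract_chunks(pdf_path: str, pages_per_chunk: int = PAGES_PER_CHUNK):
    """Split the PDF's text into chunks of a few pages each."""
    reader = PdfReader(pdf_path)
    pages = [page.extract_text() or "" for page in reader.pages]
    for i in range(0, len(pages), pages_per_chunk):
        yield "\n\n".join(pages[i:i + pages_per_chunk])


def parse_chunk(chunk_text: str) -> dict:
    """Ask GPT-4o to return the menu items in this chunk as JSON."""
    response = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},
        messages=[
            {
                "role": "system",
                "content": (
                    "Extract menu items from the text as JSON in the form "
                    '{"items": [{"name": "...", "description": "...", "price": "..."}]}'
                ),
            },
            {"role": "user", "content": chunk_text},
        ],
    )
    return json.loads(response.choices[0].message.content)


def parse_menu(pdf_path: str) -> dict:
    """Process each chunk separately, then merge the item lists into one JSON."""
    merged = {"items": []}
    for chunk in extract_chunks(pdf_path):
        merged["items"].extend(parse_chunk(chunk).get("items", []))
    return merged


if __name__ == "__main__":
    print(json.dumps(parse_menu("menu.pdf"), indent=2))
```

My worry with this is items that span a page break between chunks, and whether there's a cleaner way than stitching the JSON back together myself.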
Any help much appreciated. Thanks!