Hello everyone,
I am working on a project where I need to load around 100 Excel files, each approximately 3 MB, into a Lakehouse. I have been using the Copy Activity in a data pipeline to accomplish this.
However, each time I try to load the files, the activity fails with the error message below, and despite my best efforts I haven’t been able to find a solution.
Has anyone in this forum encountered a similar issue and could provide some guidance? Alternatively, if there is a more efficient way to load and append multiple Excel files into a Lakehouse, I would greatly appreciate any suggestions.
Thank you in advance for your help!
ErrorCode=ExcelInvalidDataSize,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The excel file 'EXPORT_2022.XLSX' size is big and will lead interactive operation timeout, please use sample file to have import schema, preview data and get worksheet list.,Source=Microsoft.DataTransfer.ClientLibrary,'
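In case a code-based alternative is useful to anyone reading this: the error suggests the interactive schema import is timing out on the full file, so pointing the Copy Activity's schema import at a small sample file is one workaround. Another option is to bypass the Copy Activity and read and append the files in a notebook instead. Below is a minimal pandas sketch; the folder path and file pattern are assumptions for illustration, not the actual setup:

```python
from pathlib import Path

import pandas as pd


def combine(frames: list[pd.DataFrame]) -> pd.DataFrame:
    """Append a list of DataFrames into one, resetting the index."""
    return pd.concat(frames, ignore_index=True)


def load_excel_folder(folder: str, pattern: str = "*.xlsx") -> pd.DataFrame:
    """Read every Excel file in `folder` and append them into one DataFrame.

    Assumes all files share the same column layout. In a Fabric notebook,
    `folder` might be something like '/lakehouse/default/Files/exports'
    (a hypothetical path, not taken from the post above).
    """
    frames = [pd.read_excel(p) for p in sorted(Path(folder).glob(pattern))]
    return combine(frames)


# Demo of the append step with in-memory frames (no Excel files needed):
df = combine([
    pd.DataFrame({"year": [2021], "value": [10]}),
    pd.DataFrame({"year": [2022], "value": [20]}),
])
print(len(df))  # 2 rows after appending
```

From there the combined DataFrame could be written out as a Lakehouse table (for example via Spark in a Fabric notebook), which avoids the per-file interactive schema inference that the Copy Activity error complains about.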