Hello everyone, this is my first post. I recently joined a company in my first official data engineering role, and I'm looking for some help here. I've been tasked with transferring ~150M records from an RDS MySQL table to S3 (as .parquet files). The table is so hard to query that data is dropped daily, keeping only the last 90 days, and that's one of the problems with the migration: querying it directly is nearly impossible. My first approach was a simple Lambda with mysql-connector and a Python script that chunks the table (rough sketch at the end of the post), but that would take me about 2 days to run. The idea is also to get this data somewhere else before thinking about a Lakehouse solution.

My questions are:

- What services do you recommend to make this one-time migration as fast and smooth as possible? My first thought is Glue (I've used it before, but for a different purpose), or DMS (which I've never used).
- What ETL would you propose to run this process daily (~1.5M records/day)? Glue comes to mind again, if I'm successful with the first bullet point.
- Lastly, this data is meant to be used for analytics. Initially it will sit in S3 and be queried with Athena while the team figures out which KPIs they want to track; in the future, the idea is to move it somewhere that makes it fast to query and to build models on. The whole company environment is on AWS, so my first thought is Redshift, but I really like the efficiency and how Google BigQuery handles this amount of data.

Thank you so much for reading!
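Edit: for context, here's a simplified version of what my chunked Lambda approach looks like. Table name (`events`), key column (`id`), bucket, and connection details are all placeholders, so treat this as a sketch rather than my exact code:

```python
# Sketch: keyset-paginated export from RDS MySQL to Parquet files on S3.
# Assumes the table has an auto-increment primary key `id`; names below
# are placeholders, not my real schema.
import mysql.connector
import pyarrow as pa
import pyarrow.parquet as pq

CHUNK_SIZE = 100_000          # rows per Parquet file; tune for Lambda memory
BUCKET = "my-raw-bucket"      # placeholder

conn = mysql.connector.connect(
    host="my-rds-endpoint", user="reader", password="...", database="mydb"
)
cursor = conn.cursor(dictionary=True)

last_id = 0
part = 0
while True:
    # Keyset pagination (WHERE id > last_id) instead of LIMIT/OFFSET,
    # since OFFSET gets slower and slower on a 150M-row table.
    cursor.execute(
        "SELECT * FROM events WHERE id > %s ORDER BY id LIMIT %s",
        (last_id, CHUNK_SIZE),
    )
    rows = cursor.fetchall()
    if not rows:
        break
    # Convert the list of dicts to an Arrow table and write one Parquet
    # part per chunk straight to S3 (credentials come from the Lambda role).
    table = pa.Table.from_pylist(rows)
    pq.write_table(table, f"s3://{BUCKET}/events/part-{part:05d}.parquet")
    last_id = rows[-1]["id"]
    part += 1

cursor.close()
conn.close()
```

Even with chunking like this, a single Lambda doing it sequentially is what gives me the ~2 day estimate, which is why I'm asking about Glue/DMS for the one-time load.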