Hello everyone,

I'm completely new to Fabric. I'm working on a project that forecasts electricity prices for two different locations. It involves pulling data from a REST API and storing it in OneLake. I recently watched a video on pulling specific data from an API, but my project needs a different approach: I have to pull data from the API every 15 minutes and ingest this time-series data into OneLake in near real time.

Here's a summary of what I'm looking to accomplish:

1. Data ingestion pipeline: create a pipeline that ingests data from the API into OneLake every 15 minutes (a rough sketch of what I have in mind is below).
2. Handling multiple datasets: pull multiple datasets from the API and store them in OneLake, one per node/location (see the second sketch below).
3. Machine learning integration: once the data is in OneLake, implement the machine-learning part of the project (third sketch below).

I'm struggling to find resources or videos that address these specific needs. Could someone explain how to set up this pipeline, or point me toward resources or procedures that would help me complete this project? Your assistance would be greatly appreciated. Thank you!
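For context, here's a rough sketch of what I'm imagining for a single ingestion run in a Fabric notebook. The endpoint URL, query parameter, and JSON shape are all placeholders, since I don't know the real API's contract yet:

```python
# Sketch of one ingestion run in a Fabric notebook (Python).
# The endpoint, the "node" query parameter, and the JSON shape are
# placeholders until I know the real API.
import requests
import pandas as pd

API_URL = "https://example.com/api/prices"  # placeholder endpoint

def fetch_prices(node_id: str) -> pd.DataFrame:
    """Pull the latest price records for one node/location."""
    resp = requests.get(API_URL, params={"node": node_id}, timeout=30)
    resp.raise_for_status()
    df = pd.DataFrame(resp.json())                   # assumes a JSON array of records
    df["node_id"] = node_id                          # tag each row with its location
    df["ingested_at"] = pd.Timestamp.now(tz="UTC")   # audit column for debugging
    return df

# Append this run's rows to a Delta table in the lakehouse attached to the
# notebook; `spark` is available by default in Fabric notebooks.
pdf = fetch_prices("NODE_A")
spark.createDataFrame(pdf).write.mode("append").format("delta").saveAsTable("electricity_prices")
```

My understanding is that the 15-minute cadence wouldn't live in the code itself: I'd schedule the notebook (or a Data Pipeline that runs it) on a recurring trigger, which as far as I can tell supports 15-minute intervals. Please correct me if that's wrong.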
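For the multiple-locations part, I'm currently assuming one shared Delta table with a node_id column rather than a separate table per location, so the ML step can filter per node later. This sketch reuses fetch_prices from above; NODE_IDS is a placeholder list:

```python
# Sketch for ingesting several nodes/locations in a single run into one
# shared Delta table. NODE_IDS is a placeholder; reuses fetch_prices above.
NODE_IDS = ["NODE_A", "NODE_B"]

frames = [fetch_prices(node_id) for node_id in NODE_IDS]
combined = pd.concat(frames, ignore_index=True)

(spark.createDataFrame(combined)
    .write.mode("append")
    .format("delta")
    .saveAsTable("electricity_prices"))
```

Is one table keyed by node_id the right call here, or would separate tables per location be more idiomatic in OneLake?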
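And for step 3, here's a very rough idea of how I'd read the data back for modeling. The "timestamp" and "price" column names are assumptions about the API payload, and the model is deliberately naive, just to show the read path:

```python
# Very rough idea of the ML step: read the ingested table back from the
# lakehouse and fit a naive lag-based regressor for one node. "timestamp"
# and "price" are assumed column names; a real model would need proper
# features and a train/test split.
from sklearn.linear_model import LinearRegression

df = spark.read.table("electricity_prices").toPandas()
node_df = df[df["node_id"] == "NODE_A"].sort_values("timestamp")

node_df["lag_1"] = node_df["price"].shift(1)   # price from the previous 15-minute interval
node_df = node_df.dropna(subset=["lag_1"])

model = LinearRegression().fit(node_df[["lag_1"]], node_df["price"])
next_price = model.predict(node_df[["lag_1"]].tail(1))  # one-step-ahead forecast
```

Any pointers on whether these sketches are headed in the right direction would be hugely helpful.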