I am working with a large dataset stored in Google BigQuery and need to load a table of roughly 50GB into Power BI Premium. The table is rewritten in full every day, so a complete daily refresh is required.
Challenges:
- BigQuery’s 10GB Query Result Limit: The query result exceeds BigQuery’s maximum response size, so I cannot fetch the entire table in a single query (see the sketch after this list).
 
- Power BI Import Mode: The model uses Import mode, and because the whole table changes, incremental refresh and partial updates are not viable; the dataset must be fully refreshed daily.
 
- Daily Full Table Updates: The entire table changes daily, so I cannot rely on refreshing only recent partitions while keeping historical partitions static.
 
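For concreteness, the limit in question is BigQuery’s maximum response size for query results. The server-side workaround I am aware of is to write the result to a destination table (which lifts the response-size cap) and export it to Cloud Storage. A minimal Python sketch of that staging step, with all project, dataset, and bucket names as placeholders:

```python
from google.cloud import bigquery

# Placeholder identifiers -- substitute real project/dataset/bucket names.
PROJECT = "my-project"
SOURCE = "my-project.prod.big_table"
STAGING = "my-project.staging.big_table_snapshot"
GCS_URI = "gs://my-bucket/big_table/part-*.parquet"

client = bigquery.Client(project=PROJECT)

# Writing the result to a destination table bypasses the maximum
# response size that applies to ordinary query results.
job_config = bigquery.QueryJobConfig(
    destination=STAGING,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
client.query(f"SELECT * FROM `{SOURCE}`", job_config=job_config).result()

# Export the snapshot to Cloud Storage as Parquet; the wildcard shards
# the output into multiple files to stay under per-file export limits.
extract_config = bigquery.ExtractJobConfig(
    destination_format=bigquery.DestinationFormat.PARQUET
)
client.extract_table(STAGING, GCS_URI, job_config=extract_config).result()
```

Staging like this gets the data out of BigQuery, but it still leaves open the question of how best to get ~50GB into the Power BI dataset every day.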
Constraints:
- I am not interested in approaches that reduce the table size below 10GB (e.g., aggregating data, dropping columns, or filtering). Assume all such optimizations have already been implemented.
 
- The solution must support scheduled refresh in Power BI Premium.
 
Question:
What is a scalable, robust way to load tables that exceed BigQuery’s 10GB query result limit into Power BI Premium while refreshing the entire table daily?
I am looking for practical and proven approaches to:
- Overcome BigQuery’s query result size limit.
 
- Automate the workflow for reliable daily updates (a sketch of what I have in mind follows this list).
 
- Leverage Power BI Premium’s capabilities for large datasets.
 
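To illustrate the kind of automation I mean for the second point: once the data is staged, the refresh could be triggered through the Power BI REST API. A minimal sketch, assuming a service principal that has been granted access to the workspace (all IDs and secrets are placeholders):

```python
import msal
import requests

# Placeholder values -- substitute real tenant/app/workspace/dataset IDs.
TENANT_ID = "00000000-0000-0000-0000-000000000000"
CLIENT_ID = "11111111-1111-1111-1111-111111111111"
CLIENT_SECRET = "app-secret"
GROUP_ID = "workspace-guid"
DATASET_ID = "dataset-guid"

# Acquire an app-only token for the Power BI service.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)

# Trigger a full refresh of the dataset. Service principals cannot
# use mail notifications, hence NoNotification.
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    json={"notifyOption": "NoNotification"},
)
resp.raise_for_status()
```

On the third point, I know Premium offers the large dataset storage format and the XMLA endpoint, which seem relevant at ~50GB, but I have not found a proven end-to-end pattern that ties all of this together.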
Additional Details:
- Data Source: Google BigQuery
 
- Table Size: ~50GB
 
- Power BI Environment: Premium
 
- Frequency: Daily full-table refresh required
 
I’d greatly appreciate any insights, examples, or step-by-step approaches to address this challenge.