

refresh_powerbi

Call or Deploy refresh_powerbi?

✅ You can call this refresh_powerbi bigfunction directly from your Google Cloud Project (no install required).

  • This refresh_powerbi function is deployed in the bigfunctions GCP project in 39 datasets, one for each of the 39 BigQuery regions. You need to use the dataset in the same region as your datasets (otherwise you may get a "function not found" error).
  • The function is public, so it can be called by anyone. Just copy / paste the examples below in your BigQuery console. It just works!
  • You may prefer to deploy the BigFunction in your own project if you want to build and manage your own catalog of functions. This is particularly useful if you want to create private functions (for example, calling your internal APIs). Discover the framework

Public BigFunctions Datasets:

Region Dataset
eu bigfunctions.eu
us bigfunctions.us
europe-west1 bigfunctions.europe_west1
asia-east1 bigfunctions.asia_east1
... ...


Signature

refresh_powerbi(dataset_id, workspace_id, tenant_id, app_id, token_secret, custom_refresh_param)

Description

Refresh a Power BI dataset (semantic model) by its id dataset_id.

Use case:

After your models are refreshed in BigQuery, launch a Power BI dataset (semantic model) refresh directly from SQL.


Optional:

On Premium capacity, you can pass a JSON argument (XMLA-like) to launch a custom refresh (e.g. a Full refresh of a single table).


Encrypt your secrets

We advise NOT TO write your token in plain text in the token_secret argument.

Otherwise, it will be stored in plain text in your BigQuery logs for months.

Instead, you can use the following snippet to generate an encrypted version of token_secret that you can safely copy as the token_secret argument.

This public bigfunction (deployed on bigfunctions GCP project) will be able to decrypt it. But no one else can.

Examples

1. Refresh of a dataset

select bigfunctions.eu.refresh_powerbi('xxx-xxx-xxx', 'xxx-xxx-xxx', 'xxx-xxx-xxx', 'xxx-xxx-xxx', 'ENCRYPTED_SECRET(GvVm...)', null)
select bigfunctions.us.refresh_powerbi('xxx-xxx-xxx', 'xxx-xxx-xxx', 'xxx-xxx-xxx', 'xxx-xxx-xxx', 'ENCRYPTED_SECRET(GvVm...)', null)
select bigfunctions.europe_west1.refresh_powerbi('xxx-xxx-xxx', 'xxx-xxx-xxx', 'xxx-xxx-xxx', 'xxx-xxx-xxx', 'ENCRYPTED_SECRET(GvVm...)', null)
+----------+
| response |
+----------+
| ok       |
+----------+
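The calls above follow a single pattern that only varies by region dataset and arguments. As a convenience, here is a minimal Python sketch (the helper name and placeholder IDs are illustrative, not part of BigFunctions) that builds the statement, which you can then run with any BigQuery client:

```python
from typing import Optional

def build_refresh_powerbi_sql(
    region_dataset: str,
    dataset_id: str,
    workspace_id: str,
    tenant_id: str,
    app_id: str,
    token_secret: str,
    custom_refresh_param: Optional[str] = None,
) -> str:
    """Build the SELECT statement calling refresh_powerbi in the given region dataset.

    token_secret should be the ENCRYPTED_SECRET(...) string, never a raw token.
    """
    # Pass null for a default refresh, or a JSON literal for a custom (Premium-only) refresh.
    param = f"json '{custom_refresh_param}'" if custom_refresh_param else "null"
    args = ", ".join(f"'{v}'" for v in (dataset_id, workspace_id, tenant_id, app_id, token_secret))
    return f"select {region_dataset}.refresh_powerbi({args}, {param})"

# Run the resulting string with any BigQuery client,
# e.g. the bq CLI or google-cloud-bigquery's client.query(sql).
sql = build_refresh_powerbi_sql(
    "bigfunctions.eu", "xxx-xxx-xxx", "xxx-xxx-xxx",
    "xxx-xxx-xxx", "xxx-xxx-xxx", "ENCRYPTED_SECRET(GvVm...)",
)
print(sql)
```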

2. Custom refresh (XMLA-like) - Premium capacity only

select bigfunctions.eu.refresh_powerbi('xxx-xxx-xxx', 'xxx-xxx-xxx', 'xxx-xxx-xxx', 'xxx-xxx-xxx', 'ENCRYPTED_SECRET(GvVm...)', json '{ "type": "Full", "objects": [ { "table": "table_name" } ] }')
select bigfunctions.us.refresh_powerbi('xxx-xxx-xxx', 'xxx-xxx-xxx', 'xxx-xxx-xxx', 'xxx-xxx-xxx', 'ENCRYPTED_SECRET(GvVm...)', json '{ "type": "Full", "objects": [ { "table": "table_name" } ] }')
select bigfunctions.europe_west1.refresh_powerbi('xxx-xxx-xxx', 'xxx-xxx-xxx', 'xxx-xxx-xxx', 'xxx-xxx-xxx', 'ENCRYPTED_SECRET(GvVm...)', json '{ "type": "Full", "objects": [ { "table": "table_name" } ] }')
+----------+
| response |
+----------+
| ok       |
+----------+
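The custom refresh payload is ordinary JSON, so it can be built programmatically instead of hand-writing the literal. A small sketch (the helper name and table names are illustrative; the "type"/"objects" keys mirror the example above):

```python
import json

def full_refresh_of_tables(tables):
    """Build the XMLA-like payload for a Full refresh restricted to the given tables."""
    return json.dumps({"type": "Full", "objects": [{"table": t} for t in tables]})

# Embed the result in the last argument as: json '<payload>'
payload = full_refresh_of_tables(["sales", "customers"])
print(payload)
```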

Need help using refresh_powerbi?

The community can help! Join the conversation on Slack

For professional support, don't hesitate to chat with us.

Found a bug using refresh_powerbi?

If the function does not work as expected, please

  • report a bug so that it can be improved.
  • or open the discussion with the community on Slack.

For professional support, don't hesitate to chat with us.

Use cases

A common use case for the refresh_powerbi function is automating the refresh of a Power BI dataset after data in its connected BigQuery tables has been updated.

Scenario: Imagine you have a BigQuery data warehouse that is used as a source for a Power BI dashboard. You have a daily ETL process that updates several tables in BigQuery. After this process completes, you want to ensure that the Power BI dataset is refreshed so that the dashboard reflects the latest data.

Implementation: You could use an orchestration tool like Airflow, Cloud Composer, or Cloud Functions to schedule the ETL process and the subsequent Power BI dataset refresh. After the ETL tasks have successfully completed, a final task would call the refresh_powerbi function. This function would trigger the refresh of the Power BI dataset using the provided credentials and parameters.

Example (using Airflow):

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from datetime import datetime

# Replace these placeholder values with your actual configuration
your_region = "eu"
your_dataset_id = "xxx-xxx-xxx"
your_workspace_id = "xxx-xxx-xxx"
your_tenant_id = "xxx-xxx-xxx"
your_app_id = "xxx-xxx-xxx"
your_encrypted_token = "GvVm..."

with DAG(
    dag_id="refresh_powerbi_example",
    start_date=datetime(2023, 10, 26),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # ETL tasks (e.g., loading data into BigQuery)
    etl_task_1 = BigQueryInsertJobOperator(
        task_id="etl_task_1",
        configuration={
            "query": {
                "query": "your_etl_query_1",
                "useLegacySql": False,
            }
        },
    )

    etl_task_2 = BigQueryInsertJobOperator(
        task_id="etl_task_2",
        configuration={
            "query": {
                "query": "your_etl_query_2",
                "useLegacySql": False,
            }
        },
    )


    # Refresh Power BI dataset after ETL completes
    refresh_powerbi_task = BigQueryInsertJobOperator(
        task_id="refresh_powerbi",
        configuration={
            "query": {
                "query": f"""
                    SELECT bigfunctions.{your_region}.refresh_powerbi(
                        '{your_dataset_id}',
                        '{your_workspace_id}',
                        '{your_tenant_id}',
                        '{your_app_id}',
                        'ENCRYPTED_SECRET({your_encrypted_token})',
                        NULL
                    );
                """,
                "useLegacySql": False,
            }
        },
    )


    [etl_task_1, etl_task_2] >> refresh_powerbi_task

Replace the placeholder values with your actual configuration. This setup ensures that the Power BI dataset is automatically refreshed after the ETL process finishes, keeping the dashboard up-to-date. This automation simplifies data management and provides users with the most current insights.

Spread the word

BigFunctions is fully open-source. Help make it a success by spreading the word!
