export_to_datastore¶
Call or Deploy export_to_datastore?

✅ You can call this export_to_datastore bigfunction directly from your Google Cloud Project (no install required).

- This export_to_datastore function is deployed in the bigfunctions GCP project in 39 datasets, one for each of the 39 BigQuery regions. You need to use the dataset in the same region as your data (otherwise you may get a "function not found" error).
- The function is public, so it can be called by anyone. Just copy/paste the examples below into your BigQuery console. It just works!
- You may prefer to deploy the BigFunction in your own project if you want to build and manage your own catalog of functions. This is particularly useful if you want to create private functions (for example, calling your internal APIs). Discover the framework.
Public BigFunctions Datasets:

| Region | Dataset |
|---|---|
| eu | bigfunctions.eu |
| us | bigfunctions.us |
| europe-west1 | bigfunctions.europe_west1 |
| asia-east1 | bigfunctions.asia_east1 |
| ... | ... |
Description¶
Signature:

export_to_datastore(datastore_path, key, data)

Description:

Exports data to Datastore (Firestore in Datastore mode) at key under datastore_path (of the form project/database/namespace/kind).

💡 For this to work, bigfunction@bigfunctions.iam.gserviceaccount.com must have the Datastore User role in your project.
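The required role can be granted with the gcloud CLI. A minimal sketch, assuming your project id is your-project (substitute your own):

```shell
# Grant the BigFunctions service account the Datastore User role
# on your project, so the function can write entities to Datastore.
gcloud projects add-iam-policy-binding your-project \
  --member="serviceAccount:bigfunction@bigfunctions.iam.gserviceaccount.com" \
  --role="roles/datastore.user"
```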
Examples¶
1. Export data to the default database, in the default namespace, with an auto-generated key.

```sql
select bigfunctions.eu.export_to_datastore('your-project/default/default/users', null, json '{"name": "Marc Harris", "email": "marc@harris.com"}')
```

```sql
select bigfunctions.us.export_to_datastore('your-project/default/default/users', null, json '{"name": "Marc Harris", "email": "marc@harris.com"}')
```

```sql
select bigfunctions.europe_west1.export_to_datastore('your-project/default/default/users', null, json '{"name": "Marc Harris", "email": "marc@harris.com"}')
```

```
+------------------+
| key              |
+------------------+
| 4503604769587200 |
+------------------+
```
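Because the function is a scalar function, you can also call it once per row to export an entire table. A sketch under assumed names (the users table and its name and email columns are illustrative, not part of this function):

```sql
select bigfunctions.eu.export_to_datastore(
  'your-project/default/default/users',
  email,                           -- use the email column as the entity key
  to_json(struct(name, email))     -- entity properties as JSON
) as key
from `your-project.your_dataset.users`;
```

Each row produces one Datastore entity, and the query returns the key written for each row.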
2. Export data with email as the key.

```sql
select bigfunctions.eu.export_to_datastore('your-project/default/default/users', 'marc@harris.com', json '{"name": "Marc Harris"}')
```

```sql
select bigfunctions.us.export_to_datastore('your-project/default/default/users', 'marc@harris.com', json '{"name": "Marc Harris"}')
```

```sql
select bigfunctions.europe_west1.export_to_datastore('your-project/default/default/users', 'marc@harris.com', json '{"name": "Marc Harris"}')
```

```
+-----------------+
| key             |
+-----------------+
| marc@harris.com |
+-----------------+
```
Need help using export_to_datastore
?
The community can help! Join the conversation on Slack.
For professional support, don't hesitate to chat with us.
Found a bug using export_to_datastore
?
If the function does not work as expected, please
- report a bug so that it can be improved.
- or open a discussion with the community on Slack.
For professional support, don't hesitate to chat with us.
Use cases¶
This export_to_datastore function is useful for scenarios where you need to move or synchronize data from BigQuery to Datastore (Firestore in Datastore mode). Here are a few use cases:

- Materialized Views in Datastore: You might have complex analytical queries in BigQuery that produce aggregated data. Instead of recomputing these queries every time you need the results, you could use export_to_datastore to periodically store the aggregated data in Datastore. This makes accessing these aggregations much faster for applications that don't need the full power of BigQuery.
- Data Synchronization for Microservices: Imagine a microservice architecture where one service uses BigQuery for analytics and another service relies on Datastore for operational data. You can use this function to keep the relevant data synchronized between the two data stores. For example, BigQuery might store a user's purchase history, while Datastore stores their profile information. You can use export_to_datastore to update the Datastore profile with aggregated purchase statistics calculated in BigQuery.
- Creating Lookups for Real-time Applications: BigQuery is great for analytical workloads, but not ideal for low-latency lookups. If your application needs to quickly retrieve data based on a key, you can use export_to_datastore to create a lookup table in Datastore. For instance, you might have product information stored in BigQuery, and you want to quickly retrieve product details by their SKU. You could export the SKU and relevant product details to Datastore for faster retrieval by your application.
- Simplifying Data Access for Non-technical Users: Datastore often provides a simpler interface for accessing data compared to BigQuery, especially for users who are not familiar with SQL. You can use export_to_datastore to make specific datasets available in Datastore, allowing non-technical users to access and manipulate data more easily.
- Backup and Restore: While not a primary backup solution, export_to_datastore could be used in conjunction with other backup methods to create a copy of specific BigQuery data in Datastore, particularly for smaller, critical datasets.
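For instance, the materialized-view pattern above could be implemented as a scheduled query along these lines. A sketch under assumed names (the orders table and its user_id and amount columns are illustrative):

```sql
select bigfunctions.eu.export_to_datastore(
  'your-project/default/default/user_purchase_stats',  -- kind holding the aggregates
  cast(user_id as string),                             -- one entity per user
  to_json(struct(
    count(*) as order_count,
    sum(amount) as total_spent
  ))
)
from `your-project.your_dataset.orders`
group by user_id;
```

Running this on a schedule keeps a per-user aggregate entity in Datastore fresh without recomputing the aggregation on every application read.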
Example: Building a Real-time Product Catalog

Let's say you have product information stored in a BigQuery table called products. You want to display this information on your website, which requires low-latency data access. Here's how you could use export_to_datastore:

- BigQuery Query: Write a query that selects the relevant product information (e.g., product_id, name, price, description) from the products table.
- export_to_datastore Function: Use the function within your BigQuery query to export the results to Datastore. You would use the product_id as the key and the remaining product information as the data.
```sql
SELECT bigfunctions.us.export_to_datastore(
  'your-project/default/default/product_catalog',  -- datastore_path: project/database/namespace/kind
  CAST(product_id AS STRING),                      -- product ID as key
  TO_JSON(STRUCT(name, price, description))        -- product details as JSON
)
FROM `your-project.your_dataset.products`;
```
- Website Integration: Your website's backend can now efficiently retrieve product information from Datastore using the product_id as the key, providing a fast and responsive user experience.
This is just one example. The versatility of the export_to_datastore function allows you to bridge the gap between BigQuery's analytical capabilities and Datastore's operational strengths in a variety of situations.
Spread the word¶
BigFunctions is fully open-source. Help make it a success by spreading the word!