get_json¶
`get_json(url, headers)`
Description¶
GET JSON data from `url`.
Examples¶
Call or Deploy get_json?¶

Call get_json directly

The easiest way to use bigfunctions: the get_json function is deployed in 39 public datasets, one for each of the 39 BigQuery regions, and can be called by anyone. Just copy/paste the examples below into your BigQuery console. It just works!

(Use the dataset located in the same region as your datasets; otherwise you may get a "function not found" error.)
Public BigFunctions Datasets
Region | Dataset
---|---
eu | bigfunctions.eu
us | bigfunctions.us
europe-west1 | bigfunctions.europe_west1
asia-east1 | bigfunctions.asia_east1
... | ...
Deploy get_json in your project

Why deploy?

- You may prefer to deploy get_json in your own project to build and manage your own catalog of functions.
- This is particularly useful if you want to create private functions (for example, functions calling your internal APIs).
- Get started by reading the framework page.
Deployment

The get_json function can be deployed with:

```bash
pip install bigfunctions
bigfun get get_json
bigfun deploy get_json
```
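Once deployed, the function is called from its new location rather than from a public dataset. A minimal sketch, assuming you deployed it into a dataset named `your_dataset` in a project named `your-project` (both hypothetical names):

```sql
select `your-project`.your_dataset.get_json('https://api.github.com/repos/unytics/bigfunctions', null)
```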
```sql
select bigfunctions.eu.get_json('https://api.github.com/repos/unytics/bigfunctions', null)
```

```sql
select bigfunctions.us.get_json('https://api.github.com/repos/unytics/bigfunctions', null)
```

```sql
select bigfunctions.europe_west1.get_json('https://api.github.com/repos/unytics/bigfunctions', null)
```

```
+-------+
| data  |
+-------+
| {...} |
+-------+
```
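The second argument is reserved for HTTP headers; the examples above pass `null` (no custom headers). The exact format accepted for non-null headers is not documented on this page, so the JSON-string form below is an assumption, shown only to illustrate the idea:

```sql
select bigfunctions.eu.get_json(
  'https://api.github.com/repos/unytics/bigfunctions',
  '{"Accept": "application/vnd.github+json"}'  -- assumed headers format
)
```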
Need help or found a bug using get_json?¶

Get help using get_json

The community can help! Join the conversation on Slack.

We also provide professional support.

Report a bug about get_json

If the function does not work as expected, please:

- report a bug so that it can be improved,
- or open a discussion with the community on Slack.

We also provide professional support.
Use cases¶
A use case for the get_json function is enriching data in BigQuery with information from an external API.
Scenario: You have a table in BigQuery containing information about GitHub repositories, including their names. You want to enrich this data with details like the number of stars, forks, and open issues for each repository. The GitHub API provides this information in JSON format.
Implementation:

- BigQuery table: Let's assume your BigQuery table is named `github_repos` and has a column named `repo_name` containing the names of the repositories (e.g., "unytics/bigfunctions").
- BigQuery query using get_json: You can use the following query to fetch data from the GitHub API and extract the desired information:
```sql
SELECT
  repo_name,
  JSON_EXTRACT_SCALAR(api_response, '$.stargazers_count') AS stars,
  JSON_EXTRACT_SCALAR(api_response, '$.forks_count') AS forks,
  JSON_EXTRACT_SCALAR(api_response, '$.open_issues_count') AS open_issues
FROM (
  SELECT
    repo_name,
    -- call the API once per row instead of once per extracted field
    get_json(CONCAT('https://api.github.com/repos/', repo_name), NULL) AS api_response
  FROM `your-project.your-dataset.github_repos`
);
```
Explanation:

- `CONCAT('https://api.github.com/repos/', repo_name)` dynamically constructs the URL for each repository's API endpoint.
- `get_json(..., NULL)` fetches the JSON data from the constructed URL. The second argument `NULL` indicates that no custom headers are needed for this request.
- `JSON_EXTRACT_SCALAR(..., '$.stargazers_count')` extracts the value associated with the key `stargazers_count` from the JSON response. Similarly, we extract `forks_count` and `open_issues_count`.
- Remember to replace `your-project.your-dataset.github_repos` with the actual path to your BigQuery table and to select the correct regional dataset for bigfunctions (e.g., `bigfunctions.us`, `bigfunctions.eu`).
Result: This query will produce a new table with the original `repo_name` column and the newly fetched `stars`, `forks`, and `open_issues` columns.
This example demonstrates how `get_json` can be used to integrate external API data directly into BigQuery for analysis and reporting, avoiding the need for intermediate data extraction and loading steps. You can adapt this pattern to interact with other APIs that provide JSON data.
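For instance, the same pattern could enrich a table of GitHub usernames. A sketch, where the table `github_users` and its `username` column are hypothetical names, and `followers` is a documented field of the GitHub users API response:

```sql
SELECT
  username,
  JSON_EXTRACT_SCALAR(
    bigfunctions.us.get_json(CONCAT('https://api.github.com/users/', username), NULL),
    '$.followers'
  ) AS followers
FROM `your-project.your-dataset.github_users`;
```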
Spread the word!¶
BigFunctions is fully open-source. Help make it a success by spreading the word!