mirror of https://github.com/apache/superset.git
Provide documentation for using a Service Account to connect to BigQuery (#8462)
* Provide documentation for using a Service Account to connect to BigQuery
* Alter line wrapping for shorter lines
* Whitespace commit to trigger another build (flake)
* Another meaningless whitespace change to trigger another build
This commit is contained in:
parent
8b74745f9e
commit
1adf7426c2
@@ -434,8 +434,40 @@ The connection string for BigQuery looks like this ::

    bigquery://{project_id}

Additionally, you will need to configure authentication via a
Service Account. Create your Service Account via the Google
Cloud Platform control panel, grant it access to the appropriate
BigQuery datasets, and download the JSON configuration file
for the service account. In Superset, add a JSON blob to
the "Secure Extra" field on the database configuration page
with the following format ::

    {
        "credentials_info": <contents of credentials JSON file>
    }
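The Secure Extra blob is simply the downloaded credentials JSON wrapped under a single `credentials_info` key. A minimal sketch of producing that blob (the helper name and the inline credentials dict are illustrative, not part of Superset's API; in practice you would load the dict from the JSON file downloaded from Google Cloud):

```python
import json


def make_secure_extra(credentials: dict) -> str:
    """Wrap a service-account credentials dict in the
    "credentials_info" envelope expected in Secure Extra."""
    return json.dumps({"credentials_info": credentials}, indent=4)


# Hypothetical, truncated credentials for illustration only.
# In practice: credentials = json.load(open("service-account.json"))
credentials = {
    "type": "service_account",
    "project_id": "my-project",
}

print(make_secure_extra(credentials))
```

The printed string is what gets pasted into the "Secure Extra" field.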
The resulting field should have this structure ::

    {
        "credentials_info": {
            "type": "service_account",
            "project_id": "...",
            "private_key_id": "...",
            "private_key": "...",
            "client_email": "...",
            "client_id": "...",
            "auth_uri": "...",
            "token_uri": "...",
            "auth_provider_x509_cert_url": "...",
            "client_x509_cert_url": "..."
        }
    }

You should then be able to connect to your BigQuery datasets.

To be able to upload data, e.g. sample data, the Python library `pandas_gbq` is required.
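A quick way to confirm that `pandas_gbq` is installed in the environment where Superset runs (the upload call shown in the comment is a sketch with placeholder table and project names, not a command from this document):

```python
import importlib.util


def has_pandas_gbq() -> bool:
    # True if pandas_gbq can be imported in this environment
    return importlib.util.find_spec("pandas_gbq") is not None


# If available, uploading a DataFrame looks roughly like:
#   import pandas_gbq
#   pandas_gbq.to_gbq(df, "my_dataset.my_table", project_id="my-project")
# ("my_dataset.my_table" and "my-project" are placeholders.)
print(has_pandas_gbq())
```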
Elasticsearch
-------------