Provide documentation for using a Service Account to connect to BigQuery (#8462)

* Provide documentation for using a Service Account to connect to BigQuery

* Alter line wrapping for shorter lines

* Whitespace commit to trigger another build (flake)

* Another meaningless whitespace change to trigger another build
Will Barrett 2019-10-29 12:31:45 -07:00 committed by Maxime Beauchemin
parent 8b74745f9e
commit 1adf7426c2
1 changed files with 32 additions and 0 deletions


@@ -434,8 +434,40 @@ The connection string for BigQuery looks like this ::
    bigquery://{project_id}
Additionally, you will need to configure authentication via a
Service Account. Create your Service Account via the Google
Cloud Platform control panel, provide it access to the appropriate
BigQuery datasets, and download the JSON configuration file
for the service account. In Superset, add a JSON blob to
the "Secure Extra" field in the database configuration page
with the following format ::
    {
        "credentials_info": <contents of credentials JSON file>
    }
The resulting file should have this structure ::
    {
        "credentials_info": {
            "type": "service_account",
            "project_id": "...",
            "private_key_id": "...",
            "private_key": "...",
            "client_email": "...",
            "client_id": "...",
            "auth_uri": "...",
            "token_uri": "...",
            "auth_provider_x509_cert_url": "...",
            "client_x509_cert_url": "..."
        }
    }
You should then be able to connect to your BigQuery datasets.
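As an illustration, the "Secure Extra" blob can be assembled from the
downloaded credentials file with plain Python; the file name and the
sample field values below are hypothetical ::

```python
import json

def build_secure_extra(credentials: dict) -> str:
    """Wrap service-account credentials in the "Secure Extra"
    format expected by Superset: {"credentials_info": {...}}."""
    return json.dumps({"credentials_info": credentials}, indent=2)

# In practice, load the JSON file downloaded from Google Cloud Platform:
#   with open("service-account.json") as f:
#       secure_extra = build_secure_extra(json.load(f))
sample = {"type": "service_account", "project_id": "my-project"}
print(build_secure_extra(sample))
```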
To upload data to BigQuery (e.g. sample data), the Python library ``pandas_gbq`` is required.
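For completeness, both the BigQuery SQLAlchemy dialect used by Superset and
``pandas_gbq`` can be installed with pip. The package names below reflect the
ecosystem at the time of this commit; adjust to your environment ::

```shell
# Install the BigQuery SQLAlchemy dialect and the pandas_gbq
# library needed for uploading data to BigQuery
pip install pybigquery pandas_gbq
```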
Elasticsearch
-------------