From 1adf7426c2fdf5b805eaf13b46a5275c7293234e Mon Sep 17 00:00:00 2001
From: Will Barrett
Date: Tue, 29 Oct 2019 12:31:45 -0700
Subject: [PATCH] Provide documentation for using a Service Account to connect
 to BigQuery (#8462)

* Provide documentation for using a Service Account to connect to BigQuery

* Alter line wrapping for shorter lines

* Whitespace commit to trigger another build (flake)

* Another meaningless whitespace change to trigger another build
---
 docs/installation.rst | 32 ++++++++++++++++++++++++++++++++
 1 file changed, 32 insertions(+)

diff --git a/docs/installation.rst b/docs/installation.rst
index d41a0cdf1f..2eb28fac61 100644
--- a/docs/installation.rst
+++ b/docs/installation.rst
@@ -434,8 +434,40 @@ The connection string for BigQuery looks like this ::
 
     bigquery://{project_id}
 
+Additionally, you will need to configure authentication via a
+Service Account. Create your Service Account via the Google
+Cloud Platform control panel, grant it access to the appropriate
+BigQuery datasets, and download the JSON configuration file
+for the service account. In Superset, add a JSON blob to
+the "Secure Extra" field in the database configuration page
+with the following format ::
+
+    {
+        "credentials_info": <contents of credentials JSON file>
+    }
+
+The resulting file should have this structure ::
+
+    {
+        "credentials_info": {
+            "type": "service_account",
+            "project_id": "...",
+            "private_key_id": "...",
+            "private_key": "...",
+            "client_email": "...",
+            "client_id": "...",
+            "auth_uri": "...",
+            "token_uri": "...",
+            "auth_provider_x509_cert_url": "...",
+            "client_x509_cert_url": "..."
+        }
+    }
+
+You should then be able to connect to your BigQuery datasets.
+
 To be able to upload data, e.g. sample data, the python library `pandas_gbq` is required.
+
 Elasticsearch
 -------------