mirror of https://github.com/apache/superset.git
Compare commits
244 Commits
8ab058ae0f
...
6cb70f2b7e
Author | SHA1 | Date |
---|---|---|
GerbenvdHuizen | 6cb70f2b7e | |
Maxime Beauchemin | f5843fe588 | |
Maxime Beauchemin | 49231da42f | |
Frank Zimper | 517f254726 | |
dependabot[bot] | f95d9cde40 | |
Lily Kuang | 49992dd9d2 | |
Asaf Levy | 3e74ff174c | |
John Bodley | b4c4ab7790 | |
Đỗ Trọng Hải | 5331dc740a | |
John Bodley | 27952e7057 | |
John Bodley | 0ce5864fc7 | |
Đỗ Trọng Hải | 593c653ab5 | |
John Bodley | d36bccdc8c | |
Maxime Beauchemin | 513852b7c3 | |
John Bodley | e94360486e | |
dependabot[bot] | b17db6d669 | |
dependabot[bot] | f4b6c3049b | |
dependabot[bot] | 55391bb587 | |
Beto Dealmeida | 38e2843b24 | |
JUST.in DO IT | 7c8423a522 | |
Maxime Beauchemin | ec8351d336 | |
Evan Rusackas | e4f93b293f | |
Maxime Beauchemin | 2b4b771449 | |
Maxime Beauchemin | 538d1bb245 | |
Maxime Beauchemin | 3ac387bb66 | |
Beto Dealmeida | fe37d914e5 | |
Maxime Beauchemin | 51da5adbc7 | |
Evan Rusackas | 3cc8434c5a | |
Radek Antoniuk | c641bbfb9e | |
Đỗ Trọng Hải | 2e9cc654ef | |
Mathias Bögl | f03de27a92 | |
Ross Mabbett | 601896b1fc | |
Ross Mabbett | 2e5f3ed851 | |
Ross Mabbett | a38dc90abe | |
Ross Mabbett | efda57e8a5 | |
JUST.in DO IT | 44690fb299 | |
Maxime Beauchemin | f9f0bc687d | |
Ross Mabbett | 4d2247a7e1 | |
JUST.in DO IT | 743c0bde7e | |
dependabot[bot] | c975f97ce8 | |
Evan Rusackas | 76d897eaa2 | |
Maxime Beauchemin | a08c24c4aa | |
Sam Firke | fca3a525d0 | |
Sam Firke | db5edb3a42 | |
Sam Firke | 3a2a930ad3 | |
JUST.in DO IT | 7e94dc5b40 | |
JUST.in DO IT | cdbf8f394a | |
Beto Dealmeida | 173d5d09bf | |
John Bodley | c5e7d870f0 | |
Evan Rusackas | 1e47e65ac5 | |
Ville Brofeldt | 271510f0a8 | |
Martin Spudich | 63afa24c11 | |
jdrui | ce1d18e534 | |
Shao Yu-Lung (Allen) | 4afeabe042 | |
Beto Dealmeida | 6cf681df68 | |
Cykablyat | 52f8734662 | |
Maxime Beauchemin | 7b40b6426c | |
Maxime Beauchemin | 2d63722150 | |
Braum | e8a678b75a | |
John Bodley | bc65c245fe | |
Geido | a9075fdb1f | |
Maxime Beauchemin | 8baf754615 | |
Abhishek-kumar-samsung | cf90def462 | |
Maxime Beauchemin | 9db431b430 | |
dependabot[bot] | f155138659 | |
dependabot[bot] | efc1ab83d4 | |
Maxime Beauchemin | 68c77d6e9f | |
Maxime Beauchemin | 063914af04 | |
Jack | b7f3e0bb50 | |
Daniel Vaz Gaspar | de82d90b9c | |
Maxime Beauchemin | cfc440c56c | |
Elizabeth Thompson | 83fedcc9ea | |
Omar Al-Ithawi | cddcff2c63 | |
Lennard Scheibel | e889f17421 | |
dependabot[bot] | c23876d536 | |
dependabot[bot] | fe2be5ffea | |
dependabot[bot] | 1ef839ca6d | |
dependabot[bot] | f5bd183cc5 | |
dependabot[bot] | 7906e7b5ff | |
dependabot[bot] | 11dbae78cb | |
dependabot[bot] | 86d7b04eaa | |
dependabot[bot] | bfa2c46430 | |
Daniel Vaz Gaspar | e465876ed4 | |
Jon Edmiston | 37f900a264 | |
dependabot[bot] | 783dbb5040 | |
dependabot[bot] | 452e26ea79 | |
dependabot[bot] | 529954fbcb | |
dependabot[bot] | b168520ad2 | |
dependabot[bot] | 7b21e8fbae | |
dependabot[bot] | 76568807c4 | |
dependabot[bot] | 3f86df1445 | |
dependabot[bot] | 95b3f31d9c | |
dependabot[bot] | 17a65b1e03 | |
dependabot[bot] | 0f0f299b3e | |
dependabot[bot] | 6c045f4228 | |
Maxime Beauchemin | 947391778e | |
dependabot[bot] | 4c1e2bcc7a | |
dependabot[bot] | 6748c4361f | |
dependabot[bot] | 7e91d18853 | |
John Bodley | 2f11f66167 | |
Edgar Ulloa | 7263c7cb47 | |
Geido | 9ee56bbc5b | |
dependabot[bot] | 96cea5aa1f | |
dependabot[bot] | 9967f52657 | |
dependabot[bot] | 84ef17d818 | |
Braum | fda9c60e25 | |
Beto Dealmeida | 68a982dfe6 | |
Geido | 69a7bfc88d | |
Evan Rusackas | 0e096e8001 | |
Daniel Vaz Gaspar | 5ece57bd34 | |
Evan Rusackas | 8a390f03e3 | |
Geido | f3c538a3dd | |
dependabot[bot] | cd15655ed3 | |
dependabot[bot] | c457add848 | |
Evan Rusackas | 19170d94c8 | |
Daniel Vaz Gaspar | 594e5a50a3 | |
Maxime Beauchemin | 3310315d4b | |
Evan Rusackas | 80f76947d1 | |
Geido | 89da4f82d3 | |
Maxime Beauchemin | 5a9ddbba2e | |
Evan Rusackas | 181a901f75 | |
Michael S. Molina | aab645e01c | |
Maxime Beauchemin | e9c0ca545f | |
Michael S. Molina | 6e01a68276 | |
Evan Rusackas | 4b0452b87b | |
Maxime Beauchemin | de9daf7ad9 | |
Maxime Beauchemin | c225e17a75 | |
Elizabeth Thompson | 99c414e4da | |
Maxime Beauchemin | 0c12369084 | |
Siva Kumar Edupuganti | 8538796128 | |
Geido | 2a62c40526 | |
Guen Prawiroatmodjo | 08aaebbf7c | |
Michael S. Molina | caad29b5b3 | |
Daniel Vaz Gaspar | 54387b4589 | |
Maxime Beauchemin | 40e77be813 | |
Maxime Beauchemin | 8afe973968 | |
Beto Dealmeida | 99a1601aea | |
Michael S. Molina | 06077d42a8 | |
Antonio Rivero | 6844735a45 | |
Sam Firke | 02b69709bb | |
sowo | 35c8b7a162 | |
soniagtm | 4f363e1180 | |
Artem Gusev | 7e679d56ea | |
Jack | aef325a416 | |
Maxime Beauchemin | 9998a11fdd | |
Maxime Beauchemin | cd136ad847 | |
Maxime Beauchemin | a29cdefedf | |
Maxime Beauchemin | a6d16ed477 | |
Maxime Beauchemin | dea430649d | |
Evan Rusackas | 9423d59132 | |
Evan Rusackas | cbfdba2ca6 | |
Jack | eb4ca010ae | |
Evan Rusackas | 8200261506 | |
Evan Rusackas | 601d011986 | |
Evan Rusackas | c5c5f4dbc1 | |
Evan Rusackas | 717a3991f4 | |
Geido | 4202fba0f1 | |
dependabot[bot] | 71d174bad6 | |
Evan Rusackas | a5e65d572a | |
lodu | 93b83febc2 | |
John Bodley | 481a63da55 | |
Kamil Gabryjelski | ae0f2ce3c1 | |
Kamil Gabryjelski | 4ecfce98f6 | |
Maxime Beauchemin | e80d194b8f | |
Evan Rusackas | 7c8e1bb46e | |
Michael S. Molina | 996cced3d4 | |
Maxime Beauchemin | 5377b6cb2f | |
listeng | c5b7f7a08c | |
Elizabeth Thompson | 34b1db219c | |
Michael S. Molina | 662c1ed618 | |
Brad Greenlee | 4428bde024 | |
Spencer Torres | a1983e468b | |
JUST.in DO IT | eda304bda9 | |
Beto Dealmeida | 9377227e06 | |
joshkoeneHawking | 62433c14a7 | |
dependabot[bot] | bbe209a9e8 | |
Evan Rusackas | 3e6d966513 | |
Daniel Vaz Gaspar | 559605e393 | |
dependabot[bot] | 31d0c542e2 | |
dependabot[bot] | 104299a211 | |
dependabot[bot] | 47d629f50c | |
Jamie King | fb919c718d | |
Ross Mabbett | 265390c243 | |
listeng | ebcf4e044b | |
John Bodley | 5ed48760fb | |
John Bodley | 601432ad82 | |
dependabot[bot] | a9681fa3f3 | |
dependabot[bot] | e5837b46e5 | |
dependabot[bot] | 949e6b52e6 | |
Lily Kuang | cfa0556df7 | |
John Bodley | 27acc0b133 | |
dependabot[bot] | 5beda309ab | |
dependabot[bot] | 394abced43 | |
dependabot[bot] | 9a4bdeabbd | |
dependabot[bot] | 3f24083ed6 | |
dependabot[bot] | 5cde275965 | |
dependabot[bot] | ebdf1bbdd5 | |
dependabot[bot] | c661518bb1 | |
github-actions[bot] | 870e94809c | |
dependabot[bot] | 1b9e2581d2 | |
Kamil Gabryjelski | a498d6d10f | |
John Bodley | 30bc8f06dc | |
Michael S. Molina | 24fc2b67d8 | |
github-actions[bot] | 2a06c08c6b | |
dependabot[bot] | 6e8ea2753b | |
John Bodley | c38529741e | |
dependabot[bot] | 848a7ffbf3 | |
Kamil Gabryjelski | c3149994ac | |
dependabot[bot] | d318df96ae | |
github-actions[bot] | 208afc96a1 | |
github-actions[bot] | 6052ef656d | |
github-actions[bot] | 87e1c3f2fd | |
github-actions[bot] | 9d0928633a | |
github-actions[bot] | 1d3fdc74dc | |
github-actions[bot] | 271fbc064e | |
github-actions[bot] | 976b098421 | |
github-actions[bot] | 9c3915d42c | |
github-actions[bot] | e39bb57c07 | |
github-actions[bot] | 08700f8cb9 | |
Maxime Beauchemin | 6683d292ce | |
Braum | ad752f04c7 | |
Maxime Beauchemin | c990baf96a | |
dependabot[bot] | 59f0057017 | |
Maxime Beauchemin | 8e3cecda9f | |
Beto Dealmeida | 9022f5c519 | |
dependabot[bot] | fdc2dbe7db | |
dependabot[bot] | fa74d32a6a | |
John Bodley | 5ab95aaf7d | |
Maxime Beauchemin | 1c742f5866 | |
Evan Rusackas | 5603453c18 | |
dependabot[bot] | 29a5b72d5f | |
dependabot[bot] | e2b708e8f7 | |
dependabot[bot] | 48bff6b352 | |
Maxime Beauchemin | 12fe2929a4 | |
Maxime Beauchemin | 9fea3154fa | |
EugeneTorap | 3a34c7ff7c | |
JUST.in DO IT | f25795c4e4 | |
Craig Rueda | 8bdf457dfa | |
dependabot[bot] | 9fece4f811 | |
Maxime Beauchemin | 3e147f8693 | |
github-actions[bot] | 0d0e47acf7 | |
GerbenvdHuizen | 40a5454597 | |
GerbenvdHuizen | 2f8f77c813 | |
GerbenvdHuizen | 6fc619d643 |
19
.asf.yaml
```
@@ -67,15 +67,16 @@ github:
      - cypress-matrix (2, chrome)
      - cypress-matrix (3, chrome)
      - frontend-build
      - pre-commit (3.9)
      - python-lint (3.9)
      - test-mysql (3.9)
      - test-postgres (3.9)
      - test-postgres (3.10)
      - test-postgres-hive (3.9)
      - test-postgres-presto (3.9)
      - test-sqlite (3.9)
      - unit-tests (3.9)
      - pre-commit
      - python-lint
      - test-mysql
      - test-postgres (current)
      - test-postgres (next)
      - test-postgres-hive
      - test-postgres-presto
      - test-sqlite
      - unit-tests (current)
      - unit-tests (next)

    required_pull_request_reviews:
      dismiss_stale_reviews: false
```
```
@@ -7,6 +7,7 @@ body:
      value: |
        Hello Superset Community member! Please keep things tidy by putting your post in the proper place:

        🚨 Reporting a security issue: send an email to security@superset.apache.org. DO NOT USE GITHUB ISSUES TO REPORT SECURITY PROBLEMS.
        🐛 Reporting a bug: use this form.
        🙏 Asking a question or getting help: post in the [Superset Slack chat](http://bit.ly/join-superset-slack) or [GitHub Discussions](https://github.com/apache/superset/discussions) under "Q&A / Help".
        💡 Requesting a new feature: Search [GitHub Discussions](https://github.com/apache/superset/discussions) to see if it exists already. If not, add a new post there under "Ideas".
@@ -45,8 +46,8 @@ body:
      label: Superset version
      options:
        - master / latest-dev
        - "3.1.1"
        - "3.0.4"
        - "4.0.0"
        - "3.1.2"
      validations:
        required: true
    - type: dropdown
```
```
@@ -0,0 +1,31 @@
name: 'Change Detector'
description: 'Detects file changes for pull request and push events'
inputs:
  token:
    description: 'GitHub token for authentication'
    required: true
outputs:
  python:
    description: 'Whether Python-related files were changed'
    value: ${{ steps.change-detector.outputs.python }}
  frontend:
    description: 'Whether frontend-related files were changed'
    value: ${{ steps.change-detector.outputs.frontend }}
  docker:
    description: 'Whether docker-related files were changed'
    value: ${{ steps.change-detector.outputs.docker }}
  docs:
    description: 'Whether docs-related files were changed'
    value: ${{ steps.change-detector.outputs.docs }}
runs:
  using: 'composite'
  steps:
    - name: Detect file changes
      id: change-detector
      run: |
        python --version
        python scripts/change_detector.py
      shell: bash
      env:
        GITHUB_TOKEN: ${{ inputs.token }}
        GITHUB_OUTPUT: ${{ github.output }}
```
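The composite action above delegates all detection logic to `scripts/change_detector.py`, which is expected to write `python`/`frontend`/`docker`/`docs` flags as step outputs. A minimal sketch of how such a script could work is below; the path-classification rules here are illustrative placeholders, not Superset's actual rules, and the real script fetches the changed-file list from the GitHub API using `GITHUB_TOKEN`.

```python
import os

# Illustrative category rules; the real script's globs differ.
CATEGORIES = {
    "python": (".py", "requirements/"),
    "frontend": ("superset-frontend/",),
    "docker": ("Dockerfile", "docker/"),
    "docs": ("docs/",),
}

def detect(changed_files):
    """Return {category: bool} for a list of changed file paths."""
    return {
        cat: any(f.endswith(needle) or f.startswith(needle)
                 for f in changed_files for needle in needles)
        for cat, needles in CATEGORIES.items()
    }

def write_outputs(flags, output_path):
    # GitHub Actions step outputs are key=value lines appended
    # to the file named by the GITHUB_OUTPUT environment variable.
    with open(output_path, "a") as fh:
        for key, value in flags.items():
            fh.write(f"{key}={str(value).lower()}\n")

if __name__ == "__main__":
    # Normally the changed-file list comes from the GitHub API.
    flags = detect(["superset/models/core.py", "docs/intro.md"])
    write_outputs(flags, os.environ.get("GITHUB_OUTPUT", "github_output.txt"))
```

Downstream jobs can then gate on `needs.<job>.outputs.python == 'true'` and friends to skip unaffected CI suites.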
```
@@ -2,9 +2,9 @@ name: 'Setup Python Environment'
description: 'Set up Python and install dependencies with optional configurations.'
inputs:
  python-version:
    description: 'Python version to set up.'
    description: 'Python version to set up. Accepts a version number, "current", or "next".'
    required: true
    default: '3.9'
    default: 'current'
  cache:
    description: 'Cache dependencies. Options: pip'
    required: false
@@ -13,22 +13,39 @@ inputs:
    description: 'Type of requirements to install. Options: base, development, default'
    required: false
    default: 'dev'
  install-superset:
    description: 'Whether to install Superset itself. If false, only python is installed'
    required: false
    default: 'true'

runs:
  using: 'composite'
  steps:
    - name: Set up Python ${{ inputs.python-version }}
    - name: Interpret Python Version
      id: set-python-version
      shell: bash
      run: |
        if [ "${{ inputs.python-version }}" = "current" ]; then
          echo "PYTHON_VERSION=3.10" >> $GITHUB_ENV
        elif [ "${{ inputs.python-version }}" = "next" ]; then
          echo "PYTHON_VERSION=3.11" >> $GITHUB_ENV
        else
          echo "PYTHON_VERSION=${{ inputs.python-version }}" >> $GITHUB_ENV
        fi
    - name: Set up Python ${{ env.PYTHON_VERSION }}
      uses: actions/setup-python@v5
      with:
        python-version: ${{ inputs.python-version }}
        python-version: ${{ env.PYTHON_VERSION }}
        cache: ${{ inputs.cache }}
    - name: Install dependencies
      run: |
        sudo apt-get update && sudo apt-get -y install libldap2-dev libsasl2-dev
        pip install --upgrade pip setuptools wheel
        if [ "${{ inputs.requirements-type }}" = "dev" ]; then
          pip install -r requirements/development.txt
        elif [ "${{ inputs.requirements-type }}" = "base" ]; then
          pip install -r requirements/base.txt
        if [ "${{ inputs.install-superset }}" = "true" ]; then
          sudo apt-get update && sudo apt-get -y install libldap2-dev libsasl2-dev
          pip install --upgrade pip setuptools wheel
          if [ "${{ inputs.requirements-type }}" = "dev" ]; then
            pip install -r requirements/development.txt
          elif [ "${{ inputs.requirements-type }}" = "base" ]; then
            pip install -r requirements/base.txt
          fi
        fi
      shell: bash
```
```
@@ -1,11 +1,40 @@
name: 'Setup supersetbot'
description: 'Sets up supersetbot npm lib from the repo'
description: 'Sets up supersetbot npm lib from the repo or npm'
inputs:
  from-npm:
    description: 'Install from npm instead of local setup'
    required: false
    default: 'true' # Defaults to using the local setup
runs:
  using: 'composite'
  steps:
    - name: Install dependencies

    - name: Setup Node Env
      uses: actions/setup-node@v4
      with:
        node-version: '20'

    - name: Install supersetbot from npm
      if: ${{ inputs.from-npm == 'true' }}
      shell: bash
      run: npm install -g supersetbot

    - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
      if: ${{ inputs.from-npm == 'false' }}
      uses: actions/checkout@v4
      with:
        repository: apache-superset/supersetbot
        path: supersetbot

    - name: Setup supersetbot from repo
      if: ${{ inputs.from-npm == 'false' }}
      shell: bash
      working-directory: supersetbot
      run: |
        cd .github/supersetbot
        npm install
        npm link
        # simple trick to install globally with dependencies
        npm pack
        npm install -g ./supersetbot*.tgz

    - name: echo supersetbot version
      shell: bash
      run: supersetbot version
```
```
@@ -19,14 +19,11 @@ updates:
    open-pull-requests-limit: 30
    versioning-strategy: increase

  - package-ecosystem: "pip"
    directory: "/requirements/"
    schedule:
      interval: "monthly"
    labels:
      - pip
      - dependabot
    open-pull-requests-limit: 30

  # - package-ecosystem: "pip"
  # NOTE: as dependabot isn't compatible with our python
  # dependency setup (pip-compile-multi), we'll be using
  # `supersetbot` instead

  - package-ecosystem: "npm"
    directory: ".github/actions"
```
```
@@ -72,6 +72,11 @@
    - any-glob-to-any-file:
        - 'superset/translations/zh/**'

"i18n:traditional-chinese":
  - changed-files:
    - any-glob-to-any-file:
        - 'superset/translations/zh_TW/**'

"i18n:dutch":
  - changed-files:
    - any-glob-to-any-file:
```
```
@@ -1,22 +0,0 @@
{
  "extends": "airbnb-base",
  "rules": {
    "import/extensions": 0,
    "import/prefer-default-export": 0,
    "func-names": 0,
    "no-console": 0,
    "class-methods-use-this": 0
  },
  "parserOptions": {
    "ecmaVersion": 2020,
    "sourceType": "module"
  },
  "parserOptions": {
    "ecmaVersion": "latest",
    "sourceType": "module",
    "requireConfigFile": false
  },
  "env": {
    "jest": true
  }
}
```
````
@@ -1,37 +0,0 @@
# supersetbot

supersetbot is a utility bot that can be used to help around GitHub, CI and beyond.

The bot can be used as a local CLI OR, for a subset of fitted use cases, can be invoked directly
from GitHub comments.

Because it's its own npm app, it can be tested/deployed/used in isolation from the rest of
Superset, and take on some of the complexity from GitHub actions and onto a nifty
utility that can be used in different contexts.

## Features

```bash
$ use nvm 20
$ npm i -g supersetbot
$ supersetbot
Usage: supersetbot [options] [command]

Options:
  -v, --verbose        Output extra debugging information
  -r, --repo <repo>    The GitHub repo to use (ie: "apache/superset")
  -d, --dry-run        Run the command in dry-run mode
  -a, --actor <actor>  The actor
  -h, --help           display help for command

Commands:
  label [options] <label>            Add a label to an issue or PR
  unlabel [options] <label>          Remove a label from an issue or PR
  release-label-pr [options] <prId>  Figure out first release for PR and label it
  version                            Prints supersetbot's version number
  release-label-prs [options]        Given a set of PRs, auto-release label them
  release-label [options] <release>  Figure out first release for PR and label it
  orglabel [options]                 Add an org label based on the author
  docker [options]                   Generates/run docker build commands use in CI
  help [command]                     display help for command
```
````
```
@@ -1,8 +0,0 @@
export default {
  transform: {
  },
  testEnvironment: 'node',
  moduleNameMapper: {
    '^(\\.{1,2}/.*)\\.js$': '$1',
  },
};
```
(File diff suppressed because it is too large.)
```
@@ -1,36 +0,0 @@
{
  "name": "supersetbot",
  "version": "0.4.2",
  "description": "A bot for the Superset GitHub repo",
  "type": "module",
  "main": "src/index.js",
  "scripts": {
    "test": "node --experimental-vm-modules node_modules/jest/bin/jest.js",
    "eslint": "eslint",
    "supersetbot": "supersetbot"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "dependencies": {
    "@octokit/plugin-throttling": "^8.1.3",
    "@octokit/rest": "^20.0.2",
    "commander": "^11.0.0",
    "semver": "^7.6.0",
    "simple-git": "^3.22.0",
    "string-argv": "^0.3.2"
  },
  "devDependencies": {
    "@jest/globals": "^29.7.0",
    "eslint": "^8.56.0",
    "eslint-config-airbnb": "^19.0.4",
    "eslint-plugin-import": "^2.29.1",
    "eslint-plugin-jsx-a11y": "^6.8.0",
    "eslint-plugin-react": "^7.33.2",
    "eslint-plugin-react-hooks": "^4.6.0",
    "jest": "^29.7.0"
  },
  "bin": {
    "supersetbot": "./src/supersetbot"
  }
}
```
```
@@ -1,175 +0,0 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */
import { Command, Option } from 'commander';

import * as docker from './docker.js';
import * as utils from './utils.js';
import Github from './github.js';
import Git from './git.js';

export default function getCLI(context) {
  const program = new Command();

  // Some reusable options
  const issueOption = new Option('-i, --issue <issue>', 'The issue number', process.env.GITHUB_ISSUE_NUMBER);
  const excludeCherriesOption = new Option('-c, --exclude-cherries', 'Generate cherry labels point to each release where the PR has been cherried');

  // Setting up top-level CLI options
  program
    .option('-v, --verbose', 'Output extra debugging information')
    .option('-r, --repo <repo>', 'The GitHub repo to use (ie: "apache/superset")', process.env.GITHUB_REPOSITORY)
    .option('-d, --dry-run', 'Run the command in dry-run mode')
    .option('-a, --actor <actor>', 'The actor', process.env.GITHUB_ACTOR);

  program.command('label <label>')
    .description('Add a label to an issue or PR')
    .addOption(issueOption)
    .action(async function (label) {
      const opts = context.processOptions(this, ['issue', 'repo']);
      const github = new Github({ context, issue: opts.issue });
      await github.label(opts.issue, label, context, opts.actor, opts.verbose, opts.dryRun);
    });

  program.command('unlabel <label>')
    .description('Remove a label from an issue or PR')
    .addOption(issueOption)
    .action(async function (label) {
      const opts = context.processOptions(this, ['issue', 'repo']);
      const github = new Github({ context, issueNumber: opts.issue });
      await github.unlabel(opts.issue, label, context, opts.actor, opts.verbose, opts.dryRun);
    });

  program.command('release-label-pr <prId>')
    .description('Figure out first release for PR and label it')
    .addOption(excludeCherriesOption)
    .action(async function (prId) {
      const opts = context.processOptions(this, ['repo']);
      const git = new Git(context);
      await git.loadReleases();

      let wrapped = context.commandWrapper({
        func: git.getReleaseLabels,
        verbose: opts.verbose,
      });
      const labels = await wrapped(parseInt(prId, 10), opts.verbose, opts.excludeCherries);
      const github = new Github({ context, issueNumber: opts.issue });
      wrapped = context.commandWrapper({
        func: github.syncLabels,
        verbose: opts.verbose,
      });
      await wrapped({ labels, prId, actor: opts.actor, verbose: opts.verbose, dryRun: opts.dryRun });
    });

  program.command('version')
    .description("Prints supersetbot's version number")
    .action(async () => {
      const version = await utils.currentPackageVersion();
      context.log(version);
    });

  if (context.source === 'CLI') {
    program.command('release-label-prs')
      .description('Given a set of PRs, auto-release label them')
      .option('-s, --search <search>', 'extra search string to append using the GitHub mini-language')
      .option('-p, --pages <pages>', 'the number of pages (100 per page) to fetch and process', 10)
      .action(async function () {
        const opts = context.processOptions(this, ['repo']);

        const github = new Github({ context, issueNumber: opts.issue });
        const prs = await github.searchMergedPRs({
          query: opts.search,
          onlyUnlabeled: true,
          verbose: opts.verbose,
          pages: opts.pages,
        });
        const prIdLabelMap = new Map(prs.map((pr) => [pr.number, pr.labels]));
        const git = new Git(context);
        await git.loadReleases();

        const prsPromises = prs.map(async (pr) => {
          const labels = await git.getReleaseLabels(pr.number, opts.verbose);
          return { prId: pr.number, labels };
        });
        const prsTargetLabel = await Promise.all(prsPromises);
        // eslint-disable-next-line no-restricted-syntax
        for (const { prId, labels } of prsTargetLabel) {
          // Running sequentially to avoid rate limiting
          // eslint-disable-next-line no-await-in-loop
          await github.syncLabels({
            labels,
            existingLabels: prIdLabelMap.get(prId).map(l => l.name),
            prId,
            ...opts,
          });
        }
      });

    program.command('release-label <release>')
      .description('Figure out first release for PR and label it')
      .addOption(excludeCherriesOption)
      .action(async function (release) {
        const opts = context.processOptions(this, ['repo']);
        const git = new Git(context);
        await git.loadReleases();
        const prs = await git.getPRsToSync(release, opts.verbose, opts.excludeCherries);

        const github = new Github({ context });
        // eslint-disable-next-line no-restricted-syntax
        for (const { prId, labels } of prs) {
          // Running sequentially to avoid rate limiting
          // eslint-disable-next-line no-await-in-loop
          await github.syncLabels({
            prId,
            labels,
            ...opts,
          });
        }
      });

    program.command('orglabel')
      .description('Add an org label based on the author')
      .addOption(issueOption)
      .action(async function () {
        const opts = context.processOptions(this, ['issue', 'repo']);
        const github = new Github({ context, issueNumber: opts.issue });
        await github.assignOrgLabel(opts.issue, opts.verbose, opts.dryRun);
      });

    program.command('docker')
      .description('Generates/run docker build commands use in CI')
      .option('-t, --preset <preset>', 'Build preset', /^(lean|dev|dockerize|websocket|py310|ci)$/i, 'lean')
      .option('-c, --context <context>', 'Build context', /^(push|pull_request|release)$/i, 'local')
      .option('-r, --context-ref <ref>', 'Reference to the PR, release, or branch')
      .option('-p, --platform <platform...>', 'Platforms (multiple values allowed)')
      .option('-f, --force-latest', 'Force the "latest" tag on the release')
      .option('-v, --verbose', 'Print more info')
      .action(function () {
        const opts = context.processOptions(this, ['preset']);
        opts.platform = opts.platform || ['linux/arm64'];
        const cmd = docker.getDockerCommand({ ...opts });
        context.log(cmd);
        if (!opts.dryRun) {
          utils.runShellCommand(cmd, false);
        }
      });
  }

  return program;
}
```
```
@@ -1,12 +0,0 @@
import { spawnSync } from 'child_process';

describe('CLI Test', () => {
  test.each([
    ['./src/supersetbot', ['docker', '--preset', 'dev', '--dry-run'], '--target dev'],
    ['./src/supersetbot', ['docker', '--dry-run'], '--target lean'],
  ])('returns %s for release %s', (command, arg, contains) => {
    const result = spawnSync(command, arg);
    const output = result.stdout.toString();
    expect(result.stdout.toString()).toContain(contains);
  });
});
```
````
@@ -1,152 +0,0 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

import { parseArgsStringToArgv } from 'string-argv';

class Context {
  constructor(source) {
    this.hasErrors = false;
    this.source = source;
    this.options = {};
    this.errorLogs = [];
    this.logs = [];
    this.repo = null;
    this.optToEnvMap = {
      issue: 'GITHUB_ISSUE_NUMBER',
      repo: 'GITHUB_REPOSITORY',
    };
  }

  requireOption(optionName, options) {
    const optionValue = options[optionName];
    if (optionValue === undefined || optionValue === null) {
      this.logError(`option [${optionName}] is required`);
      this.exit(1);
    }
  }

  parseArgs(s) {
    return parseArgsStringToArgv(s);
  }

  requireOptions(optionNames, options) {
    optionNames.forEach((optionName) => {
      this.requireOption(optionName, options);
    });
  }

  processOptions(command, requiredOptions = []) {
    const raw = command.parent?.rawArgs;
    this.command = '???';
    if (raw) {
      this.command = raw.map((s) => (s.includes(' ') ? `"${s}"` : s)).join(' ').replace('node ', '');
    }
    this.options = { ...command.opts(), ...command.parent.opts() };

    // Runtime defaults for unit tests since commanders can't receive callables as default
    Object.entries(this.optToEnvMap).forEach(([k, v]) => {
      if (!this.options[k]) {
        this.options[k] = process.env[v];
      }
    });
    this.requireOptions(requiredOptions, this.options);
    this.issueNumber = this.options.issue;

    if (this.source === 'GHA') {
      this.options.actor = process.env.GITHUB_ACTOR || 'UNKNOWN';
      this.options.repo = process.env.GITHUB_REPOSITORY;
    }
    this.repo = this.options.repo;

    return this.options;
  }

  log(msg) {
    console.log(msg);
    this.logs = [...this.logs, msg];
  }

  logSuccess(msg) {
    const augMsg = `🟢 SUCCESS: ${msg}`;
    console.log(augMsg);
    this.logs.push(augMsg);
  }

  logError(msg) {
    this.hasErrors = true;
    const augMsg = `🔴 ERROR: ${msg}`;
    console.error(augMsg);
    this.errorLogs.push(augMsg);
  }

  exit(code = 0) {
    this.onDone();
    process.exit(code);
  }

  commandWrapper({
    func, successMsg, errorMsg = null, verbose = false, dryRun = false,
  }) {
    return async (...args) => {
      let resp;
      let hasError = false;

      try {
        if (!dryRun) {
          resp = await func(...args);
        }
        if (verbose && resp) {
          console.log(resp);
        }
      } catch (error) {
        hasError = true;
        if (errorMsg) {
          this.logError(errorMsg);
        } else {
          this.logError(error);
        }
        throw (error);
      }
      if (successMsg && !hasError) {
        this.logSuccess(successMsg);
      }
      return resp;
    };
  }

  doneComment() {
    const msgs = [...this.logs, ...this.errorLogs];
    let comment = '';
    comment += `> \`${this.command}\`\n`;
    comment += '```\n';
    comment += msgs.join('\n');
    comment += '\n```';
    return comment;
  }

  async onDone() {
    let msg;
    if (this.source === 'GHA') {
      msg = this.doneComment();
    }
    return msg;
  }
}

export default Context;
````
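The `commandWrapper` method in the deleted `Context` class above implements a reusable pattern: gate execution behind a dry-run flag, echo the result when verbose, and log success or error uniformly. As a rough sketch of the same idea outside JavaScript (names and the synchronous form are illustrative, not part of supersetbot):

```python
def command_wrapper(func, *, success_msg=None, verbose=False, dry_run=False, log=print):
    """Wrap a command so dry runs skip execution and results/outcomes
    are reported uniformly, mirroring Context.commandWrapper's shape."""
    def wrapped(*args, **kwargs):
        result = None
        if not dry_run:
            result = func(*args, **kwargs)  # skipped entirely on --dry-run
        if verbose and result is not None:
            log(result)
        if success_msg:
            log(f"SUCCESS: {success_msg}")
        return result
    return wrapped
```

The payoff is that every CLI subcommand gets consistent dry-run and logging behavior without repeating the boilerplate in each action handler.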
@ -1,142 +0,0 @@
import { spawnSync } from 'child_process';

const REPO = 'apache/superset';
const CACHE_REPO = `${REPO}-cache`;
const BASE_PY_IMAGE = '3.9-slim-bookworm';

export function runCmd(command, raiseOnFailure = true) {
  const { stdout, stderr } = spawnSync(command, { shell: true, encoding: 'utf-8', env: process.env });

  if (stderr && raiseOnFailure) {
    throw new Error(stderr);
  }
  return stdout;
}

function getGitSha() {
  return runCmd('git rev-parse HEAD').trim();
}

function getBuildContextRef(buildContext) {
  const event = buildContext || process.env.GITHUB_EVENT_NAME;
  const githubRef = process.env.GITHUB_REF || '';

  if (event === 'pull_request') {
    const githubHeadRef = process.env.GITHUB_HEAD_REF || '';
    return githubHeadRef.replace(/[^a-zA-Z0-9]/g, '-').slice(0, 40);
  }
  if (event === 'release') {
    return githubRef.replace('refs/tags/', '').slice(0, 40);
  }
  if (event === 'push') {
    return githubRef.replace('refs/heads/', '').replace(/[^a-zA-Z0-9]/g, '-').slice(0, 40);
  }
  return '';
}

export function isLatestRelease(release) {
  const output = runCmd(`../../scripts/tag_latest_release.sh ${release} --dry-run`, false) || '';
  return output.includes('SKIP_TAG::false');
}

function makeDockerTag(parts) {
  return `${REPO}:${parts.filter((part) => part).join('-')}`;
}

export function getDockerTags({
  preset, platforms, sha, buildContext, buildContextRef, forceLatest = false,
}) {
  const tags = new Set();
  const tagChunks = [];

  const isLatest = isLatestRelease(buildContextRef);

  if (preset !== 'lean') {
    tagChunks.push(preset);
  }

  if (platforms.length === 1) {
    const platform = platforms[0];
    const shortBuildPlatform = platform.replace('linux/', '').replace('64', '');
    if (shortBuildPlatform !== 'amd') {
      tagChunks.push(shortBuildPlatform);
    }
  }

  tags.add(makeDockerTag([sha, ...tagChunks]));
  tags.add(makeDockerTag([sha.slice(0, 7), ...tagChunks]));

  if (buildContext === 'release') {
    tags.add(makeDockerTag([buildContextRef, ...tagChunks]));
    if (isLatest || forceLatest) {
      tags.add(makeDockerTag(['latest', ...tagChunks]));
    }
  } else if (buildContext === 'push' && buildContextRef === 'master') {
    tags.add(makeDockerTag(['master', ...tagChunks]));
  } else if (buildContext === 'pull_request') {
    tags.add(makeDockerTag([`pr-${buildContextRef}`, ...tagChunks]));
  }

  return [...tags];
}

export function getDockerCommand({
  preset, platform, buildContext, buildContextRef, forceLatest = false,
}) {
  const platforms = platform;

  let buildTarget = '';
  let pyVer = BASE_PY_IMAGE;
  let dockerContext = '.';

  if (preset === 'dev') {
    buildTarget = 'dev';
  } else if (preset === 'lean') {
    buildTarget = 'lean';
  } else if (preset === 'py310') {
    buildTarget = 'lean';
    pyVer = '3.10-slim-bookworm';
  } else if (preset === 'websocket') {
    dockerContext = 'superset-websocket';
  } else if (preset === 'ci') {
    buildTarget = 'ci';
  } else if (preset === 'dockerize') {
    dockerContext = '-f dockerize.Dockerfile .';
  } else {
    console.error(`Invalid build preset: ${preset}`);
    process.exit(1);
  }

  let ref = buildContextRef;
  if (!ref) {
    ref = getBuildContextRef(buildContext);
  }
  const sha = getGitSha();
  const tags = getDockerTags({
    preset, platforms, sha, buildContext, buildContextRef: ref, forceLatest,
  }).map((tag) => `-t ${tag}`).join(' \\\n ');
  const isAuthenticated = !!(process.env.DOCKERHUB_TOKEN);

  const dockerArgs = isAuthenticated ? '--push' : '--load';
  const targetArgument = buildTarget ? `--target ${buildTarget}` : '';
  const cacheRef = `${CACHE_REPO}:${pyVer}`;
  const platformArg = `--platform ${platforms.join(',')}`;
  const cacheFromArg = `--cache-from=type=registry,ref=${cacheRef}`;
  const cacheToArg = isAuthenticated ? `--cache-to=type=registry,mode=max,ref=${cacheRef}` : '';
  const buildArg = pyVer ? `--build-arg PY_VER=${pyVer}` : '';
  const actor = process.env.GITHUB_ACTOR;

  return `docker buildx build \\
    ${dockerArgs} \\
    ${tags} \\
    ${cacheFromArg} \\
    ${cacheToArg} \\
    ${targetArgument} \\
    ${buildArg} \\
    ${platformArg} \\
    --label sha=${sha} \\
    --label target=${buildTarget} \\
    --label build_trigger=${ref} \\
    --label base=${pyVer} \\
    --label build_actor=${actor} \\
    ${dockerContext}
  `;
}
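The tag-composition rule in `getDockerTags` is compact; a minimal, self-contained sketch (reusing the `makeDockerTag` logic verbatim, with an example SHA) shows how the chunks collapse into a tag:

```javascript
// Sketch of supersetbot's Docker tag composition, not the module itself.
// REPO and the chunk-joining rule mirror makeDockerTag() above.
const REPO = 'apache/superset';

function makeDockerTag(parts) {
  // Falsy chunks are dropped; the remaining ones are joined with "-".
  return `${REPO}:${parts.filter((part) => part).join('-')}`;
}

const sha = '22e7c602b9aa321ec7e0df4bb0033048664dcdf0';
console.log(makeDockerTag([sha.slice(0, 7), 'dev', 'arm']));
// → apache/superset:22e7c60-dev-arm
```

Because falsy chunks drop out, the `lean` preset (no preset chunk) on `linux/amd64` (no platform chunk) yields bare `apache/superset:<sha>` tags, as the test cases below expect.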
@ -1,244 +0,0 @@
import * as dockerUtils from './docker.js';

const SHA = '22e7c602b9aa321ec7e0df4bb0033048664dcdf0';
const PR_ID = '666';
const OLD_REL = '2.1.0';
const NEW_REL = '2.1.1';
const REPO = 'apache/superset';

beforeEach(() => {
  process.env.TEST_ENV = 'true';
});

afterEach(() => {
  delete process.env.TEST_ENV;
});

describe('isLatestRelease', () => {
  test.each([
    ['2.1.0', false],
    ['2.1.1', true],
    ['1.0.0', false],
    ['3.0.0', true],
  ])('returns %s for release %s', (release, expectedBool) => {
    expect(dockerUtils.isLatestRelease(release)).toBe(expectedBool);
  });
});

describe('getDockerTags', () => {
  test.each([
    // PRs
    [
      'lean',
      ['linux/arm64'],
      SHA,
      'pull_request',
      PR_ID,
      [`${REPO}:22e7c60-arm`, `${REPO}:${SHA}-arm`, `${REPO}:pr-${PR_ID}-arm`],
    ],
    [
      'ci',
      ['linux/amd64'],
      SHA,
      'pull_request',
      PR_ID,
      [`${REPO}:22e7c60-ci`, `${REPO}:${SHA}-ci`, `${REPO}:pr-${PR_ID}-ci`],
    ],
    [
      'lean',
      ['linux/amd64'],
      SHA,
      'pull_request',
      PR_ID,
      [`${REPO}:22e7c60`, `${REPO}:${SHA}`, `${REPO}:pr-${PR_ID}`],
    ],
    [
      'dev',
      ['linux/arm64'],
      SHA,
      'pull_request',
      PR_ID,
      [
        `${REPO}:22e7c60-dev-arm`,
        `${REPO}:${SHA}-dev-arm`,
        `${REPO}:pr-${PR_ID}-dev-arm`,
      ],
    ],
    [
      'dev',
      ['linux/amd64'],
      SHA,
      'pull_request',
      PR_ID,
      [`${REPO}:22e7c60-dev`, `${REPO}:${SHA}-dev`, `${REPO}:pr-${PR_ID}-dev`],
    ],
    // old releases
    [
      'lean',
      ['linux/arm64'],
      SHA,
      'release',
      OLD_REL,
      [`${REPO}:22e7c60-arm`, `${REPO}:${SHA}-arm`, `${REPO}:${OLD_REL}-arm`],
    ],
    [
      'lean',
      ['linux/amd64'],
      SHA,
      'release',
      OLD_REL,
      [`${REPO}:22e7c60`, `${REPO}:${SHA}`, `${REPO}:${OLD_REL}`],
    ],
    [
      'dev',
      ['linux/arm64'],
      SHA,
      'release',
      OLD_REL,
      [
        `${REPO}:22e7c60-dev-arm`,
        `${REPO}:${SHA}-dev-arm`,
        `${REPO}:${OLD_REL}-dev-arm`,
      ],
    ],
    [
      'dev',
      ['linux/amd64'],
      SHA,
      'release',
      OLD_REL,
      [`${REPO}:22e7c60-dev`, `${REPO}:${SHA}-dev`, `${REPO}:${OLD_REL}-dev`],
    ],
    // new releases
    [
      'lean',
      ['linux/arm64'],
      SHA,
      'release',
      NEW_REL,
      [
        `${REPO}:22e7c60-arm`,
        `${REPO}:${SHA}-arm`,
        `${REPO}:${NEW_REL}-arm`,
        `${REPO}:latest-arm`,
      ],
    ],
    [
      'lean',
      ['linux/amd64'],
      SHA,
      'release',
      NEW_REL,
      [`${REPO}:22e7c60`, `${REPO}:${SHA}`, `${REPO}:${NEW_REL}`, `${REPO}:latest`],
    ],
    [
      'dev',
      ['linux/arm64'],
      SHA,
      'release',
      NEW_REL,
      [
        `${REPO}:22e7c60-dev-arm`,
        `${REPO}:${SHA}-dev-arm`,
        `${REPO}:${NEW_REL}-dev-arm`,
        `${REPO}:latest-dev-arm`,
      ],
    ],
    [
      'dev',
      ['linux/amd64'],
      SHA,
      'release',
      NEW_REL,
      [
        `${REPO}:22e7c60-dev`,
        `${REPO}:${SHA}-dev`,
        `${REPO}:${NEW_REL}-dev`,
        `${REPO}:latest-dev`,
      ],
    ],
    // merge on master
    [
      'lean',
      ['linux/arm64'],
      SHA,
      'push',
      'master',
      [`${REPO}:22e7c60-arm`, `${REPO}:${SHA}-arm`, `${REPO}:master-arm`],
    ],
    [
      'lean',
      ['linux/amd64'],
      SHA,
      'push',
      'master',
      [`${REPO}:22e7c60`, `${REPO}:${SHA}`, `${REPO}:master`],
    ],
    [
      'dev',
      ['linux/arm64'],
      SHA,
      'push',
      'master',
      [
        `${REPO}:22e7c60-dev-arm`,
        `${REPO}:${SHA}-dev-arm`,
        `${REPO}:master-dev-arm`,
      ],
    ],
    [
      'dev',
      ['linux/amd64'],
      SHA,
      'push',
      'master',
      [`${REPO}:22e7c60-dev`, `${REPO}:${SHA}-dev`, `${REPO}:master-dev`],
    ],
  ])('returns expected tags', (preset, platforms, sha, buildContext, buildContextRef, expectedTags) => {
    const tags = dockerUtils.getDockerTags({
      preset, platforms, sha, buildContext, buildContextRef,
    });
    expect(tags).toEqual(expect.arrayContaining(expectedTags));
  });
});

describe('getDockerCommand', () => {
  test.each([
    [
      'lean',
      ['linux/amd64'],
      true,
      SHA,
      'push',
      'master',
      ['--push', `-t ${REPO}:master `],
    ],
    [
      'dev',
      ['linux/amd64'],
      false,
      SHA,
      'push',
      'master',
      ['--load', `-t ${REPO}:master-dev `],
    ],
    // multi-platform
    [
      'lean',
      ['linux/arm64', 'linux/amd64'],
      true,
      SHA,
      'push',
      'master',
      ['--platform linux/arm64,linux/amd64'],
    ],
  ])('returns expected docker command', (preset, platform, isAuthenticated, sha, buildContext, buildContextRef, contains) => {
    const cmd = dockerUtils.getDockerCommand({
      preset, platform, isAuthenticated, sha, buildContext, buildContextRef,
    });
    contains.forEach((expectedSubstring) => {
      expect(cmd).toContain(expectedSubstring);
    });
  });
});
@ -1,120 +0,0 @@
import simpleGit from 'simple-git';
import semver from 'semver';

import GitRelease from './git_release.js';

export default class Git {
  #releaseTags;

  constructor(context, mainBranch = 'master') {
    this.context = context;
    this.mainBranch = mainBranch;
    this.releases = new Map();
    this.git = simpleGit();
    this.mainBranchGitRelease = this.mainBranchGitRelease.bind(this);
    this.getReleaseLabels = this.getReleaseLabels.bind(this);
  }

  async mainBranchGitRelease() {
    let rel = this.releases.get(this.mainBranch);
    if (!rel) {
      rel = await this.loadRelease(this.mainBranch);
    }
    return rel;
  }

  async releaseTags() {
    if (!this.#releaseTags) {
      const tags = await this.git.tags();
      // Filter tags to include only those that match semver and are official releases
      const semverTags = tags.all.filter((tag) => semver.valid(tag) && !tag.includes('-') && !tag.includes('v'));
      semverTags.sort((a, b) => semver.compare(a, b));
      this.#releaseTags = semverTags;
    }
    return this.#releaseTags;
  }

  async loadMainBranch() {
    await this.loadRelease(this.mainBranch);
  }

  async loadReleases(tags = null) {
    const tagsToFetch = tags || await this.releaseTags();
    if (!tags) {
      await this.loadMainBranch();
    }
    const promises = [];
    tagsToFetch.forEach((tag) => {
      promises.push(this.loadRelease(tag));
    });
    await Promise.all(promises);
  }

  async loadRelease(tag) {
    const release = new GitRelease(tag, this.context);
    await release.load();
    this.releases.set(tag, release);
    return release;
  }

  static shortenSHA(sha) {
    return sha.substring(0, 7);
  }

  async getReleaseLabels(prNumber, verbose, excludeCherries = false) {
    const labels = [];
    const main = await this.mainBranchGitRelease();
    const commit = main.prIdCommitMap.get(prNumber);
    if (commit) {
      const { sha } = commit;
      const shortSHA = Git.shortenSHA(sha);
      if (verbose) {
        console.log(`PR ${prNumber} is ${shortSHA} on branch ${this.mainBranch}`);
      }

      let firstGitReleased = null;
      const tags = await this.releaseTags();
      tags.forEach((tag) => {
        const release = this.releases.get(tag);
        if (release.shaCommitMap.get(sha) && !firstGitReleased && release.tag !== this.mainBranch) {
          firstGitReleased = release.tag;
          labels.push(`🚢 ${release.tag}`);
        }
        const commitInGitRelease = release.prIdCommitMap.get(prNumber);
        if (!excludeCherries && commitInGitRelease && commitInGitRelease.sha !== sha) {
          labels.push(`🍒 ${release.tag}`);
        }
      });
      if (labels.length >= 1) {
        // using this emoji to show it's been labeled by the bot
        labels.push('🏷️ bot');
      }
    }
    return labels;
  }

  async previousRelease(release) {
    const tags = await this.releaseTags();
    return tags[tags.indexOf(release) - 1];
  }

  async getPRsToSync(release, verbose = false, excludeCherries = false) {
    const prevRelease = await this.previousRelease(release);
    const releaseRange = new GitRelease(release, this.context, prevRelease);
    await releaseRange.load();
    const prIds = releaseRange.prIdCommitMap.keys();

    const prs = [];
    const promises = [];
    [...prIds].forEach((prId) => {
      promises.push(
        this.getReleaseLabels(prId, verbose, excludeCherries)
          .then((labels) => {
            prs.push({ prId, labels });
          }),
      );
    });
    await Promise.all(promises);
    return prs;
  }
}
@ -1,50 +0,0 @@
import simpleGit from 'simple-git';

export default class GitRelease {
  constructor(tag, context, from = null) {
    this.tag = tag;
    this.context = context;
    this.prNumberRegex = /\(#(\d+)\)/;
    this.shaCommitMap = null;
    this.prIdCommitMap = null;
    this.prCommitMap = null;
    this.git = simpleGit();
    this.from = from;
  }

  extractPRNumber(commitMessage) {
    const match = (commitMessage || '').match(this.prNumberRegex);
    return match ? parseInt(match[1], 10) : null;
  }

  async load() {
    let from = this.from || await this.git.firstCommit();
    if (from.includes('\n')) {
      [from] = from.split('\n');
    }
    const range = `${this.from || 'first'}..${this.tag}`;
    const commits = await this.git.log({ from, to: this.tag });
    this.context.log(`${range} - fetched ${commits.all.length} commits`);

    this.shaCommitMap = new Map();
    commits.all.forEach((commit) => {
      const sha = commit.hash.substring(0, 7);
      this.shaCommitMap.set(
        sha,
        {
          prId: this.extractPRNumber(commit.message),
          message: commit.message,
          sha,
        },
      );
    });

    this.prIdCommitMap = new Map();
    // eslint-disable-next-line no-restricted-syntax
    for (const commit of this.shaCommitMap.values()) {
      if (commit.prId) {
        this.prIdCommitMap.set(commit.prId, commit);
      }
    }
  }
}
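`GitRelease` maps commits to PR ids via the `(#123)`-style suffix that GitHub appends to squash-merged commit subjects. A self-contained sketch of that extraction, using the same regex as `extractPRNumber` above (the sample commit messages are made up):

```javascript
// Same regex as GitRelease.prNumberRegex: a "(#<digits>)" group in the subject.
const prNumberRegex = /\(#(\d+)\)/;

function extractPRNumber(commitMessage) {
  // Tolerate null/undefined messages, return the PR id as a number or null.
  const match = (commitMessage || '').match(prNumberRegex);
  return match ? parseInt(match[1], 10) : null;
}

console.log(extractPRNumber('fix: tweak docker tags (#26667)')); // → 26667
console.log(extractPRNumber('chore: no PR reference'));          // → null
```

Commits whose message carries no PR reference simply never enter `prIdCommitMap`, which is why the `if (commit.prId)` guard exists in `load()`.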
@ -1,252 +0,0 @@
import { Octokit } from '@octokit/rest';
import { throttling } from '@octokit/plugin-throttling';

import { ORG_LIST, PROTECTED_LABEL_PATTERNS, COMMITTER_TEAM } from './metadata.js';

class Github {
  #userInTeamCache;

  constructor({ context, issueNumber = null, token = null }) {
    this.context = context;
    this.issueNumber = issueNumber;
    const githubToken = token || process.env.GITHUB_TOKEN;
    if (!githubToken) {
      const msg = 'GITHUB_TOKEN is not set';
      this.context.logError(msg);
    }
    const throttledOctokit = Octokit.plugin(throttling);
    // eslint-disable-next-line new-cap
    this.octokit = new throttledOctokit({
      auth: githubToken,
      throttle: {
        id: 'supersetbot',
        onRateLimit: (retryAfter, options, octokit, retryCount) => {
          const howManyRetries = 10;
          octokit.log.warn(`Retry ${retryCount} out of ${howManyRetries} - retrying in ${retryAfter} seconds!`);
          if (retryCount < howManyRetries) {
            return true;
          }
          return false;
        },
        onSecondaryRateLimit: (retryAfter, options, octokit) => {
          octokit.log.warn(`SecondaryRateLimit detected for request ${options.method} ${options.url}`);
        },
      },
    });
    this.syncLabels = this.syncLabels.bind(this);
    this.#userInTeamCache = new Map();
  }

  unPackRepo() {
    const [owner, repo] = this.context.repo.split('/');
    return { repo, owner };
  }

  async label(issueNumber, label, actor = null, verbose = false, dryRun = false) {
    let hasPerm = true;
    if (actor && Github.isLabelProtected(label)) {
      hasPerm = await this.checkIfUserInTeam(actor, COMMITTER_TEAM, verbose);
    }
    if (hasPerm) {
      const addLabelWrapped = this.context.commandWrapper({
        func: this.octokit.rest.issues.addLabels,
        successMsg: `label "${label}" added to issue ${issueNumber}`,
        verbose,
        dryRun,
      });
      await addLabelWrapped({
        ...this.unPackRepo(),
        issue_number: issueNumber,
        labels: [label],
      });
    }
  }

  async createComment(body) {
    if (this.issueNumber) {
      await this.octokit.rest.issues.createComment({
        ...this.unPackRepo(),
        body,
        issue_number: this.issueNumber,
      });
    }
  }

  async unlabel(issueNumber, label, actor = null, verbose = false, dryRun = false) {
    let hasPerm = true;
    if (actor && Github.isLabelProtected(label)) {
      hasPerm = await this.checkIfUserInTeam(actor, COMMITTER_TEAM, verbose);
    }
    if (hasPerm) {
      const removeLabelWrapped = this.context.commandWrapper({
        func: this.octokit.rest.issues.removeLabel,
        successMsg: `label "${label}" removed from issue ${issueNumber}`,
        verbose,
        dryRun,
      });
      await removeLabelWrapped({
        ...this.unPackRepo(),
        issue_number: issueNumber,
        name: label,
      });
    }
  }

  async assignOrgLabel(issueNumber, verbose = false, dryRun = false) {
    const issue = await this.octokit.rest.issues.get({
      ...this.unPackRepo(),
      issue_number: issueNumber,
    });
    const username = issue.data.user.login;
    const orgs = await this.octokit.orgs.listForUser({ username });
    const orgNames = orgs.data.map((v) => v.login);

    // get the list of matching GitHub orgs
    const matchingOrgs = orgNames.filter((org) => ORG_LIST.includes(org));
    if (matchingOrgs.length) {
      const wrapped = this.context.commandWrapper({
        func: this.octokit.rest.issues.addLabels,
        successMsg: `added label(s) ${matchingOrgs} to issue ${issueNumber}`,
        errorMsg: "couldn't add labels to issue",
        verbose,
        dryRun,
      });
      await wrapped({
        ...this.unPackRepo(),
        issue_number: issueNumber,
        labels: matchingOrgs,
      });
    }
  }

  async searchMergedPRs({
    query = '',
    onlyUnlabeled = true,
    verbose = false,
    startPage = 0,
    pages = 5,
  }) {
    // look for PRs
    let q = `repo:${this.context.repo} is:merged ${query}`;
    if (onlyUnlabeled) {
      q = `${q} -label:"🏷️ bot"`;
    }
    if (verbose) {
      this.context.log(`Query: ${q}`);
    }
    let prs = [];
    for (let i = 0; i < pages; i += 1) {
      if (verbose) {
        this.context.log(`Fetching PRs to process page ${i + 1} out of ${pages}`);
      }
      // eslint-disable-next-line no-await-in-loop
      const data = await this.octokit.search.issuesAndPullRequests({
        q,
        per_page: 100,
        page: startPage + i,
      });
      prs = [...prs, ...data.data.items];
    }
    if (verbose) {
      this.context.log(`Fetched ${prs.length}`);
    }
    return prs;
  }

  async syncLabels({
    labels,
    prId,
    actor = null,
    verbose = false,
    dryRun = false,
    existingLabels = null,
  }) {
    if (verbose) {
      this.context.log(`[PR: ${prId}] - sync labels ${labels}`);
    }
    let hasPerm = true;
    if (actor) {
      hasPerm = await this.checkIfUserInTeam(actor, COMMITTER_TEAM, verbose);
    }
    if (!hasPerm) {
      return;
    }
    let targetLabels = existingLabels;
    if (targetLabels === null) {
      // No labels have been passed as an array, so checking against GitHub
      const resp = await this.octokit.issues.listLabelsOnIssue({
        ...this.unPackRepo(),
        issue_number: prId,
      });
      targetLabels = resp.data.map((l) => l.name);
    }

    if (verbose) {
      this.context.log(`[PR: ${prId}] - target release labels: ${labels}`);
      this.context.log(`[PR: ${prId}] - existing labels on issue: ${targetLabels}`);
    }

    // Extract existing labels with the given prefixes
    const prefixes = ['🚢', '🍒', '🎯', '🏷️'];
    const existingPrefixLabels = targetLabels
      .filter((label) => prefixes.some((s) => typeof label === 'string' && label.startsWith(s)));

    // Labels to add
    const labelsToAdd = labels.filter((label) => !existingPrefixLabels.includes(label));
    if (verbose) {
      this.context.log(`[PR: ${prId}] - labels to add: ${labelsToAdd}`);
    }
    // Labels to remove
    const labelsToRemove = existingPrefixLabels.filter((label) => !labels.includes(label));
    if (verbose) {
      this.context.log(`[PR: ${prId}] - labels to remove: ${labelsToRemove}`);
    }

    // Add labels
    if (labelsToAdd.length > 0 && !dryRun) {
      await this.octokit.issues.addLabels({
        ...this.unPackRepo(),
        issue_number: prId,
        labels: labelsToAdd,
      });
    }

    // Remove labels
    if (labelsToRemove.length > 0 && !dryRun) {
      await Promise.all(labelsToRemove.map((label) => this.octokit.issues.removeLabel({
        ...this.unPackRepo(),
        issue_number: prId,
        name: label,
      })));
    }
    this.context.logSuccess(`synced labels for PR ${prId} with labels ${labels}`);
  }

  async checkIfUserInTeam(username, team, verbose = false) {
    // Use a string key: Map compares array keys by identity, so an array key
    // would never hit the cache.
    const cacheKey = `${username}/${team}`;
    let isInTeam = this.#userInTeamCache.get(cacheKey);
    if (isInTeam !== undefined) {
      return isInTeam;
    }

    const [org, teamSlug] = team.split('/');
    const wrapped = this.context.commandWrapper({
      func: this.octokit.teams.getMembershipForUserInOrg,
      errorMsg: `User "${username}" is not authorized to alter protected labels.`,
      verbose,
    });
    const resp = await wrapped({
      org,
      team_slug: teamSlug,
      username,
    });
    isInTeam = resp?.data?.state === 'active';
    this.#userInTeamCache.set(cacheKey, isInTeam);
    return isInTeam;
  }

  static isLabelProtected(label) {
    return PROTECTED_LABEL_PATTERNS.some((pattern) => new RegExp(pattern).test(label));
  }
}

export default Github;
@ -1,39 +0,0 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

import getCLI from './cli.js';
import Context from './context.js';
import Github from './github.js';

async function runCommandFromGithubAction(rawCommand) {
  const context = new Context('GHA');
  const cli = getCLI(context);
  // Github's constructor destructures its options, so pass an object
  const github = new Github({ context });

  // Make rawCommand look like argv
  const cmd = rawCommand.trim().replace('@supersetbot', 'supersetbot');
  const args = context.parseArgs(cmd);

  await cli.parseAsync(['node', ...args]);
  const msg = await context.onDone();

  await github.createComment(msg);
}

export { runCommandFromGithubAction };
@ -1,51 +0,0 @@
// import * as stringArgv from 'string-argv';
import { jest } from '@jest/globals';

import Context from './context.js';
import Github from './github.js';
import * as index from './index.js';

describe('runCommandFromGithubAction', () => {
  const labelSpy = jest.spyOn(Github.prototype, 'label').mockImplementation(jest.fn());
  // mocking some of the Context object
  const onDoneSpy = jest.spyOn(Context.prototype, 'onDone');
  const doneCommentSpy = jest.spyOn(Context.prototype, 'doneComment');
  const parseArgsSpy = jest.spyOn(Context.prototype, 'parseArgs');
  jest.spyOn(Github.prototype, 'createComment').mockImplementation(jest.fn());

  let originalEnv;

  afterEach(() => {
    process.env = originalEnv;
  });
  beforeEach(() => {
    jest.clearAllMocks();
    originalEnv = process.env;
    process.env.GITHUB_ISSUE_NUMBER = '666';
    process.env.GITHUB_REPOSITORY = 'apache/superset';
  });

  it('should strip the command', async () => {
    await index.runCommandFromGithubAction(' @supersetbot label test-label ');
    expect(parseArgsSpy).toHaveBeenCalledWith('supersetbot label test-label');

    await index.runCommandFromGithubAction(' \n @supersetbot label test-label \n \n \n');
    expect(parseArgsSpy).toHaveBeenCalledWith('supersetbot label test-label');

    await index.runCommandFromGithubAction(' \n \t@supersetbot label test-label \t \n \n\t \n');
    expect(parseArgsSpy).toHaveBeenCalledWith('supersetbot label test-label');
  });

  it('should parse the raw command correctly and call commands.label and context.onDone', async () => {
    await index.runCommandFromGithubAction('@supersetbot label test-label');

    expect(labelSpy).toHaveBeenCalled();
    expect(onDoneSpy).toHaveBeenCalled();
  });

  it('should generate a good comment message', async () => {
    await index.runCommandFromGithubAction('@supersetbot label test-label');
    const comment = doneCommentSpy.mock.results[0].value;
    expect(comment).toContain('> `supersetbot label test-label`');
  });
});
@ -1,78 +0,0 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

import { spawn } from 'child_process';
import { readFile } from 'fs/promises';
import { fileURLToPath } from 'url';
import path from 'path';

const dirname = path.dirname(fileURLToPath(import.meta.url));

async function loadPackageJson() {
  try {
    const packageJsonPath = path.join(dirname, '../package.json');
    const data = await readFile(packageJsonPath, 'utf8');
    const packageJson = JSON.parse(data);
    return packageJson;
  } catch (error) {
    console.error('Error reading package.json:', error);
    return null;
  }
}

export async function currentPackageVersion() {
  const data = await loadPackageJson();
  return data.version;
}

export function runShellCommand(command, raiseOnError = true) {
  return new Promise((resolve, reject) => {
    // Split the command string into an array of arguments
    const args = command.split(/\s+/).filter((s) => !!s && s !== '\\');
    const childProcess = spawn(args.shift(), args);
    let stdoutData = '';
    let stderrData = '';

    // Capture stdout data
    childProcess.stdout.on('data', (data) => {
      stdoutData += data;
      console.log(`stdout: ${data}`);
    });

    // Capture stderr data
    childProcess.stderr.on('data', (data) => {
      stderrData += data;
      console.error(`stderr: ${data}`);
    });

    // Handle process exit
    childProcess.on('close', (code) => {
      if (code === 0) {
        resolve(stdoutData);
      } else {
        const msg = `Command failed with code ${code}: ${stderrData}`;
        if (raiseOnError) {
          reject(new Error(msg));
        } else {
          console.error(msg);
          process.exit(1);
        }
      }
    });
  });
}
@ -89,6 +89,8 @@ EOF
|
|||
setup-mysql() {
|
||||
say "::group::Initialize database"
|
||||
mysql -h 127.0.0.1 -P 13306 -u root --password=root <<-EOF
|
||||
SET GLOBAL transaction_isolation='READ-COMMITTED';
|
||||
SET GLOBAL TRANSACTION ISOLATION LEVEL READ COMMITTED;
|
||||
DROP DATABASE IF EXISTS superset;
|
||||
CREATE DATABASE superset DEFAULT CHARACTER SET utf8 COLLATE utf8_unicode_ci;
|
||||
DROP DATABASE IF EXISTS sqllab_test_db;
|
||||
|
@ -115,12 +117,6 @@ testdata() {
|
|||
say "::endgroup::"
|
||||
}
|
||||
|
||||
codecov() {
|
||||
say "::group::Upload code coverage"
|
||||
bash ".github/workflows/codecov.sh" "$@"
|
||||
say "::endgroup::"
|
||||
}
|
||||
|
||||
cypress-install() {
|
||||
cd "$GITHUB_WORKSPACE/superset-frontend/cypress-base"
|
||||
|
||||
|
@@ -189,11 +185,6 @@ cypress-run-all() {
  cypress-run "sqllab/*" "Backend persist"

  # Upload code coverage separately so each page can have separate flags
  # -c will clean existing coverage reports, -F means add flags
  # || true to prevent CI failure on codecov upload
  codecov -c -F "cypress" || true

  say "::group::Flask log for backend persist"
  cat "$flasklog"
  say "::endgroup::"
@@ -221,9 +212,7 @@ cypress-run-applitools() {
  nohup flask run --no-debugger -p $port >"$flasklog" 2>&1 </dev/null &
  local flaskProcessId=$!

  $cypress --spec "cypress/e2e/*/**/*.applitools.test.ts" --browser "$browser" --headless --config ignoreTestFiles="[]"

  codecov -c -F "cypress" || true
  $cypress --spec "cypress/e2e/*/**/*.applitools.test.ts" --browser "$browser" --headless

  say "::group::Flask log for default run"
  cat "$flasklog"
@@ -0,0 +1,68 @@
name: Bump Python Package

on:
  # Can be triggered manually
  workflow_dispatch:
    inputs:
      package:
        required: false
        description: The python package to bump (all if empty)
      group:
        required: false
        description: The optional dependency group to bump (as defined in pyproject.toml)
      limit:
        required: true
        description: Max number of PRs to open (0 for no limit)
        default: 5

jobs:
  bump-python-package:
    runs-on: ubuntu-latest
    permissions:
      actions: write
      contents: write
      pull-requests: write
      checks: write
    steps:

      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v4
        with:
          persist-credentials: true
          ref: master

      - name: Setup supersetbot
        uses: ./.github/actions/setup-supersetbot/

      - name: Set up Python ${{ inputs.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: "3.10"

      - name: Install pip-compile-multi
        run: pip install pip-compile-multi

      - name: supersetbot bump-python -p "${{ github.event.inputs.package }}"
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          git config --global user.email "action@github.com"
          git config --global user.name "GitHub Action"

          PACKAGE_OPT=""
          if [ -n "${{ github.event.inputs.package }}" ]; then
            PACKAGE_OPT="-p ${{ github.event.inputs.package }}"
          fi

          GROUP_OPT=""
          if [ -n "${{ github.event.inputs.group }}" ]; then
            GROUP_OPT="-g ${{ github.event.inputs.group }}"
          fi

          supersetbot bump-python \
            --verbose \
            --use-current-repo \
            --include-subpackages \
            --limit ${{ github.event.inputs.limit }} \
            $PACKAGE_OPT \
            $GROUP_OPT
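The `PACKAGE_OPT`/`GROUP_OPT` shell logic above appends a flag only when its input is non-empty. Since supersetbot is a Node CLI, the same argument-assembly pattern can be sketched in JavaScript; `buildBumpArgs` is a hypothetical helper name, not part of supersetbot's API.

```javascript
// Hypothetical sketch of the optional-flag pattern from the workflow's run
// script: -p / -g are only appended when their inputs are non-empty.
function buildBumpArgs({ package: pkg, group, limit }) {
  const args = [
    'bump-python',
    '--verbose',
    '--use-current-repo',
    '--include-subpackages',
    '--limit', String(limit),
  ];
  if (pkg) args.push('-p', pkg);
  if (group) args.push('-g', group);
  return args;
}
```

For example, an empty `group` input yields a command line with no `-g` flag at all, mirroring the shell's `[ -n "..." ]` guards.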
File diff suppressed because it is too large
@@ -3,13 +3,9 @@ name: "CodeQL"
on:
  push:
    branches: ["master", "[0-9].[0-9]"]
    paths:
      - "superset/**"
  pull_request:
    # The branches below must be a subset of the branches above
    branches: ["master"]
    paths:
      - "superset/**"
  schedule:
    - cron: "0 4 * * *"
@@ -37,6 +33,12 @@ jobs:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Check for file changes
        id: check
        uses: ./.github/actions/change-detector/
        with:
          token: ${{ secrets.GITHUB_TOKEN }}

      # Initializes the CodeQL tools for scanning.
      - name: Initialize CodeQL
        uses: github/codeql-action/init@v3
@@ -50,6 +52,7 @@ jobs:
        # queries: security-extended,security-and-quality

      - name: Perform CodeQL Analysis
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: github/codeql-action/analyze@v3
        with:
          category: "/language:${{matrix.language}}"
@@ -23,5 +23,22 @@ jobs:
      # compatible/incompatible licenses addressed here: https://www.apache.org/legal/resolved.html
      # find SPDX identifiers here: https://spdx.org/licenses/
      deny-licenses: MS-LPL, BUSL-1.1, QPL-1.0, Sleepycat, SSPL-1.0, CPOL-1.02, AGPL-3.0, GPL-1.0+, BSD-4-Clause-UC, NPL-1.0, NPL-1.1, JSON
      # adding an exception for an ambiguous license on store2, which has been resolved in the latest version. It's MIT: https://github.com/nbubna/store/blob/master/LICENSE-MIT
      allow-dependencies-licenses: 'pkg:npm/store2@2.14.2'
      allow-dependencies-licenses:
        # adding an exception for an ambiguous license on store2, which has been resolved in
        # the latest version. It's MIT: https://github.com/nbubna/store/blob/master/LICENSE-MIT
        - 'pkg:npm/store2@2.14.2'
        # adding an exception for all applitools modules (eyes-cypress and its dependencies),
        # which have an explicit OSS license approved by ASF
        # license: https://applitools.com/legal/open-source-terms-of-use/
        - 'pkg:npm/applitools/core'
        - 'pkg:npm/applitools/core-base'
        - 'pkg:npm/applitools/css-tree'
        - 'pkg:npm/applitools/ec-client'
        - 'pkg:npm/applitools/eg-socks5-proxy-server'
        - 'pkg:npm/applitools/eyes'
        - 'pkg:npm/applitools/eyes-cypress'
        - 'pkg:npm/applitools/nml-client'
        - 'pkg:npm/applitools/tunnel-client'
        - 'pkg:npm/applitools/utils'
        # Selecting BSD-3-Clause licensing terms for node-forge to ensure compatibility with Apache
        - 'pkg:npm/node-forge@1.3.1'
@@ -21,7 +21,7 @@ jobs:
    steps:
      - id: set_matrix
        run: |
          MATRIX_CONFIG=$(if [ "${{ github.event_name }}" == "pull_request" ]; then echo '["ci"]'; else echo '["dev", "lean", "py310", "websocket", "dockerize"]'; fi)
          MATRIX_CONFIG=$(if [ "${{ github.event_name }}" == "pull_request" ]; then echo '["dev"]'; else echo '["dev", "lean", "py310", "websocket", "dockerize"]'; fi)
          echo "matrix_config=${MATRIX_CONFIG}" >> $GITHUB_OUTPUT
          echo $GITHUB_OUTPUT
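The matrix selection above trades coverage for speed: pull requests build a single target, while pushes build every target. As a sketch, the branching can be modeled as a pure function; the name `matrixConfig` is illustrative, not something in the workflow.

```javascript
// Illustrative model of the matrix selection in the set_matrix step:
// PRs get one build target, pushes get the full target list.
function matrixConfig(eventName) {
  return eventName === 'pull_request'
    ? ['dev']
    : ['dev', 'lean', 'py310', 'websocket', 'dockerize'];
}
```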
@@ -39,34 +39,42 @@ jobs:

    steps:

      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v4
        with:
          persist-credentials: false

      - name: Check for file changes
        id: check
        uses: ./.github/actions/change-detector/
        with:
          token: ${{ secrets.GITHUB_TOKEN }}

      - name: Set up QEMU
        if: steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker
        uses: docker/setup-qemu-action@v3

      - name: Set up Docker Buildx
        if: steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker
        uses: docker/setup-buildx-action@v3

      - name: Try to login to DockerHub
        if: steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker
        continue-on-error: true
        uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USER }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Setup Node Env
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v4
        with:
          persist-credentials: false

      - name: Setup supersetbot
        if: steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker
        uses: ./.github/actions/setup-supersetbot/

      - name: Build Docker Image
        if: steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker
        shell: bash
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          # Single platform builds in pull_request context to speed things up
          if [ "${{ github.event_name }}" = "push" ]; then
@@ -31,8 +31,8 @@ jobs:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: "16"
          registry-url: "https://registry.npmjs.org"
          node-version: "18"
          registry-url: 'https://registry.npmjs.org'
      - run: npm ci
      - run: npm run ci:release
        env:
@@ -21,7 +21,7 @@ jobs:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: "16"
          node-version: "18"
          registry-url: 'https://registry.npmjs.org'
      - run: npm ci
      - run: npm test
@@ -10,19 +10,21 @@ on:
jobs:
  superbot-orglabel:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: write
      issues: write
    steps:
      - name: Execute SupersetBot Command
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v4
        with:
          persist-credentials: false

      - name: Setup supersetbot
        uses: ./.github/actions/setup-supersetbot/

      - name: Execute custom Node.js script
      - name: Execute supersetbot orglabel command
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
@@ -1,84 +0,0 @@
# no-op.yml
#
# Purpose:
# This workflow provides a workaround for the "required status checks" feature in GitHub Actions
# when using path-specific conditions in other workflows. Required checks might remain in a "Pending"
# state if the conditions are not met, thus blocking pull requests from being merged.
# This no-op (no operation) workflow provides dummy success statuses for these required jobs when
# the real jobs do not run due to path-specific conditions.
#
# How it works:
# - It defines jobs with the same names as the required jobs in the main workflows.
# - These jobs simply execute a command (`exit 0`) to succeed immediately.
# - When a pull request is created or updated, both this no-op workflow and the main workflows are triggered.
# - If the main workflows' jobs don't run (due to path conditions), these no-op jobs provide successful statuses.
# - If the main workflows' jobs do run and fail, their failure statuses take precedence,
#   ensuring that pull requests are not merged with failing checks.
#
# Usage:
# - Ensure that the job names in this workflow match exactly the names of the corresponding jobs in the main workflows.
# - This workflow should be kept as-is, without path-specific conditions.

name: no-op Checks
on: pull_request

jobs:
  frontend-build:
    runs-on: ubuntu-latest
    steps:
      - name: No-op for frontend-build
        run: |
          echo "This is a no-op step for frontend-build to ensure a successful status."
          exit 0

  pre-commit:
    strategy:
      matrix:
        python-version: ["3.9"]
    runs-on: ubuntu-latest
    steps:
      - name: No-op for pre-commit
        run: |
          echo "This is a no-op step for pre-commit to ensure a successful status."
          exit 0

  python-lint:
    strategy:
      matrix:
        python-version: ["3.9"]
    runs-on: ubuntu-latest
    steps:
      - name: No-op for python-lint
        run: |
          echo "This is a no-op step for python-lint to ensure a successful status."
          exit 0

  test-postgres-hive:
    strategy:
      matrix:
        python-version: ["3.9"]
    runs-on: ubuntu-latest
    steps:
      - name: No-op for test-postgres-hive
        run: |
          echo "This is a no-op step for test-postgres-hive to ensure a successful status when skipped."
          exit 0

  test-postgres-presto:
    strategy:
      matrix:
        python-version: ["3.9"]
    runs-on: ubuntu-latest
    steps:
      - name: No-op for test-postgres-presto
        run: |
          echo "This is a no-op step for test-postgres-presto to ensure a successful status when skipped."
          exit 0

  unit-tests:
    strategy:
      matrix:
        python-version: ["3.9"]
    runs-on: ubuntu-latest
    steps:
      - name: No-op for unit-tests
        run: |
          echo "This is a no-op step for unit-tests to ensure a successful status when skipped."
          exit 0
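The precedence rule the deleted no-op workflow relied on (a real job's failure outranks a no-op job's success under the same check name) can be modeled as a small function. This is an illustrative model of the behavior described in the comments above, not GitHub's actual implementation.

```javascript
// Illustrative model: given all statuses reported under one required check
// name (real job + no-op job), any failure wins, otherwise any success
// satisfies the check, otherwise it stays pending.
function combinedStatus(statuses) {
  if (statuses.includes('failure')) return 'failure';
  if (statuses.includes('success')) return 'success';
  return 'pending';
}
```

So a skipped real job plus the no-op's success yields `success`, while a failing real job yields `failure` even though the no-op succeeded.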
@@ -16,9 +16,6 @@ concurrency:
jobs:
  pre-commit:
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: ["3.9"]
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v4
@@ -29,7 +29,7 @@ jobs:

    strategy:
      matrix:
        node-version: [16]
        node-version: [18]

    steps:
      - uses: actions/checkout@v4
@@ -26,7 +26,7 @@ jobs:
      fail-fast: false
      matrix:
        browser: ["chrome"]
        node: [16]
        node: [18]
    env:
      SUPERSET_ENV: development
      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
@@ -30,7 +30,7 @@ jobs:
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        node: [16]
        node: [18]
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v4
@@ -16,9 +16,6 @@ concurrency:
jobs:
  test-load-examples:
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: ["3.9"]
    env:
      PYTHONPATH: ${{ github.workspace }}
      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
@@ -44,29 +41,27 @@ jobs:
        with:
          persist-credentials: false
          submodules: recursive
      - name: Check if python changes are present
      - name: Check for file changes
        id: check
        env:
          GITHUB_REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh python
        uses: ./.github/actions/change-detector/
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
      - name: Setup Python
        if: steps.check.outputs.python
        uses: ./.github/actions/setup-backend/
        if: steps.check.outcome == 'failure'
      - name: Setup Postgres
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python
        uses: ./.github/actions/cached-dependencies
        with:
          run: setup-postgres
      - name: superset init
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python
        run: |
          pip install -e .
          superset db upgrade
          superset load_test_users
      - name: superset load_examples
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python
        run: |
          # load examples without test data
          superset load_examples --load-big-data
@@ -4,9 +4,11 @@ on:
  push:
    paths:
      - "docs/**"
      - "README.md"
    branches:
      - "master"
      - "[0-9].[0-9]"

  workflow_dispatch: {}

jobs:
  config:
@@ -27,27 +29,40 @@ jobs:
    if: needs.config.outputs.has-secrets
    name: Build & Deploy
    runs-on: "ubuntu-latest"
    defaults:
      run:
        working-directory: docs
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v4
        with:
          persist-credentials: false
          submodules: recursive
      - name: Set up Node.js 16
      - name: Set up Node.js 18
        uses: actions/setup-node@v4
        with:
          node-version: "16"
          node-version: '18'
      - name: Setup Python
        uses: ./.github/actions/setup-backend/
      - uses: actions/setup-java@v4
        with:
          distribution: 'zulu'
          java-version: '21'
      - name: Install Graphviz
        run: sudo apt-get install -y graphviz
      - name: Compute Entity Relationship diagram (ERD)
        env:
          SUPERSET_SECRET_KEY: not-a-secret
        run: |
          python scripts/erd/erd.py
          curl -L http://sourceforge.net/projects/plantuml/files/1.2023.7/plantuml.1.2023.7.jar/download > ~/plantuml.jar
          java -jar ~/plantuml.jar -v -tsvg -r -o "${{ github.workspace }}/docs/static/img/" "${{ github.workspace }}/scripts/erd/erd.puml"
      - name: yarn install
        working-directory: docs
        run: |
          yarn install --check-cache
      - name: yarn build
        working-directory: docs
        run: |
          yarn build
      - name: deploy docs
        if: github.ref == 'refs/heads/master'
        uses: ./.github/actions/github-action-push-to-another-repository
        env:
          API_TOKEN_GITHUB: ${{ secrets.SUPERSET_SITE_BUILD }}
@@ -24,10 +24,10 @@ jobs:
        with:
          persist-credentials: false
          submodules: recursive
      - name: Set up Node.js 16
      - name: Set up Node.js 18
        uses: actions/setup-node@v4
        with:
          node-version: '16'
          node-version: '18'
      - name: yarn install
        run: |
          yarn install --check-cache
@@ -60,47 +60,46 @@ jobs:
          ref: "refs/pull/${{ github.event.number }}/merge"
          persist-credentials: false
          submodules: recursive
      - name: Check if python or frontend changes are present
      - name: Check for file changes
        id: check
        env:
          GITHUB_REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh python frontend
        uses: ./.github/actions/change-detector/
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
      - name: Setup Python
        uses: ./.github/actions/setup-backend/
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python || steps.check.outputs.frontend
      - name: Setup postgres
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: ./.github/actions/cached-dependencies
        with:
          run: setup-postgres
      - name: Import test data
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: ./.github/actions/cached-dependencies
        with:
          run: testdata
      - name: Setup Node.js
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: actions/setup-node@v4
        with:
          node-version: "16"
          node-version: "18"
      - name: Install npm dependencies
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: ./.github/actions/cached-dependencies
        with:
          run: npm-install
      - name: Build javascript packages
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: ./.github/actions/cached-dependencies
        with:
          run: build-instrumented-assets
      - name: Install cypress
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: ./.github/actions/cached-dependencies
        with:
          run: cypress-install
      - name: Run Cypress
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: ./.github/actions/cached-dependencies
        env:
          CYPRESS_BROWSER: ${{ matrix.browser }}
@@ -109,7 +108,7 @@ jobs:
          run: cypress-run-all
      - name: Upload Artifacts
        uses: actions/upload-artifact@v4
        if: failure()
        if: steps.check.outputs.python || steps.check.outputs.frontend
        with:
          name: screenshots
          path: ${{ github.workspace }}/superset-frontend/cypress-base/cypress/screenshots
@@ -5,12 +5,8 @@ on:
    branches:
      - "master"
      - "[0-9].[0-9]"
    paths:
      - "superset-frontend/**"
  pull_request:
    types: [synchronize, opened, reopened, ready_for_review]
    paths:
      - "superset-frontend/**"

# cancel previous workflow jobs for PRs
concurrency:
@@ -28,62 +24,62 @@ jobs:
          submodules: recursive
      - name: Check npm lock file version
        run: ./scripts/ci_check_npm_lock_version.sh ./superset-frontend/package-lock.json
      - name: Check if frontend changes are present
      - name: Check for file changes
        id: check
        env:
          GITHUB_REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh frontend
        uses: ./.github/actions/change-detector/
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
      - name: Setup Node.js
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.frontend
        uses: actions/setup-node@v4
        with:
          node-version: "16"
          node-version: "18"
      - name: Install dependencies
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.frontend
        uses: ./.github/actions/cached-dependencies
        with:
          run: npm-install
      - name: eslint
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.frontend
        working-directory: ./superset-frontend
        run: |
          npm run eslint -- . --quiet
      - name: tsc
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.frontend
        working-directory: ./superset-frontend
        run: |
          npm run type
      - name: prettier
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.frontend
        working-directory: ./superset-frontend
        run: |
          npm run prettier-check
      - name: Build plugins packages
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.frontend
        working-directory: ./superset-frontend
        run: npm run plugins:build
      - name: Build plugins Storybook
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.frontend
        working-directory: ./superset-frontend
        run: npm run plugins:build-storybook
      - name: superset-ui/core coverage
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.frontend
        working-directory: ./superset-frontend
        run: |
          npm run core:cover
      - name: unit tests
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.frontend
        working-directory: ./superset-frontend
        run: |
          npm run test -- --coverage --silent
      # todo: remove this step when fix generator as a project in root jest.config.js
      - name: generator-superset unit tests
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.frontend
        working-directory: ./superset-frontend/packages/generator-superset
        run: npx jest
      - name: Upload code coverage
        if: steps.check.outcome == 'failure'
        working-directory: ./superset-frontend
        run: ../.github/workflows/codecov.sh -c -F javascript
        uses: codecov/codecov-action@v4
        with:
          flags: javascript
          token: ${{ secrets.CODECOV_TOKEN }}
          verbose: true
@@ -23,13 +23,14 @@ jobs:
          fetch-depth: 0

      - name: Set up Helm
        uses: azure/setup-helm@v3
        uses: azure/setup-helm@v4
        with:
          version: v3.5.4

      - uses: actions/setup-python@v5
      - name: Setup Python
        uses: ./.github/actions/setup-backend/
        with:
          python-version: "3.9"
          install-superset: 'false'

      - name: Set up chart-testing
        uses: ./.github/actions/chart-testing-action
@@ -28,7 +28,7 @@ jobs:
          git config user.email "$GITHUB_ACTOR@users.noreply.github.com"

      - name: Install Helm
        uses: azure/setup-helm@v3
        uses: azure/setup-helm@v4
        with:
          version: v3.5.4
@@ -16,9 +16,6 @@ concurrency:
jobs:
  test-mysql:
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: ["3.9"]
    env:
      PYTHONPATH: ${{ github.workspace }}
      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
@@ -32,6 +29,11 @@ jobs:
          MYSQL_ROOT_PASSWORD: root
        ports:
          - 13306:3306
        options: >-
          --health-cmd="mysqladmin ping --silent"
          --health-interval=10s
          --health-timeout=5s
          --health-retries=5
      redis:
        image: redis:7-alpine
        options: --entrypoint redis-server
@@ -43,40 +45,37 @@ jobs:
        with:
          persist-credentials: false
          submodules: recursive
      - name: Check if python changes are present
      - name: Check for file changes
        id: check
        env:
          GITHUB_REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh python
        uses: ./.github/actions/change-detector/
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
      - name: Setup Python
        uses: ./.github/actions/setup-backend/
        if: steps.check.outcome == 'failure'
        with:
          python-version: ${{ matrix.python-version }}
        if: steps.check.outputs.python
      - name: Setup MySQL
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python
        uses: ./.github/actions/cached-dependencies
        with:
          run: |
            setup-mysql
          run: setup-mysql
      - name: Run celery
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python
        run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
      - name: Python integration tests (MySQL)
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python
        run: |
          ./scripts/python_tests.sh
      - name: Upload code coverage
        if: steps.check.outcome == 'failure'
        run: |
          bash .github/workflows/codecov.sh -c -F python -F mysql
        uses: codecov/codecov-action@v4
        with:
          flags: python,mysql
          token: ${{ secrets.CODECOV_TOKEN }}
          verbose: true
  test-postgres:
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: ["3.9", "3.10"]
        python-version: ["current", "next"]
    env:
      PYTHONPATH: ${{ github.workspace }}
      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
@@ -102,41 +101,38 @@ jobs:
        with:
          persist-credentials: false
          submodules: recursive
      - name: Check if python changes are present
      - name: Check for file changes
        id: check
        env:
          GITHUB_REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh python
        uses: ./.github/actions/change-detector/
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
      - name: Setup Python
        uses: ./.github/actions/setup-backend/
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python
        with:
          python-version: ${{ matrix.python-version }}
      - name: Setup Postgres
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python
        uses: ./.github/actions/cached-dependencies
        with:
          run: |
            setup-postgres
      - name: Run celery
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python
        run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
      - name: Python integration tests (PostgreSQL)
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python
        run: |
          ./scripts/python_tests.sh
      - name: Upload code coverage
        if: steps.check.outcome == 'failure'
        run: |
          bash .github/workflows/codecov.sh -c -F python -F postgres
        uses: codecov/codecov-action@v4
        with:
          flags: python,postgres
          token: ${{ secrets.CODECOV_TOKEN }}
          verbose: true

  test-sqlite:
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: ["3.9"]
    env:
      PYTHONPATH: ${{ github.workspace }}
      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
@@ -156,33 +152,31 @@ jobs:
        with:
          persist-credentials: false
          submodules: recursive
      - name: Check if python changes are present
      - name: Check for file changes
        id: check
        env:
          GITHUB_REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh python
        uses: ./.github/actions/change-detector/
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
      - name: Setup Python
        uses: ./.github/actions/setup-backend/
        if: steps.check.outcome == 'failure'
        with:
          python-version: ${{ matrix.python-version }}
        if: steps.check.outputs.python
      - name: Install dependencies
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python
        uses: ./.github/actions/cached-dependencies
        with:
          run: |
            # sqlite needs this working directory
            mkdir ${{ github.workspace }}/.temp
      - name: Run celery
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python
        run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
      - name: Python integration tests (SQLite)
        if: steps.check.outcome == 'failure'
        if: steps.check.outputs.python
        run: |
          ./scripts/python_tests.sh
      - name: Upload code coverage
        if: steps.check.outcome == 'failure'
        run: |
          bash .github/workflows/codecov.sh -c -F python -F sqlite
        uses: codecov/codecov-action@v4
        with:
          flags: python,sqlite
          token: ${{ secrets.CODECOV_TOKEN }}
          verbose: true
@@ -6,12 +6,8 @@ on:
    branches:
      - "master"
      - "[0-9].[0-9]"
    paths:
      - "superset/**"
  pull_request:
    types: [synchronize, opened, reopened, ready_for_review]
    paths:
      - "superset/**"

# cancel previous workflow jobs for PRs
concurrency:
@@ -21,46 +17,37 @@ concurrency:
jobs:
  python-lint:
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: ["3.9"]
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v4
        with:
          persist-credentials: false
          submodules: recursive
      - name: Check if python changes are present
      - name: Check for file changes
        id: check
        env:
          GITHUB_REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh python
        uses: ./.github/actions/change-detector/
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
      - name: Setup Python
        uses: ./.github/actions/setup-backend/
        if: steps.check.outcome == 'failure'
        with:
          python-version: ${{ matrix.python-version }}
      - name: pylint
        if: steps.check.outcome == 'failure'
        # `-j 0` runs Pylint in parallel
        run: pylint -j 0 superset
        if: steps.check.outputs.python

  babel-extract:
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: ["3.9"]
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v4
        with:
          persist-credentials: false
          submodules: recursive
      - name: Setup Python
        uses: ./.github/actions/setup-backend/
      - name: Check for file changes
        id: check
        uses: ./.github/actions/change-detector/
        with:
          python-version: ${{ matrix.python-version }}
          token: ${{ secrets.GITHUB_TOKEN }}
      - name: Setup Python
        if: steps.check.outputs.python
        uses: ./.github/actions/setup-backend/
      - name: Test babel extraction
        if: steps.check.outputs.python
        run: flask fab babel-extract --target superset/translations --output superset/translations/messages.pot --config superset/translations/babel.cfg -k _,__,t,tn,tct
@@ -6,12 +6,8 @@ on:
   branches:
     - "master"
     - "[0-9].[0-9]"
-  paths:
-    - "superset/**"
 pull_request:
   types: [synchronize, opened, reopened, ready_for_review]
-  paths:
-    - "superset/**"

 # cancel previous workflow jobs for PRs
 concurrency:

@@ -21,9 +17,6 @@ concurrency:
 jobs:
   test-postgres-presto:
     runs-on: ubuntu-20.04
     strategy:
       matrix:
         python-version: ["3.9"]
     env:
       PYTHONPATH: ${{ github.workspace }}
       SUPERSET_CONFIG: tests.integration_tests.superset_test_config

@@ -59,38 +52,37 @@ jobs:
         with:
           persist-credentials: false
           submodules: recursive
-      - name: Check if python changes are present
+      - name: Check for file changes
         id: check
-        env:
-          GITHUB_REPO: ${{ github.repository }}
-          PR_NUMBER: ${{ github.event.pull_request.number }}
-        continue-on-error: true
-        run: ./scripts/ci_check_no_file_changes.sh python
+        uses: ./.github/actions/change-detector/
+        with:
+          token: ${{ secrets.GITHUB_TOKEN }}
       - name: Setup Python
         uses: ./.github/actions/setup-backend/
-        if: steps.check.outcome == 'failure'
+        if: steps.check.outputs.python == 'true'
       - name: Setup Postgres
-        if: steps.check.outcome == 'failure'
+        if: steps.check.outputs.python
         uses: ./.github/actions/cached-dependencies
         with:
-          run: setup-postgres
+          run: |
+            echo "${{ steps.check.outputs.python }}"
+            setup-postgres
       - name: Run celery
-        if: steps.check.outcome == 'failure'
+        if: steps.check.outputs.python
         run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
       - name: Python unit tests (PostgreSQL)
-        if: steps.check.outcome == 'failure'
+        if: steps.check.outputs.python
         run: |
           ./scripts/python_tests.sh -m 'chart_data_flow or sql_json_flow'
       - name: Upload code coverage
-        if: steps.check.outcome == 'failure'
-        run: |
-          bash .github/workflows/codecov.sh -c -F python -F presto
+        uses: codecov/codecov-action@v4
+        with:
+          flags: python,presto
+          token: ${{ secrets.CODECOV_TOKEN }}
+          verbose: true

   test-postgres-hive:
     runs-on: ubuntu-20.04
     strategy:
       matrix:
         python-version: ["3.9"]
     env:
       PYTHONPATH: ${{ github.workspace }}
       SUPERSET_CONFIG: tests.integration_tests.superset_test_config

@@ -118,38 +110,38 @@ jobs:
         with:
           persist-credentials: false
           submodules: recursive
-      - name: Check if python changes are present
+      - name: Check for file changes
         id: check
-        env:
-          GITHUB_REPO: ${{ github.repository }}
-          PR_NUMBER: ${{ github.event.pull_request.number }}
-        continue-on-error: true
-        run: ./scripts/ci_check_no_file_changes.sh python
+        uses: ./.github/actions/change-detector/
+        with:
+          token: ${{ secrets.GITHUB_TOKEN }}
       - name: Create csv upload directory
-        if: steps.check.outcome == 'failure'
+        if: steps.check.outputs.python
         run: sudo mkdir -p /tmp/.superset/uploads
       - name: Give write access to the csv upload directory
-        if: steps.check.outcome == 'failure'
+        if: steps.check.outputs.python
         run: sudo chown -R $USER:$USER /tmp/.superset
       - name: Start hadoop and hive
-        if: steps.check.outcome == 'failure'
+        if: steps.check.outputs.python
         run: docker compose -f scripts/databases/hive/docker-compose.yml up -d
       - name: Setup Python
         uses: ./.github/actions/setup-backend/
-        if: steps.check.outcome == 'failure'
+        if: steps.check.outputs.python
       - name: Setup Postgres
-        if: steps.check.outcome == 'failure'
+        if: steps.check.outputs.python
         uses: ./.github/actions/cached-dependencies
         with:
           run: setup-postgres
       - name: Run celery
-        if: steps.check.outcome == 'failure'
+        if: steps.check.outputs.python
         run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
       - name: Python unit tests (PostgreSQL)
-        if: steps.check.outcome == 'failure'
+        if: steps.check.outputs.python
         run: |
           ./scripts/python_tests.sh -m 'chart_data_flow or sql_json_flow'
       - name: Upload code coverage
-        if: steps.check.outcome == 'failure'
-        run: |
-          bash .github/workflows/codecov.sh -c -F python -F hive
+        uses: codecov/codecov-action@v4
+        with:
+          flags: python,hive
+          token: ${{ secrets.CODECOV_TOKEN }}
+          verbose: true

@@ -6,18 +6,8 @@ on:
   branches:
     - "master"
     - "[0-9].[0-9]"
-  paths:
-    - "superset/**"
-    - "requirements/**"
-    - "tests/unit_tests/**"
-    - "scripts/**"
 pull_request:
   types: [synchronize, opened, reopened, ready_for_review]
-  paths:
-    - "superset/**"
-    - "requirements/**"
-    - "tests/unit_tests/**"
-    - "scripts/**"

 # cancel previous workflow jobs for PRs
 concurrency:

@@ -29,7 +19,7 @@ jobs:
     runs-on: ubuntu-20.04
     strategy:
       matrix:
-        python-version: ["3.9", "3.10"]
+        python-version: ["current", "next"]
     env:
       PYTHONPATH: ${{ github.workspace }}
     steps:

@@ -38,26 +28,26 @@ jobs:
         with:
           persist-credentials: false
           submodules: recursive
-      - name: Check if python changes are present
+      - name: Check for file changes
         id: check
-        env:
-          GITHUB_REPO: ${{ github.repository }}
-          PR_NUMBER: ${{ github.event.pull_request.number }}
-        continue-on-error: true
-        run: ./scripts/ci_check_no_file_changes.sh python
+        uses: ./.github/actions/change-detector/
+        with:
+          token: ${{ secrets.GITHUB_TOKEN }}
       - name: Setup Python
         uses: ./.github/actions/setup-backend/
-        if: steps.check.outcome == 'failure'
+        if: steps.check.outputs.python
         with:
           python-version: ${{ matrix.python-version }}
       - name: Python unit tests
-        if: steps.check.outcome == 'failure'
+        if: steps.check.outputs.python
         env:
           SUPERSET_TESTENV: true
           SUPERSET_SECRET_KEY: not-a-secret
         run: |
           pytest --durations-min=0.5 --cov-report= --cov=superset ./tests/common ./tests/unit_tests --cache-clear
       - name: Upload code coverage
-        if: steps.check.outcome == 'failure'
-        run: |
-          bash .github/workflows/codecov.sh -c -F python -F unit
+        uses: codecov/codecov-action@v4
+        with:
+          flags: python,unit
+          token: ${{ secrets.CODECOV_TOKEN }}
+          verbose: true

@@ -14,7 +14,7 @@ concurrency:
   cancel-in-progress: true

 jobs:
-  frontend-check:
+  frontend-check-translations:
     runs-on: ubuntu-20.04
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"

@@ -22,33 +22,46 @@ jobs:
         with:
           persist-credentials: false
           submodules: recursive

+      - name: Check for file changes
+        id: check
+        uses: ./.github/actions/change-detector/
+        with:
+          token: ${{ secrets.GITHUB_TOKEN }}

       - name: Setup Node.js
+        if: steps.check.outputs.frontend
         uses: actions/setup-node@v4
         with:
-          node-version: "16"
+          node-version: '18'
       - name: Install dependencies
+        if: steps.check.outputs.frontend
         uses: ./.github/actions/cached-dependencies
         with:
           run: npm-install
       - name: lint
+        if: steps.check.outputs.frontend
         working-directory: ./superset-frontend
         run: |
           npm run check-translation

   babel-extract:
     runs-on: ubuntu-20.04
     strategy:
       matrix:
         python-version: ["3.9"]
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
         uses: actions/checkout@v4
         with:
           persist-credentials: false
           submodules: recursive
-      - name: Setup Python
-        uses: ./.github/actions/setup-backend/
+      - name: Check for file changes
+        id: check
+        uses: ./.github/actions/change-detector/
         with:
-          python-version: ${{ matrix.python-version }}
+          token: ${{ secrets.GITHUB_TOKEN }}

+      - name: Setup Python
+        if: steps.check.outputs.python
+        uses: ./.github/actions/setup-backend/
       - name: Test babel extraction
+        if: steps.check.outputs.python
         run: ./scripts/babel_update.sh

@@ -19,25 +19,28 @@ jobs:
     if: >
       github.event_name == 'workflow_dispatch' ||
       (github.event_name == 'issue_comment' && contains(github.event.comment.body, '@supersetbot'))
     permissions:
       contents: read
       pull-requests: write
       issues: write
     steps:
       - name: Quickly add thumbs up!
-        uses: actions/github-script@v5
+        if: github.event_name == 'issue_comment' && contains(github.event.comment.body, '@supersetbot')
+        uses: actions/github-script@v7
         with:
           script: |
             const [owner, repo] = process.env.GITHUB_REPOSITORY.split('/')
             await github.rest.reactions.createForIssueComment({
               owner,
               repo,
-              comment_id: ${{ github.event.comment.id }},
+              comment_id: context.payload.comment.id,
               content: '+1'
             });
       - name: Execute SupersetBot Command
+        uses: actions/setup-node@v4
+        with:
+          node-version: '20'

-      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
+      - name: "Checkout ( ${{ github.sha }} )"
         uses: actions/checkout@v4
         with:
           persist-credentials: false

+      - name: Setup supersetbot
+        uses: ./.github/actions/setup-supersetbot/

@@ -48,16 +51,6 @@ jobs:
           GITHUB_ACTOR: ${{ github.actor }}
           GITHUB_REPOSITORY: ${{ github.repository }}
           GITHUB_ISSUE_NUMBER: ${{ github.event.issue.number }}
-          COMMENT_BODY: ${{ github.event.comment.body }}
-          INPUT_COMMENT_BODY: ${{ github.event.inputs.comment_body }}
+          COMMENT_BODY: ${{ github.event.comment.body }}${{ github.event.inputs.comment_body }}
         run: |
-          cat <<EOF > script.js
-          const run = async () => {
-            const { runCommandFromGithubAction } = await import('supersetbot');
-            const cmd = process.env.COMMENT_BODY || process.env.INPUT_COMMENT_BODY;
-            console.log("Executing: ", cmd);
-            await runCommandFromGithubAction(cmd);
-          };
-          run().catch(console.error);
-          EOF
-          node script.js
+          supersetbot run "$COMMENT_BODY"

@@ -1,4 +1,4 @@
-name: Docker Publish Release
+name: Publish a Release

 on:
   release:

@@ -51,21 +51,28 @@ jobs:
       - name: Set up Docker Buildx
         uses: docker/setup-buildx-action@v3

+      - name: Setup Node Env
+        uses: actions/setup-node@v4
+        with:
+          node-version: '20'

       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
         uses: actions/checkout@v4
         with:
           persist-credentials: false
+          tags: true
+          fetch-depth: 0

+      - name: Setup supersetbot
+        uses: ./.github/actions/setup-supersetbot/

       - name: Try to login to DockerHub
         continue-on-error: true
         uses: docker/login-action@v3
         with:
           username: ${{ secrets.DOCKERHUB_USER }}
           password: ${{ secrets.DOCKERHUB_TOKEN }}

       - name: Execute custom Node.js script
         env:
           DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
           DOCKERHUB_TOKEN: ${{ secrets.DOCKERHUB_TOKEN }}
           GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
         run: |
           RELEASE="${{ github.event.release.tag_name }}"
           FORCE_LATEST=""

@@ -76,10 +83,7 @@ jobs:
           if [ "${{ github.event.inputs.force-latest }}" = "true" ]; then
             FORCE_LATEST="--force-latest"
           fi
-          # build_docker.py may not exist on that SHA, let's switcharoo to /tmp
-          cp ./scripts/build_docker.py /tmp
-          git checkout "${{ github.event.inputs.git-ref }}"
-          cp /tmp/build_docker.py scripts/
+          EVENT="release"
           fi

@@ -89,3 +93,20 @@ jobs:
             --context-ref "$RELEASE" $FORCE_LATEST \
             --platform "linux/arm64" \
             --platform "linux/amd64"
+
+      # Going back on original branch to allow "post" GHA operations
+      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
+        uses: actions/checkout@v4
+        with:
+          persist-credentials: false
+
+      - name: Label the PRs with the right release-related labels
+        env:
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+        run: |
+          RELEASE="${{ github.event.release.tag_name }}"
+          if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
+            # in the case of a manually-triggered run, read release from input
+            RELEASE="${{ github.event.inputs.release }}"
+          fi
+          supersetbot release-label $RELEASE

@@ -32,7 +32,7 @@ jobs:
       - name: Set up Node.js
         uses: actions/setup-node@v4
         with:
-          node-version: "16"
+          node-version: '18'

       - name: Install Dependencies
         run: npm install

@@ -16,6 +16,9 @@ concurrency:

 jobs:
   update-lock-file:
+    permissions:
+      contents: write
+      pull-requests: write
     runs-on: ubuntu-latest
     if: >
       (github.event_name == 'pull_request' && github.event.pull_request.user.login == 'dependabot[bot]') ||

@@ -32,13 +35,15 @@ jobs:
       - name: Set up Node.js
         uses: actions/setup-node@v4
         with:
-          node-version: '16'
+          node-version: '18'

       - name: Install Dependencies and Update Lock File
         run: |
           npm install

       - name: Commit and Push Changes
+        env:
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
         run: |
           git config user.name "GitHub-Actions[bot]"
           git config user.email "github-actions[bot]@users.noreply.github.com"

@@ -112,3 +112,4 @@ messages.mo
 docker/requirements-local.txt

 cache/
+docker/*local*

@@ -19,24 +19,6 @@ repos:
     rev: v0.2.2
     hooks:
       - id: auto-walrus
-  - repo: https://github.com/asottile/pyupgrade
-    rev: v3.4.0
-    hooks:
-      - id: pyupgrade
-        args:
-          - --py39-plus
-  - repo: https://github.com/hadialqattan/pycln
-    rev: v2.1.2
-    hooks:
-      - id: pycln
-        args:
-          - --disable-all-dunder-policy
-          - --exclude=superset/config.py
-          - --extend-exclude=tests/integration_tests/superset_test_config.*.py
-  - repo: https://github.com/PyCQA/isort
-    rev: 5.12.0
-    hooks:
-      - id: isort
   - repo: https://github.com/pre-commit/mirrors-mypy
     rev: v1.3.0
     hooks:

@@ -65,18 +47,13 @@ repos:
     hooks:
       - id: check-docstring-first
       - id: check-added-large-files
-        exclude: \.(geojson)$
+        exclude: ^.*\.(geojson)$|^docs/static/img/screenshots/.*
      - id: check-yaml
         exclude: ^helm/superset/templates/
       - id: debug-statements
       - id: end-of-file-fixer
       - id: trailing-whitespace
         args: ["--markdown-linebreak-ext=md"]
-  - repo: https://github.com/psf/black
-    rev: 23.1.0
-    hooks:
-      - id: black
-        language_version: python3
   - repo: https://github.com/pre-commit/mirrors-prettier
     rev: v3.1.0 # Use the sha or tag you want to point at
     hooks:

@@ -94,3 +71,23 @@ repos:
     hooks:
       - id: helm-docs
         files: helm
+  - repo: https://github.com/astral-sh/ruff-pre-commit
+    rev: v0.4.0
+    hooks:
+      - id: ruff
+        args: [ --fix ]
+      - id: ruff-format
+  - repo: local
+    hooks:
+      - id: pylint
+        name: pylint
+        entry: pylint
+        language: system
+        types: [python]
+        exclude: ^(tests/|superset/migrations/|scripts/|RELEASING/|docker/)
+        args:
+          [
+            "-rn", # Only display messages
+            "-sn", # Don't display the score
+            "--rcfile=.pylintrc",
+          ]

@@ -77,6 +77,7 @@ disable=
     cyclic-import, # re-enable once this no longer raises false positives
     missing-docstring,
     duplicate-code,
+    line-too-long,
     unspecified-encoding,
     too-many-instance-attributes # re-enable once this no longer raises false positives

@@ -171,7 +172,7 @@ max-nested-blocks=5
 [FORMAT]

 # Maximum number of characters on a single line.
-max-line-length=90
+max-line-length=100

 # Regexp for a line that is allowed to be longer than the limit.
 ignore-long-lines=^\s*(# )?<?https?://\S+>?$

@@ -64,5 +64,10 @@ temporary_superset_ui/*
 # docs overrides for third party logos we don't have the rights to
 google-big-query.svg
 google-sheets.svg
 ibm-db2.svg
 postgresql.svg
+snowflake.svg
+
+# docs-related
+erd.puml
+erd.svg

@@ -0,0 +1,63 @@
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl"?>
<rdf:RDF xml:lang="en"
         xmlns="http://usefulinc.com/ns/doap#"
         xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:asfext="http://projects.apache.org/ns/asfext#"
         xmlns:foaf="http://xmlns.com/foaf/0.1/">
  <!--
    Licensed to the Apache Software Foundation (ASF) under one or more
    contributor license agreements. See the NOTICE file distributed with
    this work for additional information regarding copyright ownership.
    The ASF licenses this file to You under the Apache License, Version 2.0
    (the "License"); you may not use this file except in compliance with
    the License. You may obtain a copy of the License at

        https://www.apache.org/licenses/LICENSE-2.0

    Unless required by applicable law or agreed to in writing, software
    distributed under the License is distributed on an "AS IS" BASIS,
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    See the License for the specific language governing permissions and
    limitations under the License.
  -->
  <Project rdf:about="https://superset.apache.org">
    <created>2024-04-10</created>
    <license rdf:resource="https://spdx.org/licenses/Apache-2.0" />
    <name>Apache Superset</name>
    <homepage rdf:resource="https://superset.apache.org" />
    <asfext:pmc rdf:resource="https://superset.apache.org" />
    <shortdesc>Apache Superset™ is an open-source modern data exploration and visualization platform.</shortdesc>
    <description>Superset is a fast, lightweight, intuitive, business intelligence platform. Loaded with options, Superset makes it easy for users of all skill sets to explore and visualize their data, from simple line charts to highly detailed geospatial charts.

      * Powerful yet easy to use:
        Superset makes it easy to explore your data, using either our simple no-code viz builder or state-of-the-art SQL IDE.

      * Integrates with modern databases
        Superset can connect to any SQL-based databases including modern cloud-native databases and engines at petabyte scale.

      * Modern architecture
        Superset is lightweight and highly scalable, leveraging the power of your existing data infrastructure without requiring yet another ingestion layer.

      * Rich visualizations and dashboards
        Superset ships with 40+ pre-installed visualization types. Our plug-in architecture makes it easy to build custom visualizations.

    </description>
    <bug-database rdf:resource="https://github.com/apache/superset/issues" />
    <mailing-list rdf:resource="https://lists.apache.org/list.html?dev@superset.apache.org" />
    <download-page rdf:resource="https://dist.apache.org/repos/dist/release/superset/" />
    <programming-language>JavaScript</programming-language>
    <programming-language>Python</programming-language>
    <programming-language>Typescript</programming-language>
    <category rdf:resource="https://projects.apache.org/category/big-data" />
    <category rdf:resource="https://projects.apache.org/category/database" />
    <category rdf:resource="https://projects.apache.org/category/data-engineering" />
    <category rdf:resource="https://projects.apache.org/category/geospatial" />
    <repository>
      <GitRepository>
        <location rdf:resource="https://github.com/apache/superset.git"/>
        <browse rdf:resource="https://github.com/apache/superset/"/>
      </GitRepository>
    </repository>
  </Project>
</rdf:RDF>

@@ -38,3 +38,5 @@ under the License.
 - [3.0.4](./CHANGELOG/3.0.4.md)
 - [3.1.0](./CHANGELOG/3.1.0.md)
 - [3.1.1](./CHANGELOG/3.1.1.md)
+- [3.1.2](./CHANGELOG/3.1.2.md)
+- [4.0.0](./CHANGELOG/4.0.0.md)

@@ -0,0 +1,93 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

## Change Log

### 3.1.2 (Thu Mar 28 11:32:00 2024 -0300)

**Fixes**

- [#27706](https://github.com/apache/superset/pull/27706) fix: Select onChange is fired when the same item is selected in single mode (@michael-s-molina)
- [#27744](https://github.com/apache/superset/pull/27744) fix: reduce alert error to warning (@eschutho)
- [#27644](https://github.com/apache/superset/pull/27644) fix: Provide more inclusive error handling for saved queries (@john-bodley)
- [#27646](https://github.com/apache/superset/pull/27646) fix: Leverage actual database for rendering Jinjarized SQL (@john-bodley)
- [#27636](https://github.com/apache/superset/pull/27636) fix(sqllab): unable to remove table (@justinpark)
- [#27022](https://github.com/apache/superset/pull/27022) fix(Chart Annotation modal): Table and Superset annotation options will paginate, exceeding previous max limit 100 (@rtexelm)
- [#27552](https://github.com/apache/superset/pull/27552) fix(AlertReports): defaulting grace period to undefined (@fisjac)
- [#27551](https://github.com/apache/superset/pull/27551) fix(AlertReports): clearing custom_width when disabled (@fisjac)
- [#27601](https://github.com/apache/superset/pull/27601) fix: Persist query params appended to permalink (@kgabryje)
- [#27470](https://github.com/apache/superset/pull/27470) fix(sql_parse): Ensure table extraction handles Jinja templating (@john-bodley)
- [#19595](https://github.com/apache/superset/pull/19595) fix: Volatile datasource ordering in dashboard export (@pnadolny13)
- [#27613](https://github.com/apache/superset/pull/27613) fix(Dashboard): Add editMode conditional for translate3d fix on charts to allow intended Fullscreen (@rtexelm)
- [#27388](https://github.com/apache/superset/pull/27388) fix(utils): fix off-by-one error in how rolling window's min_periods truncates dataframe (@sfirke)
- [#27577](https://github.com/apache/superset/pull/27577) fix: sqlglot SQL Server (@betodealmeida)
- [#27576](https://github.com/apache/superset/pull/27576) fix: bump sqlglot to support materialized CTEs (@betodealmeida)
- [#27567](https://github.com/apache/superset/pull/27567) fix(db_engine_specs): Update convert_dttm to work correctly with CrateDB (@hlcianfagna)
- [#27605](https://github.com/apache/superset/pull/27605) fix: Skips Hive tests that are blocking PRs (@michael-s-molina)
- [#27566](https://github.com/apache/superset/pull/27566) fix: guest queries (@betodealmeida)
- [#27464](https://github.com/apache/superset/pull/27464) fix: pass valid SQL to SM (@betodealmeida)
- [#26748](https://github.com/apache/superset/pull/26748) fix: `improve _extract_tables_from_sql` (@betodealmeida)
- [#27260](https://github.com/apache/superset/pull/27260) fix(alerts/reports): implementing custom_width as an Antd number input (@fisjac)
- [#27487](https://github.com/apache/superset/pull/27487) fix(postprocessing): resample with holes (@villebro)
- [#27484](https://github.com/apache/superset/pull/27484) fix: check if guest user modified query (@betodealmeida)
- [#27471](https://github.com/apache/superset/pull/27471) fix(webpack): remove double-dotted file extensions in webpack config (@rusackas)
- [#27411](https://github.com/apache/superset/pull/27411) fix(dashboard): Only fetch CSS templates for dashboard header menu when in edit mode (@mskelton)
- [#27262](https://github.com/apache/superset/pull/27262) fix(Alerts & Reports): Fixing bug that resets cron value to default when empty (@fisjac)
- [#27315](https://github.com/apache/superset/pull/27315) fix(deps): resolving canvg and html2canvas module not found (@fisjac)
- [#27403](https://github.com/apache/superset/pull/27403) fix: missing shared color in mixed timeseries (@justinpark)
- [#27391](https://github.com/apache/superset/pull/27391) fix(sqllab): Close already removed tab (@justinpark)
- [#27364](https://github.com/apache/superset/pull/27364) fix(API): Updating assets via the API should preserve ownership configuration (@Vitor-Avila)
- [#27395](https://github.com/apache/superset/pull/27395) fix: improve explore REST api validations (@dpgaspar)
- [#26205](https://github.com/apache/superset/pull/26205) fix(docker): Remove race condition when building image (@alekseyolg)
- [#27366](https://github.com/apache/superset/pull/27366) fix: Results section in Explore shows an infinite spinner (@michael-s-molina)
- [#27187](https://github.com/apache/superset/pull/27187) fix: numexpr to fix CVE-2023-39631 (2.8.4 => 2.9.0) (@nigzak)
- [#27360](https://github.com/apache/superset/pull/27360) fix: Heatmap numeric sorting (@michael-s-molina)
- [#27308](https://github.com/apache/superset/pull/27308) fix(dashboard): table chart drag preview overflowing container (@rtexelm)
- [#27295](https://github.com/apache/superset/pull/27295) fix(sqllab): invalid dump sql shown after closing tab (@justinpark)
- [#27285](https://github.com/apache/superset/pull/27285) fix(plugin-chart-echarts): calculate Gauge Chart intervals correctly when min value is set (@goto-loop)
- [#27307](https://github.com/apache/superset/pull/27307) fix: Incorrect data type on import page (@michael-s-molina)
- [#27291](https://github.com/apache/superset/pull/27291) fix: Data zoom with horizontal orientation (@michael-s-molina)
- [#27273](https://github.com/apache/superset/pull/27273) fix: Navigating to an invalid page index in lists (@michael-s-molina)
- [#27271](https://github.com/apache/superset/pull/27271) fix: Inoperable dashboard filter slider when range is <= 1 (@michael-s-molina)
- [#27154](https://github.com/apache/superset/pull/27154) fix(import-datasources): Use "admin" user as default for importing datasources (@ddxv)
- [#27258](https://github.com/apache/superset/pull/27258) fix: Sorting charts/dashboards makes the applied filters ineffective (@michael-s-molina)
- [#27213](https://github.com/apache/superset/pull/27213) fix(trino): bumping trino to fix hudi schema fetching (@rusackas)
- [#27236](https://github.com/apache/superset/pull/27236) fix(reports): fixing unit test (@fisjac)
- [#27217](https://github.com/apache/superset/pull/27217) fix(sqlglot): Address regressions introduced in #26476 (@john-bodley)
- [#27233](https://github.com/apache/superset/pull/27233) fix: bump FAB to 4.4.1 (perf issue) (@dpgaspar)
- [#27167](https://github.com/apache/superset/pull/27167) fix: setting important lower bounds versions on requirements (@dpgaspar)
- [#27215](https://github.com/apache/superset/pull/27215) fix: no limit in SELECT \* for TOP dbs (@betodealmeida)
- [#27191](https://github.com/apache/superset/pull/27191) fix: Failed to execute importScripts on worker-css (@michael-s-molina)
- [#27181](https://github.com/apache/superset/pull/27181) fix(sqllab): typeahead search is broken in db selector (@justinpark)
- [#27161](https://github.com/apache/superset/pull/27161) fix(ci): mypy pre-commit issues (@dpgaspar)
- [#27135](https://github.com/apache/superset/pull/27135) fix: Duplicated toast messages (@michael-s-molina)
- [#27132](https://github.com/apache/superset/pull/27132) fix: Plain error message when visiting a dashboard via permalink without permissions (@michael-s-molina)
- [#22840](https://github.com/apache/superset/pull/22840) fix(pivot-table-v2): Added forgotten translation pivot table v2 (@Always-prog)
- [#27128](https://github.com/apache/superset/pull/27128) fix: RLS modal overflow (@michael-s-molina)
- [#27112](https://github.com/apache/superset/pull/27112) fix: gevent upgrade to 23.9.1 (@dpgaspar)
- [#27124](https://github.com/apache/superset/pull/27124) fix: bump grpcio, urllib3 and paramiko (@dpgaspar)
- [#27113](https://github.com/apache/superset/pull/27113) fix: upgrade cryptography to major 42 (@dpgaspar)
- [#27106](https://github.com/apache/superset/pull/27106) fix: Timeseries Y-axis format with contribution mode (@michael-s-molina)

**Others**

- [#27281](https://github.com/apache/superset/pull/27281) chore: bump cryptography minimum to 42.0.4 (@sadpandajoe)
- [#27232](https://github.com/apache/superset/pull/27232) chore: Removes Chromatic workflow and dependencies (@michael-s-molina)
- [#27159](https://github.com/apache/superset/pull/27159) chore: bump FAB to 4.4.0 (@dpgaspar)
- [#27129](https://github.com/apache/superset/pull/27129) chore: lower cryptography min version to 41.0.2 (@sadpandajoe)

@ -0,0 +1,472 @@
|
|||
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

## Change Log

### 4.0 (Mon Apr 1 10:04:00 2024 -0500)

**Database Migrations**

- [#27119](https://github.com/apache/superset/pull/27119) refactor: Updates some database columns to MediumText (@michael-s-molina)
- [#27029](https://github.com/apache/superset/pull/27029) chore: Add granular permissions for actions in Dashboard (@geido)
- [#26654](https://github.com/apache/superset/pull/26654) chore: add unique constraint to tagged_objects (@mistercrunch)
- [#26377](https://github.com/apache/superset/pull/26377) refactor: Removes the deprecated redirect endpoint (@michael-s-molina)
- [#26328](https://github.com/apache/superset/pull/26328) refactor: Removes the Filter Box code (@michael-s-molina)
- [#26350](https://github.com/apache/superset/pull/26350) refactor: Migrates legacy Sunburst charts to ECharts and removes legacy code (@michael-s-molina)
- [#26369](https://github.com/apache/superset/pull/26369) refactor: Removes the filters set feature (@michael-s-molina)
- [#26416](https://github.com/apache/superset/pull/26416) fix: improve performance on reports log queries (@dpgaspar)
- [#26290](https://github.com/apache/superset/pull/26290) feat(echarts-funnel): Implement % calculation type (@kgabryje)

**Features**

- [#27159](https://github.com/apache/superset/pull/27159) feat: bump FAB to 4.4.0 (@dpgaspar)
- [#26202](https://github.com/apache/superset/pull/26202) feat(Alerts and Reports): Modal redesign (@rtexelm)
- [#26907](https://github.com/apache/superset/pull/26907) feat(storybook): Co-habitating/Upgrading Storybooks to v7 (dependency madness ensues) (@rusackas)
- [#27092](https://github.com/apache/superset/pull/27092) feat(plugins): Tooltips on BigNumber with Time Comparison chart (@Antonio-RiveroMartnez)
- [#27052](https://github.com/apache/superset/pull/27052) feat(plugins): Adding colors to BigNumber with Time Comparison chart (@Antonio-RiveroMartnez)
- [#27054](https://github.com/apache/superset/pull/27054) feat(plugins): Update custom controls for BigNumber with Time Comparison chart (@Antonio-RiveroMartnez)
- [#27055](https://github.com/apache/superset/pull/27055) feat(docker): allow for docker release builds to be multi-platform (@mistercrunch)
- [#26923](https://github.com/apache/superset/pull/26923) feat: docker image tags documentation + tweaks (@mistercrunch)
- [#26945](https://github.com/apache/superset/pull/26945) feat(ci): kill duplicate CI jobs on PRs (@mistercrunch)
- [#26639](https://github.com/apache/superset/pull/26639) feat(components): Add static class name with button style (@mskelton)
- [#26908](https://github.com/apache/superset/pull/26908) feat: Period over Period Big Number comparison chart (@eschutho)
- [#26912](https://github.com/apache/superset/pull/26912) feat(ci): unleash dependabot on our github actions (@mistercrunch)
- [#26300](https://github.com/apache/superset/pull/26300) feat(maps): Consolidating all country maps (and TS) into the Jupyter notebook workflow. (@rusackas)
- [#26877](https://github.com/apache/superset/pull/26877) feat(ci): add a check to make sure there's no hold label on the PR (@mistercrunch)
- [#26880](https://github.com/apache/superset/pull/26880) feat: configuring an extensible PR auto-labeler (@mistercrunch)
- [#25323](https://github.com/apache/superset/pull/25323) feat(i18n): add ukranian translations (@GlugovGrGlib)
- [#26443](https://github.com/apache/superset/pull/26443) feat: add chart id and dataset id to global logs (@eschutho)
- [#26754](https://github.com/apache/superset/pull/26754) feat: Stop editor scrolling to top (@puridach-w)
- [#26745](https://github.com/apache/superset/pull/26745) feat: auto-label PRs that contain db migrations (@mistercrunch)
- [#26418](https://github.com/apache/superset/pull/26418) feat: global logs context (@eschutho)
- [#26604](https://github.com/apache/superset/pull/26604) feat(celery): upgrade celery and its dependencies packages (@Musa10)
- [#26407](https://github.com/apache/superset/pull/26407) feat: Add ValuePercent option to LABEL TYPE for Pie and Funnel charts (@kainchow)
- [#26278](https://github.com/apache/superset/pull/26278) feat(releasing): adding SHA512 and RSA signature validation script to verify releases (@rusackas)
- [#26011](https://github.com/apache/superset/pull/26011) feat(telemetry): Adding Scarf based telemetry to Superset (@rusackas)
- [#26196](https://github.com/apache/superset/pull/26196) feat(docker): Add ARM builds (@alekseyolg)
- [#26161](https://github.com/apache/superset/pull/26161) feat: Create db_engine_spec ibmi.py (@wAVeckx)

**Fixes**

- [#27706](https://github.com/apache/superset/pull/27706) fix: Select onChange is fired when the same item is selected in single mode (@michael-s-molina)
- [#27763](https://github.com/apache/superset/pull/27763) fix: Removes filter plugins from viz gallery (@michael-s-molina)
- [#27744](https://github.com/apache/superset/pull/27744) fix: reduce alert error to warning (@eschutho)
- [#27558](https://github.com/apache/superset/pull/27558) fix(explore): drag and drop indicator UX (@justinpark)
- [#27644](https://github.com/apache/superset/pull/27644) fix: Provide more inclusive error handling for saved queries (@john-bodley)
- [#27646](https://github.com/apache/superset/pull/27646) fix: Leverage actual database for rendering Jinjarized SQL (@john-bodley)
- [#27550](https://github.com/apache/superset/pull/27550) fix(AlertReports): disabling value when not null option is active (@fisjac)
- [#27636](https://github.com/apache/superset/pull/27636) fix(sqllab): unable to remove table (@justinpark)
- [#27022](https://github.com/apache/superset/pull/27022) fix(Chart Annotation modal): Table and Superset annotation options will paginate, exceeding previous max limit 100 (@rtexelm)
- [#27552](https://github.com/apache/superset/pull/27552) fix(AlertReports): defaulting grace period to undefined (@fisjac)
- [#27551](https://github.com/apache/superset/pull/27551) fix(AlertReports): clearing custom_width when disabled (@fisjac)
- [#27470](https://github.com/apache/superset/pull/27470) fix(sql_parse): Ensure table extraction handles Jinja templating (@john-bodley)
- [#27601](https://github.com/apache/superset/pull/27601) fix: Persist query params appended to permalink (@kgabryje)
- [#27613](https://github.com/apache/superset/pull/27613) fix(Dashboard): Add editMode conditional for translate3d fix on charts to allow intended Fullscreen (@rtexelm)
- [#27388](https://github.com/apache/superset/pull/27388) fix(utils): fix off-by-one error in how rolling window's min_periods truncates dataframe (@sfirke)
- [#19595](https://github.com/apache/superset/pull/19595) fix: Volatile datasource ordering in dashboard export (@pnadolny13)
- [#27577](https://github.com/apache/superset/pull/27577) fix: sqlglot SQL Server (@betodealmeida)
- [#27576](https://github.com/apache/superset/pull/27576) fix: bump sqlglot to support materialized CTEs (@betodealmeida)
- [#27567](https://github.com/apache/superset/pull/27567) fix(db_engine_specs): Update convert_dttm to work correctly with CrateDB (@hlcianfagna)
- [#27605](https://github.com/apache/superset/pull/27605) fix: Skips Hive tests that are blocking PRs (@michael-s-molina)
- [#27566](https://github.com/apache/superset/pull/27566) fix: guest queries (@betodealmeida)
- [#27464](https://github.com/apache/superset/pull/27464) fix: pass valid SQL to SM (@betodealmeida)
- [#26748](https://github.com/apache/superset/pull/26748) fix: `improve _extract_tables_from_sql` (@betodealmeida)
- [#27539](https://github.com/apache/superset/pull/27539) fix(explore): Allow only saved metrics and columns (@justinpark)
- [#27260](https://github.com/apache/superset/pull/27260) fix(alerts/reports): implementing custom_width as an Antd number input (@fisjac)
- [#27487](https://github.com/apache/superset/pull/27487) fix(postprocessing): resample with holes (@villebro)
- [#27484](https://github.com/apache/superset/pull/27484) fix: check if guest user modified query (@betodealmeida)
- [#27471](https://github.com/apache/superset/pull/27471) fix(webpack): remove double-dotted file extensions in webpack config (@rusackas)
- [#27186](https://github.com/apache/superset/pull/27186) fix: SSH Tunnel configuration settings (@geido)
- [#27411](https://github.com/apache/superset/pull/27411) fix(dashboard): Only fetch CSS templates for dashboard header menu when in edit mode (@mskelton)
- [#27315](https://github.com/apache/superset/pull/27315) fix(deps): resolving canvg and html2canvas module not found (@fisjac)
- [#27403](https://github.com/apache/superset/pull/27403) fix: missing shared color in mixed timeseries (@justinpark)
- [#27402](https://github.com/apache/superset/pull/27402) fix: typescript errors in 4.0 (@justinpark)
- [#27390](https://github.com/apache/superset/pull/27390) fix: Re-enable CI checks on release branches (@michael-s-molina)
- [#27391](https://github.com/apache/superset/pull/27391) fix(sqllab): Close already removed tab (@justinpark)
- [#27364](https://github.com/apache/superset/pull/27364) fix(API): Updating assets via the API should preserve ownership configuration (@Vitor-Avila)
- [#27395](https://github.com/apache/superset/pull/27395) fix: improve explore REST api validations (@dpgaspar)
- [#27262](https://github.com/apache/superset/pull/27262) fix(Alerts & Reports): Fixing bug that resets cron value to default when empty (@fisjac)
- [#27366](https://github.com/apache/superset/pull/27366) fix: Results section in Explore shows an infinite spinner (@michael-s-molina)
- [#27187](https://github.com/apache/superset/pull/27187) fix: numexpr to fix CVE-2023-39631 (2.8.4 => 2.9.0) (@nigzak)
- [#27361](https://github.com/apache/superset/pull/27361) fix: Missing SQL Lab permission (@michael-s-molina)
- [#27360](https://github.com/apache/superset/pull/27360) fix: Heatmap numeric sorting (@michael-s-molina)
- [#27313](https://github.com/apache/superset/pull/27313) fix(sqllab): Missing empty query result state (@justinpark)
- [#27308](https://github.com/apache/superset/pull/27308) fix(dashboard): table chart drag preview overflowing container (@rtexelm)
- [#27295](https://github.com/apache/superset/pull/27295) fix(sqllab): invalid dump sql shown after closing tab (@justinpark)
- [#27285](https://github.com/apache/superset/pull/27285) fix(plugin-chart-echarts): calculate Gauge Chart intervals correctly when min value is set (@goto-loop)
- [#27307](https://github.com/apache/superset/pull/27307) fix: Incorrect data type on import page (@michael-s-molina)
- [#27291](https://github.com/apache/superset/pull/27291) fix: Data zoom with horizontal orientation (@michael-s-molina)
- [#27273](https://github.com/apache/superset/pull/27273) fix: Navigating to an invalid page index in lists (@michael-s-molina)
- [#27271](https://github.com/apache/superset/pull/27271) fix: Inoperable dashboard filter slider when range is <= 1 (@michael-s-molina)
- [#27154](https://github.com/apache/superset/pull/27154) fix(import-datasources): Use "admin" user as default for importing datasources (@ddxv)
- [#27258](https://github.com/apache/superset/pull/27258) fix: Sorting charts/dashboards makes the applied filters ineffective (@michael-s-molina)
- [#27213](https://github.com/apache/superset/pull/27213) fix(trino): bumping trino to fix hudi schema fetching (@rusackas)
- [#27236](https://github.com/apache/superset/pull/27236) fix(reports): fixing unit test (@fisjac)
- [#27217](https://github.com/apache/superset/pull/27217) fix(sqlglot): Address regressions introduced in #26476 (@john-bodley)
- [#27233](https://github.com/apache/superset/pull/27233) fix: bump FAB to 4.4.1 (perf issue) (@dpgaspar)
- [#27167](https://github.com/apache/superset/pull/27167) fix: setting important lower bounds versions on requirements (@dpgaspar)
- [#27215](https://github.com/apache/superset/pull/27215) fix: no limit in SELECT \* for TOP dbs (@betodealmeida)
- [#27214](https://github.com/apache/superset/pull/27214) fix(releasing): fixes npm script for release validation (@rusackas)
- [#26074](https://github.com/apache/superset/pull/26074) fix: Translations related to the date range filter (@Ralkion)
- [#26699](https://github.com/apache/superset/pull/26699) fix(dashboard): drag and drop indicator UX (@justinpark)
- [#27191](https://github.com/apache/superset/pull/27191) fix: Failed to execute importScripts on worker-css (@michael-s-molina)
- [#27181](https://github.com/apache/superset/pull/27181) fix(sqllab): typeahead search is broken in db selector (@justinpark)
- [#27164](https://github.com/apache/superset/pull/27164) fix: unlock and bump werkzeug (@dpgaspar)
- [#27161](https://github.com/apache/superset/pull/27161) fix(ci): mypy pre-commit issues (@dpgaspar)
- [#27138](https://github.com/apache/superset/pull/27138) fix(plugins): Apply dashboard filters to comparison query in BigNumber with Time Comparison chart (@Antonio-RiveroMartnez)
- [#27135](https://github.com/apache/superset/pull/27135) fix: Duplicated toast messages (@michael-s-molina)
- [#27132](https://github.com/apache/superset/pull/27132) fix: Plain error message when visiting a dashboard via permalink without permissions (@michael-s-molina)
- [#27130](https://github.com/apache/superset/pull/27130) fix: ID param for DELETE ssh_tunnel endpoint (@geido)
- [#22840](https://github.com/apache/superset/pull/22840) fix(pivot-table-v2): Added forgotten translation pivot table v2 (@Always-prog)
- [#27128](https://github.com/apache/superset/pull/27128) fix: RLS modal overflow (@michael-s-molina)
- [#27112](https://github.com/apache/superset/pull/27112) fix: gevent upgrade to 23.9.1 (@dpgaspar)
- [#27117](https://github.com/apache/superset/pull/27117) fix: removes old deprecated sqllab endpoints (@dpgaspar)
- [#27124](https://github.com/apache/superset/pull/27124) fix: bump grpcio, urllib3 and paramiko (@dpgaspar)
- [#27116](https://github.com/apache/superset/pull/27116) fix(docker): \*-dev tags target right stage from Dockerfile (@lodu)
- [#26791](https://github.com/apache/superset/pull/26791) fix(sqllab): flaky json explore modal due to over-rendering (@justinpark)
- [#27113](https://github.com/apache/superset/pull/27113) fix: upgrade cryptography to major 42 (@dpgaspar)
- [#27106](https://github.com/apache/superset/pull/27106) fix: Timeseries Y-axis format with contribution mode (@michael-s-molina)
- [#27098](https://github.com/apache/superset/pull/27098) fix: try to fix cypress with magic (@mistercrunch)
- [#27094](https://github.com/apache/superset/pull/27094) fix(helm): typo on ssl_cert_reqs variable (@pcop00)
- [#27091](https://github.com/apache/superset/pull/27091) fix(deps): un-bumping dom-to-pdf ro resolve missing file warnings (@rusackas)
- [#27087](https://github.com/apache/superset/pull/27087) fix(ci): Docker master builds fail while checking version (@mistercrunch)
- [#27085](https://github.com/apache/superset/pull/27085) fix(ci): new PR comments cancel ongoing ephemeral builds (@dpgaspar)
- [#26663](https://github.com/apache/superset/pull/26663) fix(helm): Include option to use Redis with SSL (@shakeelansari63)
- [#27060](https://github.com/apache/superset/pull/27060) fix(ephemeral): last try fixing this GH action (@mistercrunch)
- [#27058](https://github.com/apache/superset/pull/27058) fix(ephemeral): point to the full tag name (@mistercrunch)
- [#27057](https://github.com/apache/superset/pull/27057) fix(ephemeral): fix tagging command for ECR (@mistercrunch)
- [#27056](https://github.com/apache/superset/pull/27056) fix(ephemeral): fix ephemeral builds in PR (@mistercrunch)
- [#27048](https://github.com/apache/superset/pull/27048) fix(actions): correcting malformed labeler configs (@rusackas)
- [#19744](https://github.com/apache/superset/pull/19744) fix(webpack-dev-server): parse env args (@jdbranham)
- [#27042](https://github.com/apache/superset/pull/27042) fix(ci): fix action script v7 breaking changes v3 (@dpgaspar)
- [#27040](https://github.com/apache/superset/pull/27040) fix(ci): fix action script v7 breaking changes v2 (@dpgaspar)
- [#27014](https://github.com/apache/superset/pull/27014) fix(maps): france_regions.geojson generated with the notebook, from natural earth data (@qleroy)
- [#26966](https://github.com/apache/superset/pull/26966) fix(actions): make tech debt uploader not block CI and skip w/o creds (@rusackas)
- [#27001](https://github.com/apache/superset/pull/27001) fix(cypress): resolving random dri3 error on cypress runner (@rusackas)
- [#27013](https://github.com/apache/superset/pull/27013) fix(plugins): Fix dashboard filter in Period Over Period KPI plugin (@Antonio-RiveroMartnez)
- [#27005](https://github.com/apache/superset/pull/27005) fix(helm): Fix inconsistency for the chart appVersion and default image tag (@dnskr)
- [#26995](https://github.com/apache/superset/pull/26995) fix(maps): Move Overseas department and regions closer to France mainland (@qleroy)
- [#26987](https://github.com/apache/superset/pull/26987) fix(ci): typo in my bash script (@mistercrunch)
- [#26985](https://github.com/apache/superset/pull/26985) fix(plugin): Period Over Period KPI Plugin Feature flag value (@Antonio-RiveroMartnez)
- [#26969](https://github.com/apache/superset/pull/26969) fix(ci): support action/script v5 breaking change v2 (@dpgaspar)
- [#26968](https://github.com/apache/superset/pull/26968) fix(ci): support action/script v5 breaking change (@dpgaspar)
- [#26963](https://github.com/apache/superset/pull/26963) fix(plugin-chart-table): Revert "fix(chart table in dashboard): improve screen reading of table (#26453)" (@kgabryje)
- [#26949](https://github.com/apache/superset/pull/26949) fix(actions): specify branch on monorepo lockfile pusher (@rusackas)
- [#26921](https://github.com/apache/superset/pull/26921) fix(ci): remove deprecated set-output on github workflows (@dpgaspar)
- [#26920](https://github.com/apache/superset/pull/26920) fix(ci): lint issue on update-monorepo-lockfiles.yml (@dpgaspar)
- [#26919](https://github.com/apache/superset/pull/26919) fix(ci): ephemeral env build and up dependency (@dpgaspar)
- [#26852](https://github.com/apache/superset/pull/26852) fix(ci): ephemeral env build (@dpgaspar)
- [#26917](https://github.com/apache/superset/pull/26917) fix: remove ephemeral docker build from required workflow (@dpgaspar)
- [#26787](https://github.com/apache/superset/pull/26787) fix(docker): improve docker tags to be cleared and avoid conflicts (@mistercrunch)
- [#26904](https://github.com/apache/superset/pull/26904) fix(dependabot): lockfile updater won't fail when there's nothing to push (@rusackas)
- [#26888](https://github.com/apache/superset/pull/26888) fix(dependencies): adding auth for dependabot lockfile action (@rusackas)
- [#26901](https://github.com/apache/superset/pull/26901) fix(svg): reformatting svgs to allow license without breaking images (@rusackas)
- [#26453](https://github.com/apache/superset/pull/26453) fix(chart table in dashboard): improve screen reading of table (@ncar285)
- [#26801](https://github.com/apache/superset/pull/26801) fix: docker should always run, even in forks (@mistercrunch)
- [#26752](https://github.com/apache/superset/pull/26752) fix: add user to latest-release-tag workflow (@eschutho)
- [#26772](https://github.com/apache/superset/pull/26772) fix(docker): credentials issues around superset-cache in forks (@mistercrunch)
- [#25510](https://github.com/apache/superset/pull/25510) fix: change the validation logic for python_date_format (@mapledan)
- [#26710](https://github.com/apache/superset/pull/26710) fix(dependencies): stopping (and preventing) full lodash library import... now using only method level imports. (@rusackas)
- [#26473](https://github.com/apache/superset/pull/26473) fix: docker ephemeral environment, push only on testenv comment (@dpgaspar)
- [#26682](https://github.com/apache/superset/pull/26682) fix: Revert "build(deps): bump @mdx-js/react from 1.6.22 to 3.0.0 in /docs" (@rusackas)
- [#26679](https://github.com/apache/superset/pull/26679) fix: Revert "buld(deps): bump swagger-ui-react from 4.1.3 to 5.11.0 in docs (#26552) (@michael-s-molina)
- [#26648](https://github.com/apache/superset/pull/26648) fix: Removes unused cache cleanup (@michael-s-molina)
- [#26649](https://github.com/apache/superset/pull/26649) fix: remove possible unnecessary file 1 (@dpgaspar)
- [#26351](https://github.com/apache/superset/pull/26351) fix: stringify scarf pixel value (@eschutho)
- [#26205](https://github.com/apache/superset/pull/26205) fix(docker): Remove race condition when building image (@alekseyolg)

**Others**

- [#27441](https://github.com/apache/superset/pull/27441) chore: Adds the 4.0 release notes (@michael-s-molina)
- [#27768](https://github.com/apache/superset/pull/27768) chore(docs): Cleanup UPDATING.md (@john-bodley)
- [#27625](https://github.com/apache/superset/pull/27625) perf(explore): virtualized datasource field sections (@justinpark)
- [#27488](https://github.com/apache/superset/pull/27488) perf(sqllab): reduce bootstrap data delay by queries (@justinpark)
- [#27281](https://github.com/apache/superset/pull/27281) chore: bump cryptography minimum to 42.0.4 (@sadpandajoe)
- [#27232](https://github.com/apache/superset/pull/27232) chore: Removes Chromatic workflow and dependencies (@michael-s-molina)
- [#27169](https://github.com/apache/superset/pull/27169) chore: Updates CHANGELOG.md with 3.0.4 data (@michael-s-molina)
- [#27166](https://github.com/apache/superset/pull/27166) docs: add Dropit Shopping to users list (@IlyaDropit)
- [#27143](https://github.com/apache/superset/pull/27143) refactor: Migrate ErrorBoundary to typescript (@EnxDev)
- [#27136](https://github.com/apache/superset/pull/27136) chore(tests): Remove unnecessary explicit Flask-SQLAlchemy session expunges (@john-bodley)
- [#27134](https://github.com/apache/superset/pull/27134) docs: add Geotab to users list (@JZ6)
- [#26693](https://github.com/apache/superset/pull/26693) chore(hail mary): Update package-lock.json via npm-audit-fix (@rusackas)
- [#27129](https://github.com/apache/superset/pull/27129) chore: lower cryptography min version to 41.0.2 (@sadpandajoe)
- [#27120](https://github.com/apache/superset/pull/27120) docs(miscellaneous): Export Datasoruces: export datasources exports to ZIP (@ddxv)
- [#27078](https://github.com/apache/superset/pull/27078) chore(internet_port): added new ports and removed unnecessary string class (@anirudh-hegde)
- [#27118](https://github.com/apache/superset/pull/27118) chore: bump firebolt-sqlalchemy to support service account auth (@Vitor-Avila)
- [#27090](https://github.com/apache/superset/pull/27090) chore(plugins): Update dropdown control for BigNumber with Time Comparison range (@Antonio-RiveroMartnez)
- [#26909](https://github.com/apache/superset/pull/26909) refactor: Ensure Flask framework leverages the Flask-SQLAlchemy session (Phase II) (@john-bodley)
- [#27030](https://github.com/apache/superset/pull/27030) chore: Migrate AlteredSliceTag to typescript (@EnxDev)
- [#26773](https://github.com/apache/superset/pull/26773) chore(translations): updating pot -> po -> json files (babel 2.9.1) (@rusackas)
- [#27071](https://github.com/apache/superset/pull/27071) chore(docs): adding meta db to Feature Flags page (@rusackas)
- [#27072](https://github.com/apache/superset/pull/27072) docs(installation): document multi-platform support in Docker builds (@mistercrunch)
- [#27053](https://github.com/apache/superset/pull/27053) chore: prevent prophet from logging non-errors as errors (@betodealmeida)
- [#27038](https://github.com/apache/superset/pull/27038) chore(docs): bump version number in docs example (@sfirke)
- [#26973](https://github.com/apache/superset/pull/26973) build(deps-dev): bump @types/jest from 26.0.24 to 29.5.12 in /superset-frontend/plugins/plugin-chart-handlebars (@dependabot[bot])
- [#26260](https://github.com/apache/superset/pull/26260) chore(dashboard): migrate enzyme to RTL (@justinpark)
- [#26989](https://github.com/apache/superset/pull/26989) chore: Remove database ID dependency for SSH Tunnel creation (@geido)
- [#26981](https://github.com/apache/superset/pull/26981) build(deps): bump react-js-cron from 1.2.0 to 2.1.2 in /superset-frontend (@dependabot[bot])
- [#26893](https://github.com/apache/superset/pull/26893) build(deps-dev): bump copy-webpack-plugin from 9.1.0 to 12.0.2 in /superset-frontend (@dependabot[bot])
- [#26171](https://github.com/apache/superset/pull/26171) chore(sqllab): migrate to typescript (@justinpark)
- [#27021](https://github.com/apache/superset/pull/27021) chore(plugins): Description, Category and Tags for BigNumber with Period Time Comparison plugin (@Antonio-RiveroMartnez)
- [#27020](https://github.com/apache/superset/pull/27020) docs: add a note about database drivers in Docker builds (@mistercrunch)
- [#26979](https://github.com/apache/superset/pull/26979) build(deps): bump @types/seedrandom from 2.4.30 to 3.0.8 in /superset-frontend (@dependabot[bot])
- [#27000](https://github.com/apache/superset/pull/27000) chore(github): adding code owners for translation and country map wor… (@rusackas)
- [#26998](https://github.com/apache/superset/pull/26998) docs: add notes to RELEASING about how to deploy docker images (@mistercrunch)
- [#26996](https://github.com/apache/superset/pull/26996) build(deps): bump react-intersection-observer from 9.4.1 to 9.6.0 in /superset-frontend (@dependabot[bot])
- [#26986](https://github.com/apache/superset/pull/26986) docs(presto): add Presto SSL connection details (@rusackas)
- [#26526](https://github.com/apache/superset/pull/26526) build(deps): bump @vx/legend from 0.0.198 to 0.0.199 in /superset-frontend/plugins/legacy-plugin-chart-histogram (@dependabot[bot])
- [#26903](https://github.com/apache/superset/pull/26903) chore(dependencies): bump encodable to 0.7.8 (@rusackas)
- [#26977](https://github.com/apache/superset/pull/26977) build(deps-dev): bump webpack from 5.90.0 to 5.90.1 in /docs (@dependabot[bot])
- [#26974](https://github.com/apache/superset/pull/26974) build(deps-dev): bump @types/node from 20.11.14 to 20.11.16 in /superset-websocket (@dependabot[bot])
- [#26971](https://github.com/apache/superset/pull/26971) build(deps): bump actions/checkout from 2 to 4 (@dependabot[bot])
- [#26972](https://github.com/apache/superset/pull/26972) build(deps): bump actions/cache from 1 to 4 (@dependabot[bot])
- [#26970](https://github.com/apache/superset/pull/26970) build(deps): bump actions/setup-python from 4 to 5 (@dependabot[bot])
- [#26950](https://github.com/apache/superset/pull/26950) chore(actions): getting fancier with labels (@rusackas)
- [#26952](https://github.com/apache/superset/pull/26952) build(deps): bump actions/setup-java from 1 to 4 (@dependabot[bot])
- [#26958](https://github.com/apache/superset/pull/26958) build(deps-dev): bump mock-socket from 9.0.3 to 9.3.1 in /superset-frontend (@dependabot[bot])
- [#26953](https://github.com/apache/superset/pull/26953) build(deps): bump actions/github-script from 3 to 7 (@dependabot[bot])
- [#26927](https://github.com/apache/superset/pull/26927) build(deps): bump actions/setup-node from 2 to 4 (@dependabot[bot])
- [#26954](https://github.com/apache/superset/pull/26954) build(deps): bump aws-actions/configure-aws-credentials from 1 to 4 (@dependabot[bot])
- [#26955](https://github.com/apache/superset/pull/26955) build(deps): bump aws-actions/amazon-ecr-login from 1 to 2 (@dependabot[bot])
- [#26956](https://github.com/apache/superset/pull/26956) build(deps): bump github/codeql-action from 2 to 3 (@dependabot[bot])
- [#26938](https://github.com/apache/superset/pull/26938) build(deps): bump moment from 2.29.4 to 2.30.1 in /superset-frontend (@dependabot[bot])
- [#26943](https://github.com/apache/superset/pull/26943) chore(dependencies): Push lockfile for monorepo updates on rebuild/rebase (@rusackas)
|
||||
- [#26875](https://github.com/apache/superset/pull/26875) chore: make TS enums strictly PascalCase (@villebro)
|
||||
- [#26942](https://github.com/apache/superset/pull/26942) chore(ci): run pre-commit across the repo (@mistercrunch)
|
||||
- [#26935](https://github.com/apache/superset/pull/26935) build(deps): bump interweave from 13.0.0 to 13.1.0 in /superset-frontend (@dependabot[bot])
|
||||
- [#26941](https://github.com/apache/superset/pull/26941) build(deps): bump emotion-rgba from 0.0.9 to 0.0.12 in /superset-frontend (@dependabot[bot])
|
||||
- [#26939](https://github.com/apache/superset/pull/26939) build(deps-dev): bump @babel/core from 7.22.8 to 7.23.9 in /superset-frontend (@dependabot[bot])
|
||||
- [#26940](https://github.com/apache/superset/pull/26940) build(deps): bump shortid from 2.2.14 to 2.2.16 in /superset-frontend (@dependabot[bot])
|
||||
- [#26924](https://github.com/apache/superset/pull/26924) build(deps-dev): bump @types/node from 20.11.10 to 20.11.14 in /superset-websocket (@dependabot[bot])
|
||||
- [#26928](https://github.com/apache/superset/pull/26928) build(deps): bump chromaui/action from 1 to 10 (@dependabot[bot])
|
||||
- [#26929](https://github.com/apache/superset/pull/26929) build(deps): bump azure/setup-helm from 1 to 3 (@dependabot[bot])
|
||||
- [#26930](https://github.com/apache/superset/pull/26930) build(deps): bump actions/upload-artifact from 3 to 4 (@dependabot[bot])
|
||||
- [#26931](https://github.com/apache/superset/pull/26931) build(deps): bump actions/dependency-review-action from 2 to 4 (@dependabot[bot])
|
||||
- [#26918](https://github.com/apache/superset/pull/26918) chore(ci): notify PMCs of changes on required workflows (@dpgaspar)
|
||||
- [#26372](https://github.com/apache/superset/pull/26372) refactor: Removes the deprecated GENERIC_CHART_AXES feature flag (@michael-s-molina)
|
||||
- [#26881](https://github.com/apache/superset/pull/26881) build(deps-dev): update @babel/types requirement from ^7.13.12 to ^7.23.9 in /superset-frontend/plugins/plugin-chart-pivot-table (@dependabot[bot])
|
||||
- [#26727](https://github.com/apache/superset/pull/26727) build(deps): bump @ant-design/icons from 5.0.1 to 5.2.6 in /superset-frontend (@dependabot[bot])
|
||||
- [#26894](https://github.com/apache/superset/pull/26894) build(deps): bump @vx/scale from 0.0.197 to 0.0.199 in /superset-frontend (@dependabot[bot])
|
||||
- [#26840](https://github.com/apache/superset/pull/26840) build(deps): bump d3-selection from 1.4.2 to 3.0.0 in /superset-frontend (@dependabot[bot])
- [#26861](https://github.com/apache/superset/pull/26861) build(deps): bump @visx/axis from 3.5.0 to 3.8.0 in /superset-frontend (@dependabot[bot])
- [#26272](https://github.com/apache/superset/pull/26272) chore(explore): migrate enzyme to RTL (@justinpark)
- [#26899](https://github.com/apache/superset/pull/26899) build(deps): bump @types/rison from 0.0.6 to 0.0.9 in /superset-frontend (@dependabot[bot])
- [#26831](https://github.com/apache/superset/pull/26831) build(deps): bump @types/rison from 0.0.6 to 0.0.9 in /superset-frontend/packages/superset-ui-core (@dependabot[bot])
- [#26869](https://github.com/apache/superset/pull/26869) build(deps): bump dom-to-image-more from 2.16.0 to 3.2.0 in /superset-frontend (@dependabot[bot])
- [#26902](https://github.com/apache/superset/pull/26902) chore(docs): remove misplaced k8s installation instructions (@sfirke)
- [#26897](https://github.com/apache/superset/pull/26897) build(deps-dev): bump webpack-bundle-analyzer from 4.9.0 to 4.10.1 in /superset-frontend (@dependabot[bot])
- [#26900](https://github.com/apache/superset/pull/26900) chore(ci): make action/labeler work on fork PRs (@mistercrunch)
- [#26879](https://github.com/apache/superset/pull/26879) chore(dependabot): ignore css-minimizer-webpack-plugin (@mistercrunch)
- [#26860](https://github.com/apache/superset/pull/26860) build(deps): bump rehype-sanitize from 5.0.1 to 6.0.0 in /superset-frontend (@dependabot[bot])
- [#26872](https://github.com/apache/superset/pull/26872) chore(dependabot): auto-update lockfiles for monorepo package bumps (@rusackas)
- [#26859](https://github.com/apache/superset/pull/26859) build(deps): bump @types/enzyme from 3.10.10 to 3.10.18 in /superset-frontend (@dependabot[bot])
- [#26874](https://github.com/apache/superset/pull/26874) chore(license): adding a missing license blurb to a translation file (@rusackas)
- [#26870](https://github.com/apache/superset/pull/26870) build(deps): bump yargs and @types/yargs in /superset-frontend (@dependabot[bot])
- [#26841](https://github.com/apache/superset/pull/26841) chore(dependencies): bump less from 3.12.2 to 4.2.0 in /superset-frontend (@dependabot[bot])
- [#26868](https://github.com/apache/superset/pull/26868) chore(actions): run docs actions on Node 16 to conform with the project (@rusackas)
- [#26857](https://github.com/apache/superset/pull/26857) chore(actions): generate FOSSA report on master, and ALWAYS check for… (@rusackas)
- [#26826](https://github.com/apache/superset/pull/26826) build(deps-dev): bump @types/uuid from 9.0.7 to 9.0.8 in /superset-websocket (@dependabot[bot])
- [#26867](https://github.com/apache/superset/pull/26867) build(deps): bump @testing-library/react-hooks from 5.0.3 to 5.1.3 in /superset-frontend (@dependabot[bot])
- [#26866](https://github.com/apache/superset/pull/26866) build(deps): bump mousetrap and @types/mousetrap in /superset-frontend (@dependabot[bot])
- [#26865](https://github.com/apache/superset/pull/26865) build(deps): bump react-redux from 7.2.8 to 7.2.9 in /superset-frontend (@dependabot[bot])
- [#26855](https://github.com/apache/superset/pull/26855) chore(dependabot): lowering bump cadence from weekly to monthly (@rusackas)
- [#26854](https://github.com/apache/superset/pull/26854) chore(CI): get docs building on ALL branches. (@rusackas)
- [#26825](https://github.com/apache/superset/pull/26825) build(deps-dev): bump @types/node from 20.11.5 to 20.11.10 in /superset-websocket (@dependabot[bot])
- [#26820](https://github.com/apache/superset/pull/26820) chore(lint/a11y): fixing and locking down jsx-a11y/anchor-is-valid (@rusackas)
- [#26819](https://github.com/apache/superset/pull/26819) chore(dependencies): bumps match-sorter (@rusackas)
- [#26798](https://github.com/apache/superset/pull/26798) chore: Add permission to view and drill on Dashboard context (@geido)
- [#26827](https://github.com/apache/superset/pull/26827) build(deps): bump use-immer from 0.8.1 to 0.9.0 in /superset-frontend (@dependabot[bot])
- [#24272](https://github.com/apache/superset/pull/24272) chore(deps): bump typescript to 4.8.4 (@jansule)
- [#26832](https://github.com/apache/superset/pull/26832) build(deps): bump @types/react-table from 7.0.29 to 7.7.19 in /superset-frontend (@dependabot[bot])
- [#26834](https://github.com/apache/superset/pull/26834) build(deps-dev): bump @docusaurus/module-type-aliases from 3.1.0 to 3.1.1 in /docs (@dependabot[bot])
- [#26839](https://github.com/apache/superset/pull/26839) build(deps-dev): bump webpack from 5.89.0 to 5.90.0 in /docs (@dependabot[bot])
- [#23873](https://github.com/apache/superset/pull/23873) chore: Slovenian translation update (@dkrat7)
- [#26702](https://github.com/apache/superset/pull/26702) chore: fix GitHub 'Unchanged files with check annotations' reports in PR (@mistercrunch)
- [#26726](https://github.com/apache/superset/pull/26726) build(deps): bump prism-react-renderer from 1.2.1 to 2.3.1 in /docs (@dependabot[bot])
- [#26813](https://github.com/apache/superset/pull/26813) chore(ci): change code owners for .github (@dpgaspar)
- [#26794](https://github.com/apache/superset/pull/26794) chore(dependencies): bumping jinja2 (@rusackas)
- [#26816](https://github.com/apache/superset/pull/26816) chore: add google-auth for new example dashboard (@betodealmeida)
- [#26815](https://github.com/apache/superset/pull/26815) chore: Reformat changelogs (@geido)
- [#26793](https://github.com/apache/superset/pull/26793) chore(dependencies): bumping fonttools (@rusackas)
- [#26442](https://github.com/apache/superset/pull/26442) chore: Technical Debt Metrics (@rusackas)
- [#26800](https://github.com/apache/superset/pull/26800) chore: Splits the CHANGELOG into multiple files (@michael-s-molina)
- [#26621](https://github.com/apache/superset/pull/26621) build(deps): update jquery requirement from ^3.4.1 to ^3.7.1 in /superset-frontend/packages/superset-ui-demo (@dependabot[bot])
- [#26789](https://github.com/apache/superset/pull/26789) chore(RESOURCES): fix markdown for table formatting (@qleroy)
- [#26759](https://github.com/apache/superset/pull/26759) chore: Add Embed Modal extension override and tests (@geido)
- [#26656](https://github.com/apache/superset/pull/26656) build(deps-dev): bump css-minimizer-webpack-plugin from 3.4.1 to 6.0.0 in /superset-frontend (@dependabot[bot])
- [#26704](https://github.com/apache/superset/pull/26704) chore: improve/decouple eslint and tsc 'npm run' commands (@mistercrunch)
- [#26728](https://github.com/apache/superset/pull/26728) build(deps): bump @visx/grid from 3.0.1 to 3.5.0 in /superset-frontend (@dependabot[bot])
- [#26729](https://github.com/apache/superset/pull/26729) build(deps): update classnames requirement from ^2.3.2 to ^2.5.1 in /superset-frontend/plugins/plugin-chart-table (@dependabot[bot])
- [#26766](https://github.com/apache/superset/pull/26766) chore: prevent CI double runs on push + pull_request (@mistercrunch)
- [#26528](https://github.com/apache/superset/pull/26528) build(deps-dev): bump jest from 26.6.3 to 29.7.0 in /superset-frontend/plugins/plugin-chart-handlebars (@dependabot[bot])
- [#26513](https://github.com/apache/superset/pull/26513) build(deps): bump d3-color from 1.4.1 to 3.1.0 in /superset-frontend/plugins/legacy-plugin-chart-world-map (@dependabot[bot])
- [#26596](https://github.com/apache/superset/pull/26596) build(deps): update @types/math-expression-evaluator requirement from ^1.2.1 to ^1.3.3 in /superset-frontend/packages/superset-ui-core (@dependabot[bot])
- [#26595](https://github.com/apache/superset/pull/26595) build(deps-dev): update @types/lodash requirement from ^4.14.149 to ^4.14.202 in /superset-frontend/plugins/plugin-chart-handlebars (@dependabot[bot])
- [#26698](https://github.com/apache/superset/pull/26698) build: Parallelize the CI image builds (continued) (@mistercrunch)
- [#26499](https://github.com/apache/superset/pull/26499) build(deps): update d3-cloud requirement from ^1.2.5 to ^1.2.7 in /superset-frontend/plugins/plugin-chart-word-cloud (@dependabot[bot])
- [#26481](https://github.com/apache/superset/pull/26481) build(deps-dev): bump @types/jest from 26.0.24 to 29.5.11 in /superset-frontend/plugins/plugin-chart-pivot-table (@dependabot[bot])
- [#26546](https://github.com/apache/superset/pull/26546) build(deps-dev): bump @docusaurus/module-type-aliases from 2.4.1 to 3.1.0 in /docs (@dependabot[bot])
- [#26105](https://github.com/apache/superset/pull/26105) docs(storybook): fix typo in TimeFormatStories.tsx (@HurSungYun)
- [#26594](https://github.com/apache/superset/pull/26594) build(deps): update whatwg-fetch requirement from ^3.0.0 to ^3.6.20 in /superset-frontend/packages/superset-ui-core (@dependabot[bot])
- [#26753](https://github.com/apache/superset/pull/26753) chore: do not mark helm releases as github latest (@eschutho)
- [#26718](https://github.com/apache/superset/pull/26718) build(deps): bump @svgr/webpack from 5.5.0 to 8.1.0 in /docs (@dependabot[bot])
- [#26714](https://github.com/apache/superset/pull/26714) build(deps): bump @visx/axis from 3.0.1 to 3.5.0 in /superset-frontend (@dependabot[bot])
- [#26760](https://github.com/apache/superset/pull/26760) docs: update fixed CVEs for version 3.0.3 (@dpgaspar)
- [#26483](https://github.com/apache/superset/pull/26483) build(deps): update @types/d3-cloud requirement from ^1.2.1 to ^1.2.9 in /superset-frontend/plugins/plugin-chart-word-cloud (@dependabot[bot])
- [#26616](https://github.com/apache/superset/pull/26616) build(deps): bump fuse.js from 6.4.6 to 7.0.0 in /superset-frontend (@dependabot[bot])
- [#26717](https://github.com/apache/superset/pull/26717) build(deps-dev): bump webpack from 5.76.0 to 5.89.0 in /docs (@dependabot[bot])
- [#26570](https://github.com/apache/superset/pull/26570) build(deps-dev): bump prettier-plugin-packagejson from 2.2.15 to 2.4.9 in /superset-frontend (@dependabot[bot])
- [#26556](https://github.com/apache/superset/pull/26556) build(deps-dev): bump @babel/register from 7.22.5 to 7.23.7 in /superset-frontend (@dependabot[bot])
- [#26522](https://github.com/apache/superset/pull/26522) build(deps): update react-table requirement from ^7.6.3 to ^7.8.0 in /superset-frontend/plugins/plugin-chart-table (@dependabot[bot])
- [#26613](https://github.com/apache/superset/pull/26613) build(deps): bump react-github-btn from 1.2.1 to 1.4.0 in /docs (@dependabot[bot])
- [#26572](https://github.com/apache/superset/pull/26572) build(deps-dev): bump eslint-plugin-react-hooks from 4.2.0 to 4.6.0 in /superset-frontend (@dependabot[bot])
- [#26576](https://github.com/apache/superset/pull/26576) build(deps): bump @emotion/babel-preset-css-prop from 11.2.0 to 11.11.0 in /superset-frontend (@dependabot[bot])
- [#26724](https://github.com/apache/superset/pull/26724) build(deps): bump @saucelabs/theme-github-codeblock from 0.1.1 to 0.2.3 in /docs (@dependabot[bot])
- [#26720](https://github.com/apache/superset/pull/26720) build(deps): bump @docsearch/react from 3.3.3 to 3.5.2 in /docs (@dependabot[bot])
- [#26708](https://github.com/apache/superset/pull/26708) chore(dependencies): loosen constraints on dependency checker (@rusackas)
- [#25665](https://github.com/apache/superset/pull/25665) build(deps): bump @babel/traverse from 7.22.8 to 7.23.2 in /superset-frontend (@dependabot[bot])
- [#26694](https://github.com/apache/superset/pull/26694) chore(dependencies): removes unused d3-color and d3-array (@rusackas)
- [#26692](https://github.com/apache/superset/pull/26692) chore(dependencies): removes unused minimist (@rusackas)
- [#26690](https://github.com/apache/superset/pull/26690) chore(dependencies): remove unused global-box (@rusackas)
- [#26689](https://github.com/apache/superset/pull/26689) chore(dependencies): remove unused lodash-es (@rusackas)
- [#26688](https://github.com/apache/superset/pull/26688) chore(dependencies): remove unused react-datetime (@rusackas)
- [#26687](https://github.com/apache/superset/pull/26687) chore(dependencies): remove unused ansi-regex (@rusackas)
- [#26686](https://github.com/apache/superset/pull/26686) chore(dependencies): removes unused @visx/tooltip (@rusackas)
- [#26685](https://github.com/apache/superset/pull/26685) chore(dependencies): remove unused @babel/runtime-corejs3 (@rusackas)
- [#26684](https://github.com/apache/superset/pull/26684) chore(dependencies): removes unused bootstrap-slider (@rusackas)
- [#26691](https://github.com/apache/superset/pull/26691) chore(dependencies): npm audit fix for superset-ui-demo (@rusackas)
- [#26703](https://github.com/apache/superset/pull/26703) chore: silence SECRET_KEY warning when running tests (@mistercrunch)
- [#26733](https://github.com/apache/superset/pull/26733) build(deps-dev): bump @types/node from 20.11.1 to 20.11.5 in /superset-websocket (@dependabot[bot])
- [#26329](https://github.com/apache/superset/pull/26329) refactor: Removes the deprecated DASHBOARD_NATIVE_FILTERS feature flag (@michael-s-molina)
- [#26347](https://github.com/apache/superset/pull/26347) refactor: Removes the deprecated VERSIONED_EXPORT feature flag (@michael-s-molina)
- [#26677](https://github.com/apache/superset/pull/26677) chore: Updates the Release Process link in the issue template (@michael-s-molina)
- [#26375](https://github.com/apache/superset/pull/26375) chore: Updates the bug report template (@michael-s-molina)
- [#26462](https://github.com/apache/superset/pull/26462) refactor: Removes the Profile feature (@michael-s-molina)
- [#26665](https://github.com/apache/superset/pull/26665) build(deps): bump the npm_and_yarn group group in /superset-frontend with 2 updates (@dependabot[bot])
- [#26661](https://github.com/apache/superset/pull/26661) chore: Updates CHANGELOG.md and UPDATING.md with 3.1.0 data (@michael-s-molina)
- [#26330](https://github.com/apache/superset/pull/26330) refactor: Removes the deprecated DASHBOARD_FILTERS_EXPERIMENTAL feature flag (@michael-s-molina)
- [#26547](https://github.com/apache/superset/pull/26547) build(deps): bump @mdx-js/react from 1.6.22 to 3.0.0 in /docs (@dependabot[bot])
- [#26552](https://github.com/apache/superset/pull/26552) build(deps): bump swagger-ui-react from 4.1.3 to 5.11.0 in /docs (@dependabot[bot])
- [#26555](https://github.com/apache/superset/pull/26555) build(deps-dev): bump @tsconfig/docusaurus from 1.0.7 to 2.0.2 in /docs (@dependabot[bot])
- [#26344](https://github.com/apache/superset/pull/26344) refactor: Removes the deprecated ENABLE_EXPLORE_JSON_CSRF_PROTECTION feature flag (@michael-s-molina)
- [#26345](https://github.com/apache/superset/pull/26345) refactor: Removes the deprecated ENABLE_TEMPLATE_REMOVE_FILTERS feature flag (@michael-s-molina)
- [#25800](https://github.com/apache/superset/pull/25800) docs: update embedded readme with user params context (@jbat)
- [#12175](https://github.com/apache/superset/pull/12175) build(deps): bump node-notifier from 8.0.0 to 8.0.1 in /superset-frontend (@dependabot[bot])
- [#26549](https://github.com/apache/superset/pull/26549) build(deps): bump clsx from 1.1.1 to 2.1.0 in /docs (@dependabot[bot])
- [#26560](https://github.com/apache/superset/pull/26560) build(deps-dev): bump typescript from 4.4.4 to 5.3.3 in /docs (@dependabot[bot])
- [#26650](https://github.com/apache/superset/pull/26650) chore: Updates CHANGELOG.md and UPDATING.md with 3.0.3 data (@michael-s-molina)
- [#26346](https://github.com/apache/superset/pull/26346) refactor: Removes the deprecated REMOVE_SLICE_LEVEL_LABEL_COLORS feature flag (@michael-s-molina)
- [#26200](https://github.com/apache/superset/pull/26200) refactor: Ensure Flask framework leverages the Flask-SQLAlchemy session (Phase I) (@john-bodley)
- [#26633](https://github.com/apache/superset/pull/26633) chore: Deprecates the DASHBOARD_CROSS_FILTERS feature flag (@michael-s-molina)
- [#26635](https://github.com/apache/superset/pull/26635) chore: Deprecates the ENABLE_JAVASCRIPT_CONTROLS feature flag (@michael-s-molina)
- [#26636](https://github.com/apache/superset/pull/26636) chore: Sets DASHBOARD_VIRTUALIZATION feature flag to True by default (@michael-s-molina)
- [#26540](https://github.com/apache/superset/pull/26540) chore(API): Include changed_by.id in Get Charts and Get Datasets API responses (@Vitor-Avila)
- [#26637](https://github.com/apache/superset/pull/26637) chore: Sets the DRILL_BY feature flag to True by default (@michael-s-molina)
- [#26186](https://github.com/apache/superset/pull/26186) refactor: Ensure Celery leverages the Flask-SQLAlchemy session (@john-bodley)
- [#26500](https://github.com/apache/superset/pull/26500) build(deps): update datamaps requirement from ^0.5.8 to ^0.5.9 in /superset-frontend/plugins/legacy-plugin-chart-world-map (@dependabot[bot])
- [#25663](https://github.com/apache/superset/pull/25663) build(deps-dev): bump @babel/traverse from 7.16.10 to 7.23.2 in /superset-embedded-sdk (@dependabot[bot])
- [#25664](https://github.com/apache/superset/pull/25664) build(deps): bump @babel/traverse from 7.21.4 to 7.23.2 in /superset-frontend/cypress-base (@dependabot[bot])
- [#25662](https://github.com/apache/superset/pull/25662) build(deps): bump @babel/traverse from 7.16.3 to 7.23.2 in /docs (@dependabot[bot])
- [#26606](https://github.com/apache/superset/pull/26606) docs: fix links (@fenilgmehta)
- [#26348](https://github.com/apache/superset/pull/26348) refactor: Removes the deprecated CLIENT_CACHE feature flag (@michael-s-molina)
- [#26349](https://github.com/apache/superset/pull/26349) refactor: Removes the deprecated DASHBOARD_CACHE feature flag (@michael-s-molina)
- [#26450](https://github.com/apache/superset/pull/26450) chore: Deprecates the KV_STORE feature flag (@michael-s-molina)
- [#26343](https://github.com/apache/superset/pull/26343) refactor: Removes the deprecated ENABLE_EXPLORE_DRAG_AND_DROP feature flag (@michael-s-molina)
- [#26331](https://github.com/apache/superset/pull/26331) refactor: Removes the deprecated DISABLE_DATASET_SOURCE_EDIT feature flag (@michael-s-molina)
- [#26589](https://github.com/apache/superset/pull/26589) build(deps): update lodash requirement from ^4.17.11 to ^4.17.21 in /superset-frontend/plugins/legacy-preset-chart-nvd3 (@dependabot[bot])
- [#26506](https://github.com/apache/superset/pull/26506) build(deps): update urijs requirement from ^1.19.8 to ^1.19.11 in /superset-frontend/plugins/legacy-preset-chart-nvd3 (@dependabot[bot])
- [#26520](https://github.com/apache/superset/pull/26520) build(deps-dev): bump style-loader from 3.3.3 to 3.3.4 in /superset-frontend (@dependabot[bot])
- [#26538](https://github.com/apache/superset/pull/26538) build(deps-dev): bump @types/urijs from 1.19.19 to 1.19.25 in /superset-frontend (@dependabot[bot])
- [#26530](https://github.com/apache/superset/pull/26530) build(deps): update lodash requirement from ^4.17.15 to ^4.17.21 in /superset-frontend/packages/superset-ui-chart-controls (@dependabot[bot])
- [#26539](https://github.com/apache/superset/pull/26539) build(deps): update xss requirement from ^1.0.10 to ^1.0.14 in /superset-frontend/plugins/plugin-chart-table (@dependabot[bot])
- [#26545](https://github.com/apache/superset/pull/26545) build(deps): bump moment-timezone from 0.5.37 to 0.5.44 in /superset-frontend (@dependabot[bot])
- [#26562](https://github.com/apache/superset/pull/26562) build(deps): bump less from 4.1.3 to 4.2.0 in /docs (@dependabot[bot])
- [#26612](https://github.com/apache/superset/pull/26612) build(deps): bump @docusaurus/preset-classic from 2.4.1 to 2.4.3 in /docs (@dependabot[bot])
- [#26619](https://github.com/apache/superset/pull/26619) build(deps-dev): bump @types/node from 20.11.0 to 20.11.1 in /superset-websocket (@dependabot[bot])
- [#26503](https://github.com/apache/superset/pull/26503) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-sunburst (@dependabot[bot])
- [#26509](https://github.com/apache/superset/pull/26509) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-rose (@dependabot[bot])
- [#26515](https://github.com/apache/superset/pull/26515) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-country-map (@dependabot[bot])
- [#26524](https://github.com/apache/superset/pull/26524) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-partition (@dependabot[bot])
- [#26525](https://github.com/apache/superset/pull/26525) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-chord (@dependabot[bot])
- [#26535](https://github.com/apache/superset/pull/26535) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-histogram (@dependabot[bot])
- [#26536](https://github.com/apache/superset/pull/26536) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-calendar (@dependabot[bot])
- [#26541](https://github.com/apache/superset/pull/26541) build(deps): update lodash requirement from ^4.17.15 to ^4.17.21 in /superset-frontend/plugins/plugin-chart-echarts (@dependabot[bot])
- [#26569](https://github.com/apache/superset/pull/26569) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-map-box (@dependabot[bot])
- [#26574](https://github.com/apache/superset/pull/26574) build(deps): update prop-types requirement from ^15.7.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-parallel-coordinates (@dependabot[bot])
- [#26580](https://github.com/apache/superset/pull/26580) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-horizon (@dependabot[bot])
- [#26587](https://github.com/apache/superset/pull/26587) build(deps): update prop-types requirement from ^15.7.2 to ^15.8.1 in /superset-frontend/packages/superset-ui-chart-controls (@dependabot[bot])
- [#26477](https://github.com/apache/superset/pull/26477) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-sankey (@dependabot[bot])
- [#26480](https://github.com/apache/superset/pull/26480) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-preset-chart-nvd3 (@dependabot[bot])
- [#26484](https://github.com/apache/superset/pull/26484) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-world-map (@dependabot[bot])
- [#26486](https://github.com/apache/superset/pull/26486) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-event-flow (@dependabot[bot])
- [#26488](https://github.com/apache/superset/pull/26488) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-heatmap (@dependabot[bot])
- [#26492](https://github.com/apache/superset/pull/26492) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-sankey-loop (@dependabot[bot])
- [#26558](https://github.com/apache/superset/pull/26558) build(deps): bump @docusaurus/plugin-client-redirects from 2.4.1 to 2.4.3 in /docs (@dependabot[bot])
- [#26554](https://github.com/apache/superset/pull/26554) build(deps): bump @algolia/client-search from 4.13.0 to 4.22.1 in /docs (@dependabot[bot])
- [#26559](https://github.com/apache/superset/pull/26559) build(deps): bump react-draggable from 4.4.3 to 4.4.6 in /superset-frontend (@dependabot[bot])
- [#26568](https://github.com/apache/superset/pull/26568) build(deps): bump react-resizable from 3.0.4 to 3.0.5 in /superset-frontend (@dependabot[bot])
- [#26591](https://github.com/apache/superset/pull/26591) build(deps): update lodash requirement from ^4.17.11 to ^4.17.21 in /superset-frontend/packages/generator-superset (@dependabot[bot])
- [#26592](https://github.com/apache/superset/pull/26592) build(deps): update prop-types requirement from ^15.6.2 to ^15.8.1 in /superset-frontend/plugins/legacy-plugin-chart-paired-t-test (@dependabot[bot])
- [#26600](https://github.com/apache/superset/pull/26600) build(deps-dev): update yeoman-assert requirement from ^3.1.0 to ^3.1.1 in /superset-frontend/packages/generator-superset (@dependabot[bot])
- [#26601](https://github.com/apache/superset/pull/26601) build(deps): update fast-safe-stringify requirement from ^2.0.6 to ^2.1.1 in /superset-frontend/plugins/legacy-preset-chart-nvd3 (@dependabot[bot])
- [#26444](https://github.com/apache/superset/pull/26444) chore(deps): adding dependabot for plugins/packages and upping PR limits. (@rusackas)
- [#26468](https://github.com/apache/superset/pull/26468) docs: Update installing-superset-from-scratch.mdx (@nytai)
- [#26455](https://github.com/apache/superset/pull/26455) build(deps-dev): bump @types/node from 20.10.8 to 20.11.0 in /superset-websocket (@dependabot[bot])
- [#26447](https://github.com/apache/superset/pull/26447) build(deps-dev): bump @types/node from 20.10.7 to 20.10.8 in /superset-websocket (@dependabot[bot])
- [#26441](https://github.com/apache/superset/pull/26441) build(deps): bump follow-redirects from 1.15.2 to 1.15.4 in /superset-frontend (@dependabot[bot])
- [#26440](https://github.com/apache/superset/pull/26440) build(deps-dev): bump follow-redirects from 1.15.3 to 1.15.4 in /superset-embedded-sdk (@dependabot[bot])
- [#26438](https://github.com/apache/superset/pull/26438) build(deps): bump follow-redirects from 1.14.8 to 1.15.4 in /docs (@dependabot[bot])
- [#26428](https://github.com/apache/superset/pull/26428) chore(docs): remove incorrect answer from FAQ (@sfirke)
- [#26425](https://github.com/apache/superset/pull/26425) build(deps-dev): bump @types/node from 20.10.6 to 20.10.7 in /superset-websocket (@dependabot[bot])
- [#24605](https://github.com/apache/superset/pull/24605) chore: Reenable SQLite tests which leverage foreign key constraints et al. (@john-bodley)
- [#26386](https://github.com/apache/superset/pull/26386) build(deps-dev): bump @types/node from 20.10.5 to 20.10.6 in /superset-websocket (@dependabot[bot])
- [#26381](https://github.com/apache/superset/pull/26381) docs: fix spelling and grammar (@fenilgmehta)
- [#26363](https://github.com/apache/superset/pull/26363) build(deps): bump ws from 8.15.0 to 8.16.0 in /superset-websocket (@dependabot[bot])
- [#26371](https://github.com/apache/superset/pull/26371) docs: fix config webdriver snippet in install on K8s (@dbaltor)
- [#26368](https://github.com/apache/superset/pull/26368) chore(docs): point to correct StackOverflow page (@sfirke)
- [#26308](https://github.com/apache/superset/pull/26308) docs: update CVEs fixed on 3.0.2 and 2.1.3 (@dpgaspar)
- [#26305](https://github.com/apache/superset/pull/26305) build(deps-dev): bump @types/node from 20.10.4 to 20.10.5 in /superset-websocket (@dependabot[bot])
- [#26301](https://github.com/apache/superset/pull/26301) chore(sqlalchemy): import from correct path (@villebro)
- [#26294](https://github.com/apache/superset/pull/26294) build(deps-dev): bump eslint from 8.55.0 to 8.56.0 in /superset-websocket (@dependabot[bot])
- [#26293](https://github.com/apache/superset/pull/26293) chore(cleanup): removing redundant rendering logic in telemetry pixel (@rusackas)
- [#26285](https://github.com/apache/superset/pull/26285) chore(docs): fix typo "loader balancer" -> "load balancer" (@sfirke)
- [#26280](https://github.com/apache/superset/pull/26280) chore(in the wild): Making it even easier to add a name (and cleanup) (@rusackas)
- [#26253](https://github.com/apache/superset/pull/26253) chore(docs): add troubleshooting guide to alerts & reports (@sfirke)
- [#26259](https://github.com/apache/superset/pull/26259) chore(async queries): sending statsd event for async events API call (@zephyring)
- [#25628](https://github.com/apache/superset/pull/25628) chore: adding 'no-experimental-fetch' node option by default (@rusackas)
- [#26220](https://github.com/apache/superset/pull/26220) chore(tests): Add tests to the column denormalization flow (@Vitor-Avila)
- [#26078](https://github.com/apache/superset/pull/26078) chore: add class component tasklist file (@eschutho)
- [#26234](https://github.com/apache/superset/pull/26234) build(deps): bump ws from 8.14.2 to 8.15.0 in /superset-websocket (@dependabot[bot])
- [#26233](https://github.com/apache/superset/pull/26233) build(deps-dev): bump ts-node from 10.9.1 to 10.9.2 in /superset-websocket (@dependabot[bot])
- [#26204](https://github.com/apache/superset/pull/26204) build(deps-dev): bump @types/node from 20.10.3 to 20.10.4 in /superset-websocket (@dependabot[bot])
- [#26174](https://github.com/apache/superset/pull/26174) build(deps-dev): bump eslint from 8.54.0 to 8.55.0 in /superset-websocket (@dependabot[bot])
- [#26150](https://github.com/apache/superset/pull/26150) docs: update CHANGELOG for 2.1.2 (@dpgaspar)
- [#26166](https://github.com/apache/superset/pull/26166) build(deps-dev): bump eslint-config-prettier from 9.0.0 to 9.1.0 in /superset-websocket (@dependabot[bot])
- [#26167](https://github.com/apache/superset/pull/26167) build(deps-dev): bump @types/node from 20.10.1 to 20.10.3 in /superset-websocket (@dependabot[bot])
- [#26129](https://github.com/apache/superset/pull/26129) docs: add quickstart (@artofcomputing)
- [#26143](https://github.com/apache/superset/pull/26143) build(deps-dev): bump @types/node from 20.10.0 to 20.10.1 in /superset-websocket (@dependabot[bot])
- [#26149](https://github.com/apache/superset/pull/26149) docs: update CVEs fixed on 3.0.0 (@dpgaspar)
- [#26038](https://github.com/apache/superset/pull/26038) docs(drivers): refresh guide on adding a db driver in docker (@sfirke)
- [#26124](https://github.com/apache/superset/pull/26124) docs: add Increff to INTHEWILD (@ishansinghania)
- [#26112](https://github.com/apache/superset/pull/26112) docs: add Onebeat to users list (@GuyAttia)
- [#26119](https://github.com/apache/superset/pull/26119) docs: Update Trino Kerberos configuration (@john-bodley)
- [#26100](https://github.com/apache/superset/pull/26100) build(deps-dev): bump @types/node from 20.9.4 to 20.10.0 in /superset-websocket (@dependabot[bot])
- [#26099](https://github.com/apache/superset/pull/26099) build(deps-dev): bump @types/cookie from 0.5.4 to 0.6.0 in /superset-websocket (@dependabot[bot])
- [#26104](https://github.com/apache/superset/pull/26104) docs: update CVEs fixed on 2.1.2 (@dpgaspar)
|
||||
- [#26288](https://github.com/apache/superset/pull/26288) chore: Ensure Mixins are ordered according to the MRO (@john-bodley)
|

CONTRIBUTING.md (1,575 lines changed): file diff suppressed because it is too large.

Dockerfile (17 lines changed):
```diff
@@ -18,11 +18,11 @@
 ######################################################################
 # Node stage to deal with static asset construction
 ######################################################################
-ARG PY_VER=3.9-slim-bookworm
+ARG PY_VER=3.10-slim-bookworm

 # if BUILDPLATFORM is null, set it to 'amd64' (or leave as is otherwise).
 ARG BUILDPLATFORM=${BUILDPLATFORM:-amd64}
-FROM --platform=${BUILDPLATFORM} node:16-bookworm-slim AS superset-node
+FROM --platform=${BUILDPLATFORM} node:18-bullseye-slim AS superset-node

 ARG NPM_BUILD_CMD="build"

@@ -76,7 +76,7 @@ RUN mkdir -p ${PYTHONPATH} superset/static requirements superset-frontend apache
 && chown -R superset:superset ./* \
 && rm -rf /var/lib/apt/lists/*

-COPY --chown=superset:superset setup.py MANIFEST.in README.md ./
+COPY --chown=superset:superset pyproject.toml setup.py MANIFEST.in README.md ./
 # setup.py uses the version information in package.json
 COPY --chown=superset:superset superset-frontend/package.json superset-frontend/
 COPY --chown=superset:superset requirements/base.txt requirements/

@@ -119,8 +119,15 @@ RUN apt-get update -qq \
 libasound2 \
 libxtst6 \
 wget \
-# Install GeckoDriver WebDriver
-&& wget -q https://github.com/mozilla/geckodriver/releases/download/${GECKODRIVER_VERSION}/geckodriver-${GECKODRIVER_VERSION}-linux64.tar.gz -O - | tar xfz - -C /usr/local/bin \
+git \
+pkg-config
+
+RUN pip install playwright
+RUN playwright install-deps
+RUN playwright install chromium
+
+# Install GeckoDriver WebDriver
+RUN wget -q https://github.com/mozilla/geckodriver/releases/download/${GECKODRIVER_VERSION}/geckodriver-${GECKODRIVER_VERSION}-linux64.tar.gz -O - | tar xfz - -C /usr/local/bin \
 # Install Firefox
 && wget -q https://download-installer.cdn.mozilla.net/pub/firefox/releases/${FIREFOX_VERSION}/linux-x86_64/en-US/firefox-${FIREFOX_VERSION}.tar.bz2 -O - | tar xfj - -C /opt \
 && ln -s /opt/firefox/firefox /usr/local/bin/firefox \
```
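The `ARG BUILDPLATFORM=${BUILDPLATFORM:-amd64}` line in the Dockerfile relies on shell-style default expansion: the build arg keeps its value when set and non-empty, and falls back to `amd64` otherwise. The same "value or default" semantics can be sketched in Python (the function name is illustrative, not part of the build):

```python
def resolve_buildplatform(env: dict[str, str]) -> str:
    # Mirror ${BUILDPLATFORM:-amd64}: fall back when unset OR empty.
    value = env.get("BUILDPLATFORM", "")
    return value if value else "amd64"

print(resolve_buildplatform({}))                          # amd64
print(resolve_buildplatform({"BUILDPLATFORM": "arm64"}))  # arm64
```

Note that Docker resolves the `ARG` default itself at build time; the snippet only illustrates the operator.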

Makefile (6 lines changed):
```diff
@@ -15,8 +15,8 @@
 # limitations under the License.
 #

-# Python version installed; we need 3.9-3.11
-PYTHON=`command -v python3.11 || command -v python3.10 || command -v python3.9`
+# Python version installed; we need 3.10-3.11
+PYTHON=`command -v python3.11 || command -v python3.10`

 .PHONY: install superset venv pre-commit

@@ -70,7 +70,7 @@ update-js:

 venv:
 	# Create a virtual environment and activate it (recommended)
-	if ! [ -x "${PYTHON}" ]; then echo "You need Python 3.9, 3.10 or 3.11 installed"; exit 1; fi
+	if ! [ -x "${PYTHON}" ]; then echo "You need Python 3.10 or 3.11 installed"; exit 1; fi
 	test -d venv || ${PYTHON} -m venv venv # setup a python3 virtualenv
 	. venv/bin/activate
```

NOTICE (2 lines changed):
```diff
@@ -1,5 +1,5 @@
 Apache Superset
-Copyright 2016-2021 The Apache Software Foundation
+Copyright 2016-2024 The Apache Software Foundation

 This product includes software developed at The Apache Software
 Foundation (http://www.apache.org/).
```

README.md (120 lines changed):
```diff
@@ -1,3 +1,7 @@
+---
+hide_title: true
+sidebar_position: 1
+---
 <!--
 Licensed to the Apache Software Foundation (ASF) under one
 or more contributor license agreements. See the NOTICE file

@@ -30,12 +34,14 @@ under the License.
 <picture width="500">
 <source
+ width="600"
 media="(prefers-color-scheme: dark)"
- src="https://github.com/apache/superset/raw/master/superset-frontend/src/assets/branding/superset-logo-horiz-apache-dark.png"
+ src="https://superset.apache.org/img/superset-logo-horiz-dark.svg"
 alt="Superset logo (dark)"
 />
 <img
- src="https://github.com/apache/superset/raw/master/superset-frontend/src/assets/branding/superset-logo-horiz-apache.png"
+ width="600"
+ src="https://superset.apache.org/img/superset-logo-horiz-apache.svg"
 alt="Superset logo (light)"
 />
 </picture>

@@ -45,11 +51,11 @@ A modern, enterprise-ready business intelligence web application.
 [**Why Superset?**](#why-superset) |
 [**Supported Databases**](#supported-databases) |
 [**Installation and Configuration**](#installation-and-configuration) |
-[**Release Notes**](RELEASING/README.md#release-notes-for-recent-releases) |
+[**Release Notes**](https://github.com/apache/superset/blob/master/RELEASING/README.md#release-notes-for-recent-releases) |
 [**Get Involved**](#get-involved) |
 [**Contributor Guide**](#contributor-guide) |
 [**Resources**](#resources) |
-[**Organizations Using Superset**](RESOURCES/INTHEWILD.md)
+[**Organizations Using Superset**](https://github.com/apache/superset/blob/master/RESOURCES/INTHEWILD.md)

 ## Why Superset?

@@ -70,76 +76,76 @@ Superset provides:
 ## Screenshots & Gifs

 **Video Overview**

-https://user-images.githubusercontent.com/64562059/234390129-321d4f35-cb4b-45e8-89d9-20ae292f34fc.mp4
+<!-- File hosted here https://github.com/apache/superset-site/raw/lfs/superset-video-4k.mp4 -->
+https://superset.staged.apache.org/superset-video-4k.mp4

 <br/>

 **Large Gallery of Visualizations**

-<kbd><img title="Gallery" src="superset-frontend/src/assets/images/screenshots/gallery.jpg"/></kbd><br/>
+<kbd><img title="Gallery" src="https://superset.apache.org/img/screenshots/gallery.jpg"/></kbd><br/>

 **Craft Beautiful, Dynamic Dashboards**

-<kbd><img title="View Dashboards" src="superset-frontend/src/assets/images/screenshots/slack_dash.jpg"/></kbd><br/>
+<kbd><img title="View Dashboards" src="https://superset.apache.org/img/screenshots/slack_dash.jpg"/></kbd><br/>

 **No-Code Chart Builder**

-<kbd><img title="Slice & dice your data" src="superset-frontend/src/assets/images/screenshots/explore.jpg"/></kbd><br/>
+<kbd><img title="Slice & dice your data" src="https://superset.apache.org/img/screenshots/explore.jpg"/></kbd><br/>

 **Powerful SQL Editor**

-<kbd><img title="SQL Lab" src="superset-frontend/src/assets/images/screenshots/sql_lab.jpg"/></kbd><br/>
+<kbd><img title="SQL Lab" src="https://superset.apache.org/img/screenshots/sql_lab.jpg"/></kbd><br/>

 ## Supported Databases

-Superset can query data from any SQL-speaking datastore or data engine (Presto, Trino, Athena, [and more](https://superset.apache.org/docs/databases/installing-database-drivers/)) that has a Python DB-API driver and a SQLAlchemy dialect.
+Superset can query data from any SQL-speaking datastore or data engine (Presto, Trino, Athena, [and more](https://superset.apache.org/docs/configuration/databases)) that has a Python DB-API driver and a SQLAlchemy dialect.

 Here are some of the major database solutions that are supported:

 <p align="center">
-  <img src="superset-frontend/src/assets/images/redshift.png" alt="redshift" border="0" width="200" height="80"/>
-  <img src="superset-frontend/src/assets/images/google-biquery.png" alt="google-biquery" border="0" width="200" height="80"/>
-  <img src="superset-frontend/src/assets/images/snowflake.png" alt="snowflake" border="0" width="200" height="80"/>
-  <img src="superset-frontend/src/assets/images/trino.png" alt="trino" border="0" width="200" height="80"/>
-  <img src="superset-frontend/src/assets/images/presto.png" alt="presto" border="0" width="200" height="80"/>
-  <img src="superset-frontend/src/assets/images/databricks.png" alt="databricks" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/druid.png" alt="druid" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/firebolt.png" alt="firebolt" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/timescale.png" alt="timescale" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/rockset.png" alt="rockset" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/postgresql.png" alt="postgresql" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/mysql.png" alt="mysql" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/mssql-server.png" alt="mssql-server" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/db2.png" alt="db2" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/sqlite.png" alt="sqlite" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/sybase.png" alt="sybase" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/mariadb.png" alt="mariadb" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/vertica.png" alt="vertica" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/oracle.png" alt="oracle" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/firebird.png" alt="firebird" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/greenplum.png" alt="greenplum" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/clickhouse.png" alt="clickhouse" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/exasol.png" alt="exasol" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/monet-db.png" alt="monet-db" border="0" width="200" height="80" />
-  <img src="superset-frontend/src/assets/images/apache-kylin.png" alt="apache-kylin" border="0" width="200" height="80"/>
-  <img src="superset-frontend/src/assets/images/hologres.png" alt="hologres" border="0" width="200" height="80"/>
-  <img src="superset-frontend/src/assets/images/netezza.png" alt="netezza" border="0" width="200" height="80"/>
-  <img src="superset-frontend/src/assets/images/pinot.png" alt="pinot" border="0" width="200" height="80"/>
-  <img src="superset-frontend/src/assets/images/teradata.png" alt="teradata" border="0" width="200" height="80"/>
-  <img src="superset-frontend/src/assets/images/yugabyte.png" alt="yugabyte" border="0" width="200" height="80"/>
-  <img src="superset-frontend/src/assets/images/databend.png" alt="databend" border="0" width="200" height="80"/>
-  <img src="superset-frontend/src/assets/images/starrocks.png" alt="starrocks" border="0" width="200" height="80"/>
-  <img src="superset-frontend/src/assets/images/doris.png" alt="doris" border="0" width="200" height="80"/>
+  <img src="https://superset.apache.org/img/databases/redshift.png" alt="redshift" border="0" width="200"/>
+  <img src="https://superset.apache.org/img/databases/google-biquery.png" alt="google-biquery" border="0" width="200"/>
+  <img src="https://superset.apache.org/img/databases/snowflake.png" alt="snowflake" border="0" width="200"/>
+  <img src="https://superset.apache.org/img/databases/trino.png" alt="trino" border="0" width="150" />
+  <img src="https://superset.apache.org/img/databases/presto.png" alt="presto" border="0" width="200"/>
+  <img src="https://superset.apache.org/img/databases/databricks.png" alt="databricks" border="0" width="160" />
+  <img src="https://superset.apache.org/img/databases/druid.png" alt="druid" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/firebolt.png" alt="firebolt" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/timescale.png" alt="timescale" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/rockset.png" alt="rockset" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/postgresql.png" alt="postgresql" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/mysql.png" alt="mysql" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/mssql-server.png" alt="mssql-server" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/imb-db2.svg" alt="db2" border="0" width="220" />
+  <img src="https://superset.apache.org/img/databases/sqlite.png" alt="sqlite" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/sybase.png" alt="sybase" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/mariadb.png" alt="mariadb" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/vertica.png" alt="vertica" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/oracle.png" alt="oracle" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/firebird.png" alt="firebird" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/greenplum.png" alt="greenplum" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/clickhouse.png" alt="clickhouse" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/exasol.png" alt="exasol" border="0" width="160" />
+  <img src="https://superset.apache.org/img/databases/monet-db.png" alt="monet-db" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/apache-kylin.png" alt="apache-kylin" border="0" width="80"/>
+  <img src="https://superset.apache.org/img/databases/hologres.png" alt="hologres" border="0" width="80"/>
+  <img src="https://superset.apache.org/img/databases/netezza.png" alt="netezza" border="0" width="80"/>
+  <img src="https://superset.apache.org/img/databases/pinot.png" alt="pinot" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/teradata.png" alt="teradata" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/yugabyte.png" alt="yugabyte" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/databend.png" alt="databend" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/starrocks.png" alt="starrocks" border="0" width="200" />
+  <img src="https://superset.apache.org/img/databases/doris.png" alt="doris" border="0" width="200" />
 </p>

-**A more comprehensive list of supported databases** along with the configuration instructions can be found [here](https://superset.apache.org/docs/databases/installing-database-drivers).
+**A more comprehensive list of supported databases** along with the configuration instructions can be found [here](https://superset.apache.org/docs/configuration/databases).

 Want to add support for your datastore or data engine? Read more [here](https://superset.apache.org/docs/frequently-asked-questions#does-superset-work-with-insert-database-engine-here) about the technical requirements.

 ## Installation and Configuration

-[Extended documentation for Superset](https://superset.apache.org/docs/installation/installing-superset-using-docker-compose)
+[Extended documentation for Superset](https://superset.apache.org/docs/installation/docker-compose)

 ## Get Involved

@@ -159,26 +165,22 @@ how to set up a development environment.
 ## Resources

-- [Superset "In the Wild"](RESOURCES/INTHEWILD.md) - open a PR to add your org to the list!
-- [Feature Flags](RESOURCES/FEATURE_FLAGS.md) - the status of Superset's Feature Flags.
-- [Standard Roles](RESOURCES/STANDARD_ROLES.md) - How RBAC permissions map to roles.
+- [Superset "In the Wild"](https://github.com/apache/superset/blob/master/RESOURCES/INTHEWILD.md) - open a PR to add your org to the list!
+- [Feature Flags](https://github.com/apache/superset/blob/master/RESOURCES/FEATURE_FLAGS.md) - the status of Superset's Feature Flags.
+- [Standard Roles](https://github.com/apache/superset/blob/master/RESOURCES/STANDARD_ROLES.md) - How RBAC permissions map to roles.
 - [Superset Wiki](https://github.com/apache/superset/wiki) - Tons of additional community resources: best practices, community content and other information.
 - [Superset SIPs](https://github.com/orgs/apache/projects/170) - The status of Superset's SIPs (Superset Improvement Proposals) for both consensus and implementation status.

-Superset 2.0!
-- [Superset 2.0 Meetup](https://preset.io/events/superset-2-0-meetup/)
-- [Superset 2.0 Release Notes](https://github.com/apache/superset/tree/master/RELEASING/release-notes-2-0)
-
 Understanding the Superset Points of View
 - [The Case for Dataset-Centric Visualization](https://preset.io/blog/dataset-centric-visualization/)
 - [Understanding the Superset Semantic Layer](https://preset.io/blog/understanding-superset-semantic-layer/)

 - Getting Started with Superset
-  - [Superset in 2 Minutes using Docker Compose](https://superset.apache.org/docs/installation/installing-superset-using-docker-compose#installing-superset-locally-using-docker-compose)
-  - [Installing Database Drivers](https://superset.apache.org/docs/databases/docker-add-drivers/)
+  - [Superset in 2 Minutes using Docker Compose](https://superset.apache.org/docs/installation/docker-compose#installing-superset-locally-using-docker-compose)
+  - [Installing Database Drivers](https://superset.apache.org/docs/configuration/databases#installing-database-drivers)
   - [Building New Database Connectors](https://preset.io/blog/building-database-connector/)
-  - [Create Your First Dashboard](https://superset.apache.org/docs/creating-charts-dashboards/first-dashboard)
+  - [Create Your First Dashboard](https://superset.apache.org/docs/using-superset/creating-your-first-dashboard/)
   - [Comprehensive Tutorial for Contributing Code to Apache Superset](https://preset.io/blog/tutorial-contributing-code-to-apache-superset/)
   - [Resources to master Superset by Preset](https://preset.io/resources/)

@@ -202,10 +204,10 @@ Understanding the Superset Points of View
 - [Superset API](https://superset.apache.org/docs/rest-api)

 ## Repo Activity
-<a href="https://next.ossinsight.io/widgets/official/compose-last-28-days-stats?repo_id=39464018" target="_blank" style="display: block" align="center">
+<a href="https://next.ossinsight.io/widgets/official/compose-last-28-days-stats?repo_id=39464018" target="_blank" align="center">
   <picture>
-    <source media="(prefers-color-scheme: dark)" srcset="https://next.ossinsight.io/widgets/official/compose-last-28-days-stats/thumbnail.png?repo_id=39464018&image_size=auto&color_scheme=dark" width="655" height="auto">
-    <img alt="Performance Stats of apache/superset - Last 28 days" src="https://next.ossinsight.io/widgets/official/compose-last-28-days-stats/thumbnail.png?repo_id=39464018&image_size=auto&color_scheme=light" width="655" height="auto">
+    <source media="(prefers-color-scheme: dark)" srcset="https://next.ossinsight.io/widgets/official/compose-last-28-days-stats/thumbnail.png?repo_id=39464018&image_size=auto&color_scheme=dark" width="655" height="auto" />
+    <img alt="Performance Stats of apache/superset - Last 28 days" src="https://next.ossinsight.io/widgets/official/compose-last-28-days-stats/thumbnail.png?repo_id=39464018&image_size=auto&color_scheme=light" width="655" height="auto" />
   </picture>
 </a>
```
```diff
@@ -14,7 +14,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 #
-FROM python:3.9-buster
+FROM python:3.10-slim-bookworm

 RUN useradd --user-group --create-home --no-log-init --shell /bin/bash superset

@@ -14,7 +14,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 #
-FROM python:3.9-buster
+FROM python:3.10-slim-bookworm

 RUN useradd --user-group --create-home --no-log-init --shell /bin/bash superset

@@ -14,7 +14,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 #
-FROM python:3.9-buster
+FROM python:3.10-slim-bookworm
 ARG VERSION

 RUN git clone --depth 1 --branch ${VERSION} https://github.com/apache/superset.git /superset

@@ -14,7 +14,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 #
-FROM python:3.9-buster
+FROM python:3.10-slim-bookworm

 RUN apt-get update -y
 RUN apt-get install -y jq
```
```diff
@@ -506,3 +506,18 @@ and re-push the proper images and tags through this interface. The action
 takes the version (ie `3.1.1`), the git reference (any SHA, tag or branch
 reference), and whether to force the `latest` Docker tag on the
 generated images.
+
+### Npm Release
+You might want to publish the latest @superset-ui release to npm
+```bash
+cd superset/superset-frontend
+```
+An automated GitHub action will run and generate a new tag, which will contain a version number provided as a parameter.
+```bash
+export GH_TOKEN={GITHUB_TOKEN}
+npx lerna version {VERSION} --conventional-commits --create-release github --no-private --yes --message {COMMIT_MESSAGE}
+```
+This action will publish the specified version to npm registry.
+```bash
+npx lerna publish from-package --yes
+```
```
```diff
@@ -94,10 +94,10 @@ class GitChangeLog:
             if not pull_request:
                 pull_request = github_repo.get_pull(pr_number)
                 self._github_prs[pr_number] = pull_request
-        except BadCredentialsException as ex:
+        except BadCredentialsException:
             print(
-                f"Bad credentials to github provided"
-                f" use access_token parameter or set GITHUB_TOKEN"
+                "Bad credentials to github provided"
+                " use access_token parameter or set GITHUB_TOKEN"
             )
             sys.exit(1)

@@ -167,8 +167,8 @@ class GitChangeLog:
     def _get_changelog_version_head(self) -> str:
         if not len(self._logs):
             print(
-                f"No changes found between revisions. "
-                f"Make sure your branch is up to date."
+                "No changes found between revisions. "
+                "Make sure your branch is up to date."
             )
             sys.exit(1)
         return f"### {self._version} ({self._logs[0].time})"
```
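The string changes in the hunk above drop the `f` prefix from literals that contain no `{}` placeholders (the pattern linters such as flake8 report as F541): a placeholder-free f-string is identical to a plain literal at runtime, so the prefix is pure noise. A quick check of the equivalence:

```python
# Adjacent string literals concatenate; without placeholders the
# f-prefix changes nothing about the resulting value.
msg_plain = (
    "Bad credentials to github provided"
    " use access_token parameter or set GITHUB_TOKEN"
)
msg_f = (
    f"Bad credentials to github provided"  # noqa: F541 (kept to show equivalence)
    f" use access_token parameter or set GITHUB_TOKEN"  # noqa: F541
)
assert msg_plain == msg_f
```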
```diff
@@ -18,7 +18,7 @@
 -#}
 To: {{ receiver_email }}

-Subject: [ANNOUNCE] Apache {{ project_name }} version {{ version }} Released
+Subject: [ANNOUNCE] Apache {{ project_name }} version {{ version }} released

 Hello Community,

@@ -33,7 +33,7 @@ https://downloads.apache.org/{{ project_module }}/{{ version }}
 The PyPI package:
 https://pypi.org/project/apache-superset/{{ version }}

-The Change Log for the release:
+The CHANGELOG for the release:
 https://github.com/apache/{{ project_module }}/blob/{{ version }}/CHANGELOG/{{ version }}.md

 The instructions for updating to the release:

@@ -52,7 +52,8 @@ Negative votes:
 {% endfor -%}
 {%- endif %}

-Link to vote thread: {{ vote_thread }}
+Link to vote thread:
+{{ vote_thread }}

 We will work to complete the release process.
```
```diff
@@ -30,7 +30,7 @@ https://dist.apache.org/repos/dist/dev/{{ project_module }}/{{ version_rc }}/
 The Git tag for the release:
 https://github.com/apache/{{ project_module }}/tree/{{ version_rc }}

-The change log for the release:
+The CHANGELOG for the release:
 https://github.com/apache/{{ project_module }}/blob/{{ version_rc }}/CHANGELOG/{{ version }}.md

 The instructions for updating to the release:

@@ -39,8 +39,8 @@ https://github.com/apache/{{ project_module }}/blob/{{ version_rc }}/UPDATING.md
 Public keys are available at:
 https://www.apache.org/dist/{{ project_module }}/KEYS

-The vote will be open for at least 72 hours or until the necessary number
-of votes are reached.
+The vote will be left open until at least 72 hours have passed
+and the necessary number of votes (3) have been reached.

 Please vote accordingly:
```
```diff
@@ -31,7 +31,7 @@ except ModuleNotFoundError:
 RECEIVER_EMAIL = "dev@superset.apache.org"
 PROJECT_NAME = "Superset"
 PROJECT_MODULE = "superset"
-PROJECT_DESCRIPTION = "Apache Superset is a modern, enterprise-ready business intelligence web application"
+PROJECT_DESCRIPTION = "Apache Superset is a modern, enterprise-ready business intelligence web application."


 def string_comma_to_list(message: str) -> list[str]:
```
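The hunk above ends at the `string_comma_to_list` signature; its body is not shown in this capture. A plausible implementation matching that signature (the body here is an assumption for illustration, not the actual Superset code):

```python
def string_comma_to_list(message: str) -> list[str]:
    """Split a comma-separated string into a list of trimmed items.

    NOTE: hypothetical body sketched from the signature above.
    """
    if not message:
        return []
    return [item.strip() for item in message.split(",")]

# e.g. turning a comma-separated recipient string into addresses
print(string_comma_to_list("dev@superset.apache.org, user@example.org"))
```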
```diff
@@ -27,6 +27,7 @@ These features are considered **unfinished** and should only be used on developm
 [//]: # "PLEASE KEEP THE LIST SORTED ALPHABETICALLY"

+- ALERT_REPORT_TABS
 - ENABLE_ADVANCED_DATA_TYPES
 - PRESTO_EXPAND_DATA
 - SHARE_QUERIES_VIA_KV_STORE

@@ -39,44 +40,47 @@ These features are **finished** but currently being tested. They are usable, but
 [//]: # "PLEASE KEEP THE LIST SORTED ALPHABETICALLY"

-- ALERT_REPORTS: [(docs)](https://superset.apache.org/docs/installation/alerts-reports)
+- ALERT_REPORTS: [(docs)](https://superset.apache.org/docs/configuration/alerts-reports)
 - ALLOW_FULL_CSV_EXPORT
 - CACHE_IMPERSONATION
 - CONFIRM_DASHBOARD_DIFF
 - DRILL_TO_DETAIL
-- DYNAMIC_PLUGINS: [(docs)](https://superset.apache.org/docs/installation/running-on-kubernetes)
-- ENABLE_SUPERSET_META_DB: [(docs)](https://superset.apache.org/docs/databases/meta-database/)
+- DYNAMIC_PLUGINS: [(docs)](https://superset.apache.org/docs/configuration/running-on-kubernetes)
+- ENABLE_SUPERSET_META_DB: [(docs)]()
 - ESTIMATE_QUERY_COST
 - GLOBAL_ASYNC_QUERIES [(docs)](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#async-chart-queries)
 - HORIZONTAL_FILTER_BAR
+- PLAYWRIGHT_REPORTS_AND_THUMBNAILS
 - RLS_IN_SQLLAB
-- SSH_TUNNELING [(docs)](https://superset.apache.org/docs/installation/setup-ssh-tunneling)
+- SSH_TUNNELING [(docs)](https://superset.apache.org/docs/configuration/setup-ssh-tunneling)
 - USE_ANALAGOUS_COLORS

 ## Stable

-These features flags are **safe for production**. They have been tested and will be supported for the foreseeable future.
+These feature flags are **safe for production**. They have been tested and will be supported for at least the current major version cycle.

-[//]: # "PLEASE KEEP THE LIST SORTED ALPHABETICALLY"
+[//]: # "PLEASE KEEP THESE LISTS SORTED ALPHABETICALLY"

+### Flags on the path to feature launch and flag deprecation/removal
+- DASHBOARD_VIRTUALIZATION
+- DRILL_BY
+- DISABLE_LEGACY_DATASOURCE_EDITOR
+
+### Flags retained for runtime configuration
 - ALERTS_ATTACH_REPORTS
 - ALLOW_ADHOC_SUBQUERY
-- DASHBOARD_RBAC [(docs)](https://superset.apache.org/docs/creating-charts-dashboards/first-dashboard#manage-access-to-dashboards)
-- DASHBOARD_VIRTUALIZATION
+- DASHBOARD_RBAC [(docs)](https://superset.apache.org/docs/using-superset/first-dashboard#manage-access-to-dashboards)
 - DATAPANEL_CLOSED_BY_DEFAULT
-- DISABLE_LEGACY_DATASOURCE_EDITOR
-- DRILL_BY
 - DRUID_JOINS
 - EMBEDDABLE_CHARTS
 - EMBEDDED_SUPERSET
 - ENABLE_TEMPLATE_PROCESSING
 - ESCAPE_MARKDOWN_HTML
 - LISTVIEWS_DEFAULT_CARD_VIEW
-- SCHEDULED_QUERIES [(docs)](https://superset.apache.org/docs/installation/alerts-reports)
+- SCHEDULED_QUERIES [(docs)](https://superset.apache.org/docs/configuration/alerts-reports)
 - SQLLAB_BACKEND_PERSISTENCE
-- SQL_VALIDATORS_BY_ENGINE [(docs)](https://superset.apache.org/docs/installation/sql-templating)
-- THUMBNAILS [(docs)](https://superset.apache.org/docs/installation/cache)
+- SQL_VALIDATORS_BY_ENGINE [(docs)](https://superset.apache.org/docs/configuration/sql-templating)
+- THUMBNAILS [(docs)](https://superset.apache.org/docs/configuration/cache)

 ## Deprecated Flags
```
@@ -17,213 +17,211 @@ specific language governing permissions and limitations
under the License.
-->

||Admin|Alpha|Gamma|SQL_LAB|
|---|---|---|---|---|
|Permission/role description|Admins have all possible rights, including granting or revoking rights from other users and altering other people’s slices and dashboards.|Alpha users have access to all data sources, but they cannot grant or revoke access from other users. They are also limited to altering the objects that they own. Alpha users can add and alter data sources.|Gamma users have limited access. They can only consume data coming from data sources they have been given access to through another complementary role. They only have access to view the slices and dashboards made from data sources that they have access to. Currently Gamma users are not able to alter or add data sources. We assume that they are mostly content consumers, though they can create slices and dashboards.|The sql_lab role grants access to SQL Lab. Note that while Admin users have access to all databases by default, both Alpha and Gamma users need to be given access on a per database basis.|
|can read on SavedQuery|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|can write on SavedQuery|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|can read on CssTemplate|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can write on CssTemplate|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can read on ReportSchedule|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can write on ReportSchedule|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can read on Chart|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can write on Chart|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can read on Annotation|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can write on Annotation|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can read on Dataset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can write on Dataset|:heavy_check_mark:|:heavy_check_mark:|O|O|
|can read on Log|:heavy_check_mark:|O|O|O|
|can write on Log|:heavy_check_mark:|O|O|O|
|can read on Dashboard|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can write on Dashboard|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can read on Database|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|can write on Database|:heavy_check_mark:|O|O|O|
|can read on Query|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|can this form get on ResetPasswordView|:heavy_check_mark:|O|O|O|
|can this form post on ResetPasswordView|:heavy_check_mark:|O|O|O|
|can this form get on ResetMyPasswordView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can this form post on ResetMyPasswordView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can this form get on UserInfoEditView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can this form post on UserInfoEditView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can show on UserDBModelView|:heavy_check_mark:|O|O|O|
|can edit on UserDBModelView|:heavy_check_mark:|O|O|O|
|can delete on UserDBModelView|:heavy_check_mark:|O|O|O|
|can add on UserDBModelView|:heavy_check_mark:|O|O|O|
|can list on UserDBModelView|:heavy_check_mark:|O|O|O|
|can userinfo on UserDBModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|resetmypassword on UserDBModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|resetpasswords on UserDBModelView|:heavy_check_mark:|O|O|O|
|userinfoedit on UserDBModelView|:heavy_check_mark:|O|O|O|
|can show on RoleModelView|:heavy_check_mark:|O|O|O|
|can edit on RoleModelView|:heavy_check_mark:|O|O|O|
|can delete on RoleModelView|:heavy_check_mark:|O|O|O|
|can add on RoleModelView|:heavy_check_mark:|O|O|O|
|
||||
|can list on RoleModelView|:heavy_check_mark:|O|O|O|
|
||||
|copyrole on RoleModelView|:heavy_check_mark:|O|O|O|
|
||||
|can get on OpenApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can show on SwaggerView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can get on MenuApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can list on AsyncEventsRestApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can invalidate on CacheRestApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can function names on Database|:heavy_check_mark:|O|O|O|
|
||||
|can query form data on Api|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can query on Api|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can time range on Api|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can this form get on CsvToDatabaseView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can this form post on CsvToDatabaseView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can this form get on ExcelToDatabaseView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can this form post on ExcelToDatabaseView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can external metadata on Datasource|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can save on Datasource|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|can get on Datasource|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can my queries on SqlLab|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|can log on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can schemas access for csv upload on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can import dashboards on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can schemas on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can sqllab history on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|can publish on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can csv on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|can slice on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can sync druid source on Superset|:heavy_check_mark:|O|O|O|
|
||||
|can explore on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can approve on Superset|:heavy_check_mark:|O|O|O|
|
||||
|can explore json on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can fetch datasource metadata on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can csrf token on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can sqllab on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|can select star on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can warm up cache on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can sqllab table viz on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|can available domains on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can request access on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can dashboard on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can post on TableSchemaView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|can expanded on TableSchemaView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|can delete on TableSchemaView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|can get on TabStateView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|can post on TabStateView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|can delete query on TabStateView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|can migrate query on TabStateView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|can activate on TabStateView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|can delete on TabStateView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|can put on TabStateView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|can read on SecurityRestApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|menu access on Security|:heavy_check_mark:|O|O|O|
|
||||
|menu access on List Users|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on List Roles|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Action Log|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Manage|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|menu access on Annotation Layers|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on CSS Templates|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|menu access on Import Dashboards|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Data|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Databases|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Datasets|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Upload a CSV|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|menu access on Upload Excel|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Charts|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Dashboards|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on SQL Lab|:heavy_check_mark:|O|O|:heavy_check_mark:|
|
||||
|menu access on SQL Editor|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|menu access on Saved Queries|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|menu access on Query Search|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|all datasource access on all_datasource_access|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|all database access on all_database_access|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|all query access on all_query_access|:heavy_check_mark:|O|O|O|
|
||||
|can edit on UserOAuthModelView|:heavy_check_mark:|O|O|O|
|
||||
|can list on UserOAuthModelView|:heavy_check_mark:|O|O|O|
|
||||
|can show on UserOAuthModelView|:heavy_check_mark:|O|O|O|
|
||||
|can userinfo on UserOAuthModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can add on UserOAuthModelView|:heavy_check_mark:|O|O|O|
|
||||
|can delete on UserOAuthModelView|:heavy_check_mark:|O|O|O|
|
||||
|userinfoedit on UserOAuthModelView|:heavy_check_mark:|O|O|O|
|
||||
|can write on DynamicPlugin|:heavy_check_mark:|O|O|O|
|
||||
|can edit on DynamicPlugin|:heavy_check_mark:|O|O|O|
|
||||
|can list on DynamicPlugin|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can show on DynamicPlugin|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can download on DynamicPlugin|:heavy_check_mark:|O|O|O|
|
||||
|can add on DynamicPlugin|:heavy_check_mark:|O|O|O|
|
||||
|can delete on DynamicPlugin|:heavy_check_mark:|O|O|O|
|
||||
|can edit on RowLevelSecurityFiltersModelView|:heavy_check_mark:|O|O|O|
|
||||
|can list on RowLevelSecurityFiltersModelView|:heavy_check_mark:|O|O|O|
|
||||
|can show on RowLevelSecurityFiltersModelView|:heavy_check_mark:|O|O|O|
|
||||
|can download on RowLevelSecurityFiltersModelView|:heavy_check_mark:|O|O|O|
|
||||
|can add on RowLevelSecurityFiltersModelView|:heavy_check_mark:|O|O|O|
|
||||
|can delete on RowLevelSecurityFiltersModelView|:heavy_check_mark:|O|O|O|
|
||||
|muldelete on RowLevelSecurityFiltersModelView|:heavy_check_mark:|O|O|O|
|
||||
|can external metadata by name on Datasource|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can get value on KV|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can store on KV|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can tagged objects on TagView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can suggestions on TagView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can get on TagView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can post on TagView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can delete on TagView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can edit on DashboardEmailScheduleView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can list on DashboardEmailScheduleView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can show on DashboardEmailScheduleView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can add on DashboardEmailScheduleView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can delete on DashboardEmailScheduleView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|muldelete on DashboardEmailScheduleView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|can edit on SliceEmailScheduleView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can list on SliceEmailScheduleView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can show on SliceEmailScheduleView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can add on SliceEmailScheduleView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can delete on SliceEmailScheduleView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|muldelete on SliceEmailScheduleView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|can edit on AlertModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can list on AlertModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can show on AlertModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can add on AlertModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can delete on AlertModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can list on AlertLogModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can show on AlertLogModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can list on AlertObservationModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can show on AlertObservationModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Row Level Security|:heavy_check_mark:|O|O|O|
|
||||
|menu access on Access requests|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Home|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Plugins|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Dashboard Email Schedules|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Chart Emails|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Alerts|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Alerts & Report|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Scan New Datasources|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can share dashboard on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can share chart on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can this form get on ColumnarToDatabaseView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can this form post on ColumnarToDatabaseView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|menu access on Upload a Columnar file|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can export on Chart|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can write on DashboardFilterStateRestApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can read on DashboardFilterStateRestApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can write on DashboardPermalinkRestApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can read on DashboardPermalinkRestApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can delete embedded on Dashboard|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can set embedded on Dashboard|:heavy_check_mark:|O|O|O|
|
||||
|can export on Dashboard|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can get embedded on Dashboard|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can export on Database|:heavy_check_mark:|O|O|O|
|
||||
|can export on Dataset|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|can write on ExploreFormDataRestApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can read on ExploreFormDataRestApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can write on ExplorePermalinkRestApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can read on ExplorePermalinkRestApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can export on ImportExportRestApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can import on ImportExportRestApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can export on SavedQuery|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
|can dashboard permalink on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can grant guest token on SecurityRestApi|:heavy_check_mark:|O|O|O|
|
||||
|can read on AdvancedDataType|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can read on EmbeddedDashboard|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can duplicate on Dataset|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|can read on Explore|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can samples on Datasource|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|can read on AvailableDomains|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
|can get or create dataset on Dataset|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|can get column values on Datasource|:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
|can export csv on SQLLab|:heavy_check_mark:|O|O|:heavy_check_mark:|
|
||||
|can get results on SQLLab|:heavy_check_mark:|O|O|:heavy_check_mark:|
|
||||
|can execute sql query on SQLLab|:heavy_check_mark:|O|O|:heavy_check_mark:|
|
||||
|can recent activity on Log|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| |Admin|Alpha|Gamma|SQL_LAB|
|
||||
|--------------------------------------------------|---|---|---|---|
|
||||
| Permission/role description |Admins have all possible rights, including granting or revoking rights from other users and altering other people’s slices and dashboards.|Alpha users have access to all data sources, but they cannot grant or revoke access from other users. They are also limited to altering the objects that they own. Alpha users can add and alter data sources.|Gamma users have limited access. They can only consume data coming from data sources they have been given access to through another complementary role. They only have access to view the slices and dashboards made from data sources that they have access to. Currently Gamma users are not able to alter or add data sources. We assume that they are mostly content consumers, though they can create slices and dashboards.|The sql_lab role grants access to SQL Lab. Note that while Admin users have access to all databases by default, both Alpha and Gamma users need to be given access on a per database basis.||
|
||||
| can read on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can write on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can read on CssTemplate |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can write on CssTemplate |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can read on ReportSchedule |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can write on ReportSchedule |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can read on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can write on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can read on Annotation |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can write on Annotation |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can read on Dataset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can write on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| can read on Log |:heavy_check_mark:|O|O|O|
|
||||
| can write on Log |:heavy_check_mark:|O|O|O|
|
||||
| can read on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can write on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can read on Database |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can write on Database |:heavy_check_mark:|O|O|O|
|
||||
| can read on Query |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can this form get on ResetPasswordView |:heavy_check_mark:|O|O|O|
|
||||
| can this form post on ResetPasswordView |:heavy_check_mark:|O|O|O|
|
||||
| can this form get on ResetMyPasswordView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can this form post on ResetMyPasswordView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can this form get on UserInfoEditView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can this form post on UserInfoEditView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can show on UserDBModelView |:heavy_check_mark:|O|O|O|
|
||||
| can edit on UserDBModelView |:heavy_check_mark:|O|O|O|
|
||||
| can delete on UserDBModelView |:heavy_check_mark:|O|O|O|
|
||||
| can add on UserDBModelView |:heavy_check_mark:|O|O|O|
|
||||
| can list on UserDBModelView |:heavy_check_mark:|O|O|O|
|
||||
| can userinfo on UserDBModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| resetmypassword on UserDBModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| resetpasswords on UserDBModelView |:heavy_check_mark:|O|O|O|
|
||||
| userinfoedit on UserDBModelView |:heavy_check_mark:|O|O|O|
|
||||
| can show on RoleModelView |:heavy_check_mark:|O|O|O|
|
||||
| can edit on RoleModelView |:heavy_check_mark:|O|O|O|
|
||||
| can delete on RoleModelView |:heavy_check_mark:|O|O|O|
|
||||
| can add on RoleModelView |:heavy_check_mark:|O|O|O|
|
||||
| can list on RoleModelView |:heavy_check_mark:|O|O|O|
|
||||
| copyrole on RoleModelView |:heavy_check_mark:|O|O|O|
|
||||
| can get on OpenApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can show on SwaggerView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can get on MenuApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can list on AsyncEventsRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can invalidate on CacheRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can function names on Database |:heavy_check_mark:|O|O|O|
|
||||
| can csv upload on Database |:heavy_check_mark:|O|O|O|
|
||||
| can excel upload on Database |:heavy_check_mark:|O|O|O|
|
||||
| can query form data on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can query on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can time range on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can external metadata on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can save on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| can get on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can my queries on SqlLab |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can log on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can schemas access for csv upload on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can import dashboards on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can schemas on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can sqllab history on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can publish on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can csv on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can slice on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can sync druid source on Superset |:heavy_check_mark:|O|O|O|
|
||||
| can explore on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can approve on Superset |:heavy_check_mark:|O|O|O|
|
||||
| can explore json on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can fetch datasource metadata on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can csrf token on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can sqllab on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can select star on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can warm up cache on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can sqllab table viz on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can available domains on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can request access on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can dashboard on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can post on TableSchemaView |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| can expanded on TableSchemaView |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| can delete on TableSchemaView |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| can get on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can post on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can delete query on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can migrate query on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can activate on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can delete on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can put on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can read on SecurityRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| menu access on Security |:heavy_check_mark:|O|O|O|
|
||||
| menu access on List Users |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on List Roles |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Action Log |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Manage |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| menu access on Annotation Layers |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on CSS Templates |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| menu access on Import Dashboards |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Data |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Databases |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Datasets |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Upload a CSV |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| menu access on Upload Excel |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Charts |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Dashboards |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on SQL Lab |:heavy_check_mark:|O|O|:heavy_check_mark:|
|
||||
| menu access on SQL Editor |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| menu access on Saved Queries |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| menu access on Query Search |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| all datasource access on all_datasource_access |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| all database access on all_database_access |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| all query access on all_query_access |:heavy_check_mark:|O|O|O|
|
||||
| can edit on UserOAuthModelView |:heavy_check_mark:|O|O|O|
|
||||
| can list on UserOAuthModelView |:heavy_check_mark:|O|O|O|
|
||||
| can show on UserOAuthModelView |:heavy_check_mark:|O|O|O|
|
||||
| can userinfo on UserOAuthModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can add on UserOAuthModelView |:heavy_check_mark:|O|O|O|
|
||||
| can delete on UserOAuthModelView |:heavy_check_mark:|O|O|O|
|
||||
| userinfoedit on UserOAuthModelView |:heavy_check_mark:|O|O|O|
|
||||
| can write on DynamicPlugin |:heavy_check_mark:|O|O|O|
|
||||
| can edit on DynamicPlugin |:heavy_check_mark:|O|O|O|
|
||||
| can list on DynamicPlugin |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can show on DynamicPlugin |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can download on DynamicPlugin |:heavy_check_mark:|O|O|O|
|
||||
| can add on DynamicPlugin |:heavy_check_mark:|O|O|O|
|
||||
| can delete on DynamicPlugin |:heavy_check_mark:|O|O|O|
|
||||
| can edit on RowLevelSecurityFiltersModelView |:heavy_check_mark:|O|O|O|
|
||||
| can list on RowLevelSecurityFiltersModelView |:heavy_check_mark:|O|O|O|
|
||||
| can show on RowLevelSecurityFiltersModelView |:heavy_check_mark:|O|O|O|
|
||||
| can download on RowLevelSecurityFiltersModelView |:heavy_check_mark:|O|O|O|
| can add on RowLevelSecurityFiltersModelView |:heavy_check_mark:|O|O|O|
| can delete on RowLevelSecurityFiltersModelView |:heavy_check_mark:|O|O|O|
| muldelete on RowLevelSecurityFiltersModelView |:heavy_check_mark:|O|O|O|
| can external metadata by name on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can get value on KV |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can store on KV |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can tagged objects on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can suggestions on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can get on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can post on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can delete on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can edit on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can list on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can add on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can delete on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| muldelete on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can edit on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can list on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can add on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can delete on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| muldelete on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can edit on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can list on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can add on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can delete on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can list on AlertLogModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on AlertLogModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can list on AlertObservationModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on AlertObservationModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Row Level Security |:heavy_check_mark:|O|O|O|
| menu access on Access requests |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Home |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Plugins |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Dashboard Email Schedules |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Chart Emails |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Alerts |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Alerts & Report |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Scan New Datasources |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can share dashboard on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can share chart on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can this form get on ColumnarToDatabaseView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can this form post on ColumnarToDatabaseView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Upload a Columnar file |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can export on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on DashboardFilterStateRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on DashboardFilterStateRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on DashboardPermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on DashboardPermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can delete embedded on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can set embedded on Dashboard |:heavy_check_mark:|O|O|O|
| can export on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can get embedded on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can export on Database |:heavy_check_mark:|O|O|O|
| can export on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can write on ExploreFormDataRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on ExploreFormDataRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on ExplorePermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on ExplorePermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can export on ImportExportRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can import on ImportExportRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can export on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can dashboard permalink on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can grant guest token on SecurityRestApi |:heavy_check_mark:|O|O|O|
| can read on AdvancedDataType |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on EmbeddedDashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can duplicate on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on Explore |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can samples on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on AvailableDomains |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can get or create dataset on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can get column values on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can export csv on SQLLab |:heavy_check_mark:|O|O|:heavy_check_mark:|
| can get results on SQLLab |:heavy_check_mark:|O|O|:heavy_check_mark:|
| can execute sql query on SQLLab |:heavy_check_mark:|O|O|:heavy_check_mark:|
| can recent activity on Log |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
UPDATING.md

@ -23,18 +23,36 @@ This file documents any backwards-incompatible changes in Superset and
assists people when migrating to a new version.

## Next

- [27505](https://github.com/apache/superset/pull/27505): We simplified the files under the
  `requirements/` folder. If you use these files for your builds you may want to double-check
  that your builds are not affected. `base.txt` should be the same as before, though
  `development.txt` becomes a bigger set, incorporating the now-defunct `local`, `testing`,
  `integration`, and `docker` files.
- [27119](https://github.com/apache/superset/pull/27119): Updates various database columns to use the `MediumText` type, potentially requiring a table lock on MySQL dbs or taking some time to complete on large deployments.

- [26450](https://github.com/apache/superset/pull/26450): Deprecates the `KV_STORE` feature flag and its related assets, such as the API endpoint and the `keyvalue` table. The main feature depending on it is the `SHARE_QUERIES_VIA_KV_STORE` feature flag, which allows sharing SQL Lab queries without having to save the query. Our intention is to use the permalink feature to implement this use case before 5.0, which is why we are deprecating the feature flag now.
- [27434](https://github.com/apache/superset/pull/27434/files): DO NOT USE our docker-compose.\*
  files for production use cases! While we never really supported, nor should have tried to
  support, docker-compose for production use cases, we have now actively taken a stance
  against supporting it. See the PR for details.

- [24112](https://github.com/apache/superset/pull/24112): Python 3.10 is now the recommended Python version to use; 3.9 is still
  supported but will be deprecated in the near future. CI/CD runs on py310, so you probably want to align. If you
  use the official Docker images, the upgrade should happen automatically.

- [27697](https://github.com/apache/superset/pull/27697) [minor] A flask-session bump leads to the
  deprecation of `SESSION_USE_SIGNER`; check your configs, as this flag won't do anything moving
  forward.

- [27849](https://github.com/apache/superset/pull/27849/) More of an FYI, but we have a
  new config `SLACK_ENABLE_AVATARS` (False by default) that works in conjunction with a
  set `SLACK_API_TOKEN` to fetch and serve Slack avatar links.

- [28134](https://github.com/apache/superset/pull/28134/) The default logging level was changed
  from DEBUG to INFO, which is the normal, sane default logging level for most software.
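If you relied on the old verbosity, a minimal sketch of the override in `superset_config.py` could look like this (assuming the `LOG_LEVEL` key from `superset/config.py` defaults; verify the key name against your Superset version):

```python
# Hedged sketch: restore DEBUG verbosity in superset_config.py.
# Assumes Superset's LOG_LEVEL config key accepts logging-module constants.
import logging

LOG_LEVEL = logging.DEBUG
```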

- [28205](https://github.com/apache/superset/pull/28205) The permission `all_database_access` now
  more clearly provides access to all databases, as specified in its name. Previously it only allowed
  listing all databases in the CRUD view and dropdown, and didn't provide access to data as the
  name would seem to imply.

## 4.0.0

- [27119](https://github.com/apache/superset/pull/27119): Updates various database columns to use the `MediumText` type, potentially requiring a table lock on MySQL dbs or taking some time to complete on large deployments.

- [26450](https://github.com/apache/superset/pull/26450): Deprecates the `KV_STORE` feature flag and its related assets, such as the API endpoint and the `keyvalue` table. The main feature depending on it is the `SHARE_QUERIES_VIA_KV_STORE` feature flag, which allows sharing SQL Lab queries without having to save the query. Our intention is to use the permalink feature to implement this use case before 5.0, which is why we are deprecating the feature flag now.

### Breaking Changes
@ -14,6 +14,13 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#

# -----------------------------------------------------------------------
# We don't support docker-compose for production environments.
# If you choose to use this type of deployment make sure to
# create your own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.
# -----------------------------------------------------------------------
x-superset-image: &superset-image apachesuperset.docker.scarf.sh/apache/superset:${TAG:-latest}
x-superset-depends-on: &superset-depends-on
  - db
@ -33,7 +40,11 @@ services:
      - redis:/data

  db:
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    image: postgres:15
    container_name: superset_db
    restart: unless-stopped
@ -42,7 +53,11 @@ services:
      - ./docker/docker-entrypoint-initdb.d:/docker-entrypoint-initdb.d

  superset:
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    image: *superset-image
    container_name: superset_app
    command: ["/app/docker/docker-bootstrap.sh", "app-gunicorn"]
@ -57,7 +72,11 @@ services:
    image: *superset-image
    container_name: superset_init
    command: ["/app/docker/docker-init.sh"]
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    depends_on: *superset-depends-on
    user: "root"
    volumes: *superset-volumes
@ -68,7 +87,11 @@ services:
    image: *superset-image
    container_name: superset_worker
    command: ["/app/docker/docker-bootstrap.sh", "worker"]
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    restart: unless-stopped
    depends_on: *superset-depends-on
    user: "root"
@ -84,7 +107,11 @@ services:
    image: *superset-image
    container_name: superset_worker_beat
    command: ["/app/docker/docker-bootstrap.sh", "beat"]
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    restart: unless-stopped
    depends_on: *superset-depends-on
    user: "root"
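Each service above replaces the single `env_file: docker/.env` entry with the list form. In isolation the pattern looks like this (a sketch with a placeholder service; the per-entry `path`/`required` attributes need a reasonably recent Docker Compose, v2.24+):

```yaml
services:
  app:
    image: alpine  # placeholder image for illustration only
    env_file:
      - path: docker/.env        # always loaded; compose errors if missing
        required: true
      - path: docker/.env-local  # local overrides, silently skipped if absent
        required: false
```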

@ -14,6 +14,13 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#

# -----------------------------------------------------------------------
# We don't support docker-compose for production environments.
# If you choose to use this type of deployment make sure to
# create your own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.
# -----------------------------------------------------------------------
x-superset-depends-on: &superset-depends-on
  - db
  - redis
@ -26,7 +33,7 @@ x-common-build: &common-build
    context: .
    target: dev
    cache_from:
      - apache/superset-cache:3.10-slim-bookworm

version: "4.0"
services:
@ -38,7 +45,11 @@ services:
      - redis:/data

  db:
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    image: postgres:15
    container_name: superset_db
    restart: unless-stopped
@ -47,7 +58,11 @@ services:
      - ./docker/docker-entrypoint-initdb.d:/docker-entrypoint-initdb.d

  superset:
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    build:
      <<: *common-build
    container_name: superset_app
@ -64,7 +79,11 @@ services:
    build:
      <<: *common-build
    command: ["/app/docker/docker-init.sh"]
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    depends_on: *superset-depends-on
    user: "root"
    volumes: *superset-volumes
@ -76,7 +95,11 @@ services:
      <<: *common-build
    container_name: superset_worker
    command: ["/app/docker/docker-bootstrap.sh", "worker"]
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    restart: unless-stopped
    depends_on: *superset-depends-on
    user: "root"
@ -93,7 +116,11 @@ services:
      <<: *common-build
    container_name: superset_worker_beat
    command: ["/app/docker/docker-bootstrap.sh", "beat"]
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    restart: unless-stopped
    depends_on: *superset-depends-on
    user: "root"

@ -14,6 +14,13 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#

# -----------------------------------------------------------------------
# We don't support docker-compose for production environments.
# If you choose to use this type of deployment make sure to
# create your own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.
# -----------------------------------------------------------------------
x-superset-user: &superset-user root
x-superset-depends-on: &superset-depends-on
  - db
@ -30,9 +37,8 @@ x-common-build: &common-build
    context: .
    target: dev
    cache_from:
      - apache/superset-cache:3.10-slim-bookworm

version: "4.0"
services:
  nginx:
    image: nginx:latest
@ -54,7 +60,11 @@ services:
      - redis:/data

  db:
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    image: postgres:15
    container_name: superset_db
    restart: unless-stopped
@ -65,7 +75,11 @@ services:
      - ./docker/docker-entrypoint-initdb.d:/docker-entrypoint-initdb.d

  superset:
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    build:
      <<: *common-build
    container_name: superset_app
@ -79,12 +93,11 @@ services:
    depends_on: *superset-depends-on
    volumes: *superset-volumes
    environment:
      CYPRESS_CONFIG: "${CYPRESS_CONFIG:-}"

  superset-websocket:
    container_name: superset_websocket
    build: ./superset-websocket
    image: superset-websocket
    ports:
      - 8080:8080
    extra_hosts:
@ -105,6 +118,10 @@ services:
      - ./superset-websocket:/home/superset-websocket
      - /home/superset-websocket/node_modules
      - /home/superset-websocket/dist

      # Mounting a config file that contains a dummy secret required to boot up.
      # do not use this docker-compose in production
      - ./docker/superset-websocket/config.json:/home/superset-websocket/config.json
    environment:
      - PORT=8080
      - REDIS_HOST=redis
@ -116,26 +133,32 @@ services:
      <<: *common-build
    container_name: superset_init
    command: ["/app/docker/docker-init.sh"]
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    depends_on: *superset-depends-on
    user: *superset-user
    volumes: *superset-volumes
    environment:
      CYPRESS_CONFIG: "${CYPRESS_CONFIG:-}"
    healthcheck:
      disable: true

  superset-node:
    image: node:18
    environment:
      # set this to false if you have perf issues running the npm i; npm run dev in-docker
      # if you do so, you have to run this manually on the host, which should perform better!
      BUILD_SUPERSET_FRONTEND_IN_DOCKER: ${BUILD_SUPERSET_FRONTEND_IN_DOCKER:-true}
      PUPPETEER_SKIP_CHROMIUM_DOWNLOAD: ${BUILD_SUPERSET_FRONTEND_IN_DOCKER:-false}
      SCARF_ANALYTICS: "${SCARF_ANALYTICS:-}"
    container_name: superset_node
    command: ["/app/docker/docker-frontend.sh"]
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    depends_on: *superset-depends-on
    volumes: *superset-volumes

@ -144,7 +167,11 @@ services:
      <<: *common-build
    container_name: superset_worker
    command: ["/app/docker/docker-bootstrap.sh", "worker"]
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    restart: unless-stopped
    depends_on: *superset-depends-on
    user: *superset-user
@ -162,7 +189,11 @@ services:
      <<: *common-build
    container_name: superset_worker_beat
    command: ["/app/docker/docker-bootstrap.sh", "beat"]
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    restart: unless-stopped
    depends_on: *superset-depends-on
    user: *superset-user
@ -175,7 +206,11 @@ services:
      <<: *common-build
    container_name: superset_tests_worker
    command: ["/app/docker/docker-bootstrap.sh", "worker"]
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    environment:
      DATABASE_HOST: localhost
      DATABASE_DB: test

docker/.env

@ -14,17 +14,21 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#

COMPOSE_PROJECT_NAME=superset

# database configurations (do not modify)
DATABASE_DB=superset
DATABASE_HOST=db
# Make sure you set this to a unique secure random value on production
DATABASE_PASSWORD=superset
DATABASE_USER=superset

EXAMPLES_DB=examples
EXAMPLES_HOST=db
EXAMPLES_USER=examples
# Make sure you set this to a unique secure random value on production
EXAMPLES_PASSWORD=examples
EXAMPLES_PORT=5432

@ -34,6 +38,7 @@ DATABASE_PORT=5432
DATABASE_DIALECT=postgresql
POSTGRES_DB=superset
POSTGRES_USER=superset
# Make sure you set this to a unique secure random value on production
POSTGRES_PASSWORD=superset
#MYSQL_DATABASE=superset
#MYSQL_USER=superset
@ -52,4 +57,9 @@ CYPRESS_CONFIG=false
SUPERSET_PORT=8088
MAPBOX_API_KEY=''

# Make sure you set this to a unique secure random value on production
SUPERSET_SECRET_KEY=TEST_NON_DEV_SECRET
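The "unique secure random value" the comments above ask for can be produced with a one-liner; `openssl` is just one common option, not a Superset-specific requirement:

```shell
# Generate a random base64 string suitable for SUPERSET_SECRET_KEY
openssl rand -base64 42
```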

ENABLE_PLAYWRIGHT=false
PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
BUILD_SUPERSET_FRONTEND_IN_DOCKER=true

@ -35,14 +35,6 @@ else
  echo "Skipping local overrides"
fi

#
# playwright is an optional package - run only if it is installed
#
if command -v playwright > /dev/null 2>&1; then
  playwright install-deps
  playwright install chromium
fi

case "${1}" in
  worker)
    echo "Starting Celery worker..."

@ -18,8 +18,8 @@
set -e

# Packages needed for puppeteer:
if [ "$PUPPETEER_SKIP_CHROMIUM_DOWNLOAD" = "false" ]; then
  apt update
  apt install -y chromium
fi

@ -0,0 +1,22 @@
{
  "port": 8080,
  "logLevel": "info",
  "logToFile": false,
  "logFilename": "app.log",
  "statsd": {
    "host": "127.0.0.1",
    "port": 8125,
    "globalTags": []
  },
  "redis": {
    "port": 6379,
    "host": "127.0.0.1",
    "password": "",
    "db": 0,
    "ssl": false
  },
  "redisStreamPrefix": "async-events-",
  "jwtAlgorithms": ["HS256"],
  "jwtSecret": "CHANGE-ME-IN-PRODUCTION-GOTTA-BE-LONG-AND-SECRET",
  "jwtCookieName": "async-token"
}

@ -20,3 +20,6 @@ yarn-debug.log*
yarn-error.log*

docs/.zshrc

# Gets copied from the root of the project at build time (yarn start / yarn build)
docs/intro.md

@ -0,0 +1,102 @@
{
  "countries": [
    "Afghanistan",
    "Albania",
    "Algeria",
    "Argentina",
    "Australia",
    "Austria",
    "Belgium",
    "Bolivia",
    "Brazil",
    "Bulgaria",
    "Burundi",
    "Canada",
    "Chile",
    "China",
    "Colombia",
    "Costa Rica",
    "Cuba",
    "Cyprus",
    "Czech Republic",
    "Denmark",
    "Dominican Republic",
    "Ecuador",
    "Egypt",
    "El Salvador",
    "Estonia",
    "Ethiopia",
    "France",
    "France (regions)",
    "Finland",
    "Germany",
    "Guatemala",
    "Haiti",
    "Honduras",
    "Iceland",
    "India",
    "Indonesia",
    "Iran",
    "Italy",
    "Italy (regions)",
    "Japan",
    "Jordan",
    "Kazakhstan",
    "Kenya",
    "Korea",
    "Kuwait",
    "Kyrgyzstan",
    "Latvia",
    "Liechtenstein",
    "Lithuania",
    "Malaysia",
    "Mexico",
    "Morocco",
    "Myanmar",
    "Netherlands",
    "Nicaragua",
    "Nigeria",
    "Norway",
    "Oman",
    "Pakistan",
    "Panama",
    "Papua New Guinea",
    "Paraguay",
    "Peru",
    "Philippines",
    "Philippines (regions)",
    "Portugal",
    "Poland",
    "Puerto Rico",
    "Qatar",
    "Russia",
    "Rwanda",
    "Saint Barthelemy",
    "Saint Martin",
    "Saudi Arabia",
    "Singapore",
    "Slovenia",
    "Spain",
    "Sri Lanka",
    "Sweden",
    "Switzerland",
    "Syria",
    "Tajikistan",
    "Tanzania",
    "Thailand",
    "Timorleste",
    "Turkey",
    "Turkey (regions)",
    "Turkmenistan",
    "Uganda",
    "UK",
    "Ukraine",
    "United Arab Emirates",
    "Uruguay",
    "USA",
    "Uzbekistan",
    "Venezuela",
    "Vietnam",
    "Zambia"
  ]
}

@ -1,11 +1,11 @@
---
title: Alerts and Reports
hide_title: true
sidebar_position: 2
version: 2
---

# Alerts and Reports

Users can configure automated alerts and reports to send dashboards or charts to an email recipient or Slack channel.

@ -14,28 +14,28 @@ Users can configure automated alerts and reports to send dashboards or charts to

Alerts and reports are disabled by default. To turn them on, you need to do some setup, described here.

## Requirements

### Commons

#### In your `superset_config.py` or `superset_config_docker.py`

- The `ALERT_REPORTS` [feature flag](/docs/configuration/configuring-superset#feature-flags) must be set to `True`.
- `beat_schedule` in your `CeleryConfig` must contain a schedule for `reports.scheduler`.
- At least one of the following must be configured, depending on what you want to use:
  - emails: `SMTP_*` settings
  - Slack messages: `SLACK_API_TOKEN`
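Taken together, the bullet points above amount to a `superset_config.py` along these lines (a hedged sketch: the Redis URL and schedule values are illustrative assumptions, not requirements):

```python
# Sketch of the alert/report-related parts of superset_config.py.
from celery.schedules import crontab

FEATURE_FLAGS = {"ALERT_REPORTS": True}

class CeleryConfig:
    broker_url = "redis://localhost:6379/0"       # assumption: your broker
    result_backend = "redis://localhost:6379/0"
    imports = ("superset.sql_lab", "superset.tasks.scheduler")
    beat_schedule = {
        "reports.scheduler": {
            "task": "reports.scheduler",
            "schedule": crontab(minute="*", hour="*"),  # poll every minute
        },
    }

CELERY_CONFIG = CeleryConfig

# Flip to False once you actually want messages delivered (see below).
ALERT_REPORTS_NOTIFICATION_DRY_RUN = True

SMTP_HOST = "smtp.example.com"  # assumption: your own mail server
SLACK_API_TOKEN = "xoxb-..."    # or configure Slack instead of/alongside SMTP
```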

##### Disable dry-run mode

Screenshots will be taken but no messages actually sent as long as `ALERT_REPORTS_NOTIFICATION_DRY_RUN = True`, its default value in `docker/pythonpath_dev/superset_config.py`. To disable dry-run mode and start receiving email/Slack notifications, set `ALERT_REPORTS_NOTIFICATION_DRY_RUN` to `False` in [superset config](https://github.com/apache/superset/blob/master/docker/pythonpath_dev/superset_config.py).

#### In your `Dockerfile`

- You must install a headless browser, for taking screenshots of the charts and dashboards. Only Firefox and Chrome are currently supported.
  > If you choose Chrome, you must also change the value of `WEBDRIVER_TYPE` to `"chrome"` in your `superset_config.py`.

Note: All the components required (Firefox headless browser, Redis, Postgres db, celery worker and celery beat) are present in the *dev* docker image if you are following [Installing Superset Locally](/docs/installation/docker-compose/).
All you need to do is add the required config variables described in this guide (see `Detailed Config`).

If you are running a non-dev docker image, e.g., a stable release like `apache/superset:3.1.0`, that image does not include a headless browser. Only the `superset_worker` container needs this headless browser to browse to the target chart or dashboard.
@ -43,11 +43,11 @@ You can either install and configure the headless browser - see "Custom Dockerfi

*Note*: In this context, a "dev image" is the same application software as its corresponding non-dev image, just bundled with additional tools. So an image like `3.1.0-dev` is identical to `3.1.0` when it comes to stability, functionality, and running in production. The actual "in-development" versions of Superset - cutting-edge and unstable - are not tagged with version numbers on Docker Hub and will display version `0.0.0-dev` within the Superset UI.

### Slack integration

To send alerts and reports to Slack channels, you need to create a new Slack Application on your workspace.

1. Connect to your Slack workspace, then head to [https://api.slack.com/apps](https://api.slack.com/apps).
2. Create a new app.
3. Go to the "OAuth & Permissions" section, and give the following scopes to your app:
   - `incoming-webhook`
@ -61,14 +61,14 @@ To send alerts and reports to Slack channels, you need to create a new Slack App

Note: when you configure an alert or a report, the Slack channel list takes channel names without the leading '#', e.g. use `alerts` instead of `#alerts`.

### Kubernetes-specific

- You must have a `celery beat` pod running. If you're using the chart included in the GitHub repository under [helm/superset](https://github.com/apache/superset/tree/master/helm/superset), you need to put `supersetCeleryBeat.enabled = true` in your values override.
- You can see the dedicated docs about [Kubernetes installation](/docs/installation/kubernetes) for more details.

### Docker Compose specific

#### You must have in your `docker-compose.yml`

- A Redis message broker
- A PostgreSQL DB instead of SQLite
@ -195,14 +195,14 @@ Please refer to `ExecutorType` in the codebase for other executor types.
its default value of `http://0.0.0.0:8080/`.


## Custom Dockerfile

If you're running the dev version of a released Superset image, like `apache/superset:3.1.0-dev`, you should be set with the above.

But if you're building your own image, or starting with a non-dev version, a webdriver (and headless browser) is needed to capture screenshots of the charts and dashboards, which are then sent to the recipient.
Here's how you can modify your Dockerfile to take the screenshots either with Firefox or Chrome.

### Using Firefox

```docker
FROM apache/superset:3.1.0

@ -223,7 +223,7 @@ RUN pip install --no-cache gevent psycopg2 redis
USER superset
```

### Using Chrome

```docker
FROM apache/superset:3.1.0
@ -248,21 +248,21 @@ USER superset
```

Don't forget to set `WEBDRIVER_TYPE` and `WEBDRIVER_OPTION_ARGS` in your config if you use Chrome.
|
||||
|
||||
### Troubleshooting
|
||||
## Troubleshooting
|
||||
|
||||
There are many reasons that reports might not be working. Try these steps to check for specific issues.
|
||||
|
||||
#### Confirm feature flag is enabled and you have sufficient permissions
|
||||
### Confirm feature flag is enabled and you have sufficient permissions
|
||||
|
||||
If you don't see "Alerts & Reports" under the *Manage* section of the Settings dropdown in the Superset UI, you need to enable the `ALERT_REPORTS` feature flag (see above). Enable another feature flag and check to see that it took effect, to verify that your config file is getting loaded.
|
||||
|
||||
Log in as an admin user to ensure you have adequate permissions.
|
||||
|
||||
#### Check the logs of your Celery worker
|
||||
### Check the logs of your Celery worker
|
||||
|
||||
This is the best source of information about the problem. In a docker compose deployment, you can do this with a command like `docker logs superset_worker --since 1h`.
|
||||
|
||||
#### Check web browser and webdriver installation
|
||||
### Check web browser and webdriver installation
|
||||
|
||||
To take a screenshot, the worker visits the dashboard or chart using a headless browser, then takes a screenshot. If you are able to send a chart as CSV or text but can't send as PNG, your problem may lie with the browser.
|
||||
|
||||
|
@ -270,7 +270,7 @@ Superset docker images that have a tag ending with `-dev` have the Firefox headl
|
|||
|
||||
If you are handling the installation of that software on your own, or wish to use Chromium instead, do your own verification to ensure that the headless browser opens successfully in the worker environment.
|
||||
|
||||
#### Send a test email
|
||||
### Send a test email
|
||||
|
||||
One symptom of an invalid connection to an email server is receiving an error of `[Errno 110] Connection timed out` in your logs when the report tries to send.
|
||||
|
||||
|
@ -301,7 +301,7 @@ Possible fixes:
|
|||
- Some cloud hosts disable outgoing unauthenticated SMTP email to prevent spam. For instance, [Azure blocks port 25 by default on some machines](https://learn.microsoft.com/en-us/azure/virtual-network/troubleshoot-outbound-smtp-connectivity). Enable that port or use another sending method.
|
||||
- Use another set of SMTP credentials that you verify works in this setup.
|
||||
|
||||
#### Browse to your report from the worker
|
||||
### Browse to your report from the worker
|
||||
|
||||
The worker may be unable to reach the report. It will use the value of `WEBDRIVER_BASEURL` to browse to the report. If that route is invalid, or presents an authentication challenge that the worker can't pass, the report screenshot will fail.
|
||||
|
||||
|
@ -309,7 +309,7 @@ Check this by attempting to `curl` the URL of a report that you see in the error
|
|||
|
||||
In a deployment with authentication measures enabled like HTTPS and Single Sign-On, it may make sense to have the worker navigate directly to the Superset application running in the same location, avoiding the need to sign in. For instance, you could use `WEBDRIVER_BASEURL="http://superset_app:8088"` for a docker compose deployment, and set `"force_https": False,` in your `TALISMAN_CONFIG`.
|
||||
|
||||
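The two settings mentioned above can be sketched in `superset_config.py` like this; the hostname and port follow the docker compose defaults and are assumptions to adjust for your deployment:

```python
# superset_config.py -- hypothetical values for a docker compose deployment.
# Let the worker reach Superset directly on the internal network,
# bypassing the HTTPS ingress:
WEBDRIVER_BASEURL = "http://superset_app:8088"

# Relax Talisman's HTTPS redirect so the worker's plain-http requests
# are not bounced to https (other Talisman defaults are omitted here):
TALISMAN_CONFIG = {
    "force_https": False,
}
```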
## Scheduling Queries as Reports

You can optionally allow your users to schedule queries directly in SQL Lab. This is done by adding
extra metadata to saved queries, which are then picked up by an external scheduler (like
@@ -1,13 +1,13 @@

---
title: Async Queries via Celery
hide_title: true
sidebar_position: 4
version: 1
---

# Async Queries via Celery

## Celery

On large analytic databases, it’s common to run queries that execute for minutes or hours. To enable
support for long running queries that execute beyond the typical web request’s timeout (30-60
@@ -89,7 +89,7 @@ issues arise. Please clear your existing results cache store when upgrading an e

- SQL Lab will _only run your queries asynchronously if_ you enable **Asynchronous Query Execution**
  in your database settings (Sources > Databases > Edit record).

## Celery Flower

Flower is a web-based tool for monitoring the Celery cluster, which you can install from pip:
@@ -1,11 +1,11 @@

---
title: Caching
hide_title: true
sidebar_position: 3
version: 1
---

# Caching

Superset uses [Flask-Caching](https://flask-caching.readthedocs.io/) for caching purposes.
Flask-Caching supports various caching backends, including Redis (recommended), Memcached,
@@ -33,7 +33,7 @@ FILTER_STATE_CACHE_CONFIG = {
}
```

## Dependencies

In order to use dedicated cache stores, additional python libraries must be installed

@@ -43,7 +43,7 @@ In order to use dedicated cache stores, additional python libraries must be inst

These libraries can be installed using pip.
## Fallback Metastore Cache

Note that some form of Filter State and Explore caching is required. If either of these caches
is undefined, Superset falls back to using a built-in cache that stores data in the metadata
@@ -60,7 +60,7 @@ DATA_CACHE_CONFIG = {
}
```

## Chart Cache Timeout

The cache timeout for charts may be overridden by the settings for an individual chart, dataset, or
database. Each of these configurations will be checked in order before falling back to the default

@@ -69,7 +69,7 @@ value defined in `DATA_CACHE_CONFIG`.

Note that by setting the cache timeout to `-1`, caching for charting data can be disabled, either
per chart, dataset or database, or by default if set in `DATA_CACHE_CONFIG`.
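For instance, a minimal sketch disabling chart-data caching by default; the keys follow Flask-Caching conventions, the backend name and Redis URL are illustrative, and `-1` is the disable value mentioned above:

```python
# superset_config.py -- hedged sketch; combine with your real cache backend
DATA_CACHE_CONFIG = {
    "CACHE_TYPE": "RedisCache",           # Flask-Caching backend name
    "CACHE_DEFAULT_TIMEOUT": -1,          # -1 disables caching for chart data
    "CACHE_KEY_PREFIX": "superset_data_",
    "CACHE_REDIS_URL": "redis://localhost:6379/1",
}
```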
## SQL Lab Query Results

Caching for SQL Lab query results is used when async queries are enabled and is configured using
`RESULTS_BACKEND`.

@@ -77,11 +77,11 @@ Caching for SQL Lab query results is used when async queries are enabled and is

Note that this configuration does not use a flask-caching dictionary for its configuration, but
instead requires a cachelib object.

See [Async Queries via Celery](/docs/configuration/async-queries-celery) for details.

## Caching Thumbnails

This is an optional feature that can be turned on by activating its [feature flag](/docs/configuration/configuring-superset#feature-flags) in config:

```
FEATURE_FLAGS = {
@@ -1,32 +1,44 @@

---
title: Configuring Superset
hide_title: true
sidebar_position: 1
version: 1
---

# Configuring Superset

## superset_config.py

Superset exposes hundreds of configurable parameters through its
[config.py module](https://github.com/apache/superset/blob/master/superset/config.py). The
variables and objects exposed act as a public interface of the bulk of what you may want
to configure, alter, and interface with. In this Python module, you'll find all these
parameters, sensible defaults, as well as rich documentation in the form of comments.
To configure your application, you need to create your own configuration module, which
will allow you to override a few or many of these parameters. Instead of altering the core module,
you'll want to define your own module (typically a file named `superset_config.py`).
Add this file to your `PYTHONPATH` or create an environment variable
`SUPERSET_CONFIG_PATH` specifying the full path of the `superset_config.py`.

For example, if deploying Superset directly on a Linux-based system where your
`superset_config.py` is under the `/app` directory, you can run:

```bash
export SUPERSET_CONFIG_PATH=/app/superset_config.py
```

If you are using your own custom Dockerfile with the official Superset image as the base image,
then you can add your overrides as shown below:

```bash
COPY --chown=superset superset_config.py /app/
ENV SUPERSET_CONFIG_PATH /app/superset_config.py
```

Docker compose deployments handle application configuration differently, using specific conventions.
Refer to the [docker-compose tips & configuration](/docs/installation/docker-compose#docker-compose-tips--configuration)
for details.
The following is an example of just a few of the parameters you can set in your `superset_config.py` file:
@@ -63,33 +75,39 @@ WTF_CSRF_TIME_LIMIT = 60 * 60 * 24 * 365
MAPBOX_API_KEY = ''
```

:::tip
Note that it is typical to copy and paste [only] the portions of the
core [superset/config.py](https://github.com/apache/superset/blob/master/superset/config.py) that
you want to alter, along with the related comments, into your own `superset_config.py` file.
:::

All the parameters and default values defined
in [superset/config.py](https://github.com/apache/superset/blob/master/superset/config.py)
can be altered in your local `superset_config.py`. Administrators will want to read through the file
to understand what can be configured locally as well as the default values in place.

Since `superset_config.py` acts as a Flask configuration module, it can be used to alter the
settings of Flask itself, as well as Flask extensions that Superset bundles, like
`flask-wtf`, `flask-caching`, `flask-migrate`,
and `flask-appbuilder`. Each one of these extensions offers intricate configurability.
Flask App Builder, the web framework used by Superset, also offers many
configuration settings. Please consult the
[Flask App Builder Documentation](https://flask-appbuilder.readthedocs.org/en/latest/config.html)
for more information on how to configure it.
You'll want to change:

- `SECRET_KEY`: to a long random string
- `SQLALCHEMY_DATABASE_URI`: that by default points to a SQLite database located at
  `~/.superset/superset.db`
## Specifying a SECRET_KEY

### Adding an initial SECRET_KEY

Superset requires a user-specified SECRET_KEY to start up. This requirement was [added in version 2.1.0 to force secure configurations](https://preset.io/blog/superset-security-update-default-secret_key-vulnerability/). Add a strong SECRET_KEY to your `superset_config.py` file like:
@@ -99,7 +117,12 @@ SECRET_KEY = 'YOUR_OWN_RANDOM_GENERATED_SECRET_KEY'

You can generate a strong secure key with `openssl rand -base64 42`.

:::caution Use a strong secret key
This key will be used for securely signing session cookies and encrypting sensitive information stored in Superset's application metadata database.
Your deployment must use a complex, unique key.
:::

### Rotating to a newer SECRET_KEY

If you wish to change your existing SECRET_KEY, add the existing SECRET_KEY to your `superset_config.py` file as
`PREVIOUS_SECRET_KEY =` and provide your new key as `SECRET_KEY =`. You can find your current SECRET_KEY with these
@@ -112,9 +135,13 @@ from flask import current_app; print(current_app.config["SECRET_KEY"])

Save your `superset_config.py` with these values and then run `superset re-encrypt-secrets`.
## Setting up a production metadata database

Superset needs a database to store the information it manages, like the definitions of
charts, dashboards, and many other things.

By default, Superset is configured to use [SQLite](https://www.sqlite.org/),
a self-contained, single-file database that offers a simple and fast way to get started
(without requiring any installation). However, for production environments,
using SQLite is highly discouraged due to security, scalability, and data integrity reasons.
It's important to use only the supported database engines and consider using a different
@@ -134,10 +161,17 @@ Use the following database drivers and connection strings:
| [PostgreSQL](https://www.postgresql.org/) | `pip install psycopg2` | `postgresql://<UserName>:<DBPassword>@<Database Host>/<Database Name>` |
| [MySQL](https://www.mysql.com/) | `pip install mysqlclient` | `mysql://<UserName>:<DBPassword>@<Database Host>/<Database Name>` |

:::tip
Properly setting up the metadata store is beyond the scope of this documentation. We recommend
using a hosted managed service such as [Amazon RDS](https://aws.amazon.com/rds/) or
[Google Cloud Databases](https://cloud.google.com/products/databases?hl=en) to handle
the service, supporting infrastructure, and backup strategy.
:::

To configure the Superset metastore, set the `SQLALCHEMY_DATABASE_URI` config key in `superset_config.py`
to the appropriate connection string.
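For example, pointing the metastore at PostgreSQL might look like this; the host, credentials, and database name below are placeholders:

```python
# superset_config.py -- placeholder credentials, substitute your own
SQLALCHEMY_DATABASE_URI = (
    "postgresql://superset:superset_password@db.example.com:5432/superset_metadata"
)
```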
## Running on a WSGI HTTP Server

While you can run Superset on NGINX or Apache, we recommend using Gunicorn in async mode. This
enables impressive concurrency and is fairly easy to install and configure. Please refer to the
@@ -166,12 +200,12 @@ If you're not using Gunicorn, you may want to disable the use of `flask-compress

Currently, the Google BigQuery Python SDK is not compatible with `gevent`, due to some dynamic monkeypatching of the Python core library by `gevent`.
So, when you use a `BigQuery` datasource in Superset, you have to use a `gunicorn` worker type other than `gevent`.

## HTTPS Configuration

You can configure HTTPS upstream via a load balancer or a reverse proxy (such as nginx) and do SSL/TLS Offloading before traffic reaches the Superset application. In this setup, local traffic from a Celery worker taking a snapshot of a chart for Alerts & Reports can access Superset at a `http://` URL, from behind the ingress point.
You can also configure [SSL in Gunicorn](https://docs.gunicorn.org/en/stable/settings.html#ssl) (the Python webserver) if you are using an official Superset Docker image.

## Configuration Behind a Load Balancer

If you are running Superset behind a load balancer or reverse proxy (e.g. NGINX or ELB on AWS), you
may need to utilize a healthcheck endpoint so that your load balancer knows if your superset
@@ -189,7 +223,7 @@ In case the reverse proxy is used for providing SSL encryption, an explicit defi
RequestHeader set X-Forwarded-Proto "https"
```

## Custom OAuth2 Configuration

Superset is built on Flask-AppBuilder (FAB), which supports many providers out of the box
(GitHub, Twitter, LinkedIn, Google, Azure, etc). Beyond those, Superset can be configured to connect
@@ -288,43 +322,45 @@ CUSTOM_SECURITY_MANAGER = CustomSsoSecurityManager
]
```

## LDAP Authentication

FAB supports authenticating user credentials against an LDAP server.
To use LDAP you must install the [python-ldap](https://www.python-ldap.org/en/latest/installing.html) package.
See [FAB's LDAP documentation](https://flask-appbuilder.readthedocs.io/en/latest/security.html#authentication-ldap)
for details.
## Mapping LDAP or OAUTH groups to Superset roles

`AUTH_ROLES_MAPPING` in Flask-AppBuilder is a dictionary that maps from LDAP/OAUTH group names to FAB roles.
It is used to assign roles to users who authenticate using LDAP or OAuth.

### Mapping OAUTH groups to Superset roles

The following `AUTH_ROLES_MAPPING` dictionary would map the OAUTH group "superset_users" to the Superset roles "Gamma" as well as "Alpha", and the OAUTH group "superset_admins" to the Superset role "Admin".

```python
AUTH_ROLES_MAPPING = {
    "superset_users": ["Gamma","Alpha"],
    "superset_admins": ["Admin"],
}
```

### Mapping LDAP groups to Superset roles

The following `AUTH_ROLES_MAPPING` dictionary would map the LDAP DN "cn=superset_users,ou=groups,dc=example,dc=com" to the Superset roles "Gamma" as well as "Alpha", and the LDAP DN "cn=superset_admins,ou=groups,dc=example,dc=com" to the Superset role "Admin".

```python
AUTH_ROLES_MAPPING = {
    "cn=superset_users,ou=groups,dc=example,dc=com": ["Gamma","Alpha"],
    "cn=superset_admins,ou=groups,dc=example,dc=com": ["Admin"],
}
```

Note: This requires `AUTH_LDAP_SEARCH` to be set. For more details, please see the [FAB Security documentation](https://flask-appbuilder.readthedocs.io/en/latest/security.html).

### Syncing roles at login

You can also use the `AUTH_ROLES_SYNC_AT_LOGIN` configuration variable to control how often Flask-AppBuilder syncs the user's roles with the LDAP/OAUTH groups. If `AUTH_ROLES_SYNC_AT_LOGIN` is set to True, Flask-AppBuilder will sync the user's roles each time they log in. If `AUTH_ROLES_SYNC_AT_LOGIN` is set to False, Flask-AppBuilder will only sync the user's roles when they first register.
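Putting the mapping and the sync setting together, a sketch of a config that re-syncs roles from the provider's groups on every login (group and role names are examples only):

```python
# superset_config.py -- example values only
AUTH_ROLES_MAPPING = {
    "superset_users": ["Gamma", "Alpha"],
    "superset_admins": ["Admin"],
}
# Re-sync the user's roles from the provider's groups on every login:
AUTH_ROLES_SYNC_AT_LOGIN = True
```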
## Flask app Configuration Hook

`FLASK_APP_MUTATOR` is a configuration function that can be provided in your environment, receives
the app object and can alter it in any way. For example, add `FLASK_APP_MUTATOR` into your
@@ -347,7 +383,7 @@ def FLASK_APP_MUTATOR(app: Flask) -> None:
    app.before_request_funcs.setdefault(None, []).append(make_session_permanent)
```

## Feature Flags

To support a diverse set of users, Superset has some features that are not enabled by default. For
example, some users have stronger security restrictions, while others may not. So Superset
@@ -1,11 +1,12 @@

---
title: Country Map Tools
hide_title: true
sidebar_position: 10
version: 1
---

import countriesData from '../../data/countries.json';

# The Country Map Visualization

The Country Map visualization allows you to plot lightweight choropleth maps of
your countries by provinces, states, or other subdivision types. It does not rely
@@ -21,102 +22,11 @@ The current list of countries can be found in the src

The Country Maps visualization already ships with the maps for the following countries:

<ul style={{columns: 3}}>
  {countriesData.countries.map((country, index) => (
    <li key={index}>{country}</li>
  ))}
</ul>
## Adding a New Country
@@ -1,18 +1,18 @@

---
title: Event Logging
hide_title: true
sidebar_position: 9
version: 1
---

# Logging

## Event Logging

Superset by default logs special action events in its internal database (DBEventLogger). These logs can be accessed
on the UI by navigating to **Security > Action Log**. You can freely customize these logs by
implementing your own event log class.
**When a custom log class is enabled, DBEventLogger is disabled and logs
stop being populated in the UI logs view.**
To achieve both, your custom log class should extend the built-in DBEventLogger class.

Here's an example of a simple JSON-to-stdout class:
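The original example is elided in this diff; the following is a hedged, self-contained sketch of the same idea. In Superset you would subclass `superset.utils.log.AbstractEventLogger`; a plain class is used here so the snippet runs on its own, and the `log` signature is a simplification:

```python
import json
import sys
from datetime import datetime, timezone

class JSONStdOutEventLogger:
    """Sketch of an event logger that writes one JSON object per line."""

    def log(self, user_id, action, *args, **kwargs):
        record = {
            "user_id": user_id,
            "action": action,
            "ts": datetime.now(timezone.utc).isoformat(),
            **kwargs,
        }
        line = json.dumps(record, default=str)
        print(line, file=sys.stdout)  # one JSON line per event, for log collectors
        return line
```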
@@ -44,9 +44,10 @@ End by updating your config to pass in an instance of the logger you want to use
EVENT_LOGGER = JSONStdOutEventLogger()
```

## StatsD Logging

Superset can be configured to log events to [StatsD](https://github.com/statsd/statsd)
if desired. Most endpoints hit are logged as
well as key events like query start and end in SQL Lab.

To set up StatsD logging, it’s a matter of configuring the logger in your `superset_config.py`.
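A minimal sketch, assuming the `StatsdStatsLogger` helper that ships in `superset.stats_logger`; the host, port, and prefix are illustrative:

```python
# superset_config.py -- assumes a StatsD agent listening on localhost:8125
from superset.stats_logger import StatsdStatsLogger

STATS_LOGGER = StatsdStatsLogger(host="localhost", port=8125, prefix="superset")
```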
@@ -1,11 +1,11 @@

---
title: Importing and Exporting Datasources
hide_title: true
sidebar_position: 11
version: 1
---

# Importing and Exporting Datasources

The Superset CLI allows you to import and export datasources from and to YAML. Datasources include
databases. The data is expected to be organized in the following hierarchy:
@@ -26,7 +26,7 @@ databases. The data is expected to be organized in the following hierarchy:
| └── ... (more databases)
```

## Exporting Datasources to YAML

You can print your current datasources to stdout by running:

@@ -61,7 +61,7 @@ superset export_datasource_schema

As a reminder, you can use the `-b` flag to include back references.

## Importing Datasources

In order to import datasources from a ZIP file, run:
@@ -75,9 +75,9 @@ The optional username flag **-u** sets the user used for the datasource import.
superset import_datasources -p <path / filename> -u 'admin'
```

## Legacy Importing Datasources

### From older versions of Superset to current version

When using Superset version 4.x.x to import from an older version (2.x.x or 3.x.x), importing is supported via the command `legacy_import_datasources`, which expects a JSON file or directory of JSON files. The options are `-r` for recursive and `-u` for specifying a user. Example of legacy import without options:

@@ -85,7 +85,7 @@ When using Superset version 4.x.x to import from an older version (2.x.x or 3.x.
superset legacy_import_datasources -p <path or filename>
```

### From older versions of Superset to older versions

When using an older version of Superset (2.x.x or 3.x.x), the command is `import_datasources`. ZIP and YAML files are supported, and to switch between them the feature flag `VERSIONED_EXPORT` is used. When `VERSIONED_EXPORT` is `True`, `import_datasources` expects a ZIP file, otherwise YAML. Example:
|
@ -1,13 +1,12 @@
|
|||
---
|
||||
title: Additional Networking Settings
|
||||
hide_title: true
|
||||
sidebar_position: 5
|
||||
title: Network and Security Settings
|
||||
sidebar_position: 7
|
||||
version: 1
|
||||
---
|
||||
|
||||
## Additional Networking Settings
|
||||
# Network and Security Settings
|
||||
|
||||
### CORS
|
||||
## CORS
|
||||
|
||||
To configure CORS, or cross-origin resource sharing, the following dependency must be installed:
|
||||
|
||||
|
@ -21,7 +20,37 @@ The following keys in `superset_config.py` can be specified to configure CORS:
|
|||
- `CORS_OPTIONS`: options passed to Flask-CORS
|
||||
([documentation](https://flask-cors.corydolphin.com/en/latest/api.html#extension))
|
||||
|
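A sketch of the corresponding `superset_config.py` entries; the origin list is a placeholder to replace with your own frontends:

```python
# superset_config.py -- example CORS setup; adjust origins for your deployment
ENABLE_CORS = True
CORS_OPTIONS = {
    "supports_credentials": True,
    "origins": ["https://dashboard.example.com"],
}
```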
## HTTP headers

Note that Superset bundles [flask-talisman](https://pypi.org/project/talisman/),
self-described as a small Flask extension that handles setting HTTP headers that can help
protect against a few common web application security issues.

## CSRF settings

Similarly, [flask-wtf](https://flask-wtf.readthedocs.io/en/0.15.x/config/) is used to manage
some CSRF configurations. If you need to exempt endpoints from CSRF (e.g. if you are
running a custom auth postback endpoint), you can add the endpoints to `WTF_CSRF_EXEMPT_LIST`:
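For instance (the endpoint name below is a placeholder for your own view):

```python
# superset_config.py -- exempt a hypothetical auth postback endpoint from CSRF
WTF_CSRF_EXEMPT_LIST = ["myapp.views.auth_postback"]
```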
## SSH Tunneling

1. Turn on the feature flag
   - Change [`SSH_TUNNELING`](https://github.com/apache/superset/blob/eb8386e3f0647df6d1bbde8b42073850796cc16f/superset/config.py#L489) to `True`
   - If you want to add more security when establishing the tunnel, we allow users to overwrite the `SSHTunnelManager` class [here](https://github.com/apache/superset/blob/eb8386e3f0647df6d1bbde8b42073850796cc16f/superset/config.py#L507)
   - You can also set [`SSH_TUNNEL_LOCAL_BIND_ADDRESS`](https://github.com/apache/superset/blob/eb8386e3f0647df6d1bbde8b42073850796cc16f/superset/config.py#L508); this is the host address where the tunnel will be accessible on your VPC

2. Create a database with SSH tunnel enabled
   - With the feature flag enabled, you should now see an SSH tunnel toggle.
   - Click the toggle to enable SSH tunneling and add your credentials accordingly.
   - Superset allows for two different types of authentication (Basic + Private Key). These credentials should come from your service provider.

3. Verify data is flowing
   - Once SSH tunneling has been enabled, go to SQL Lab and write a query to verify data is properly flowing.
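Step 1 above amounts to a feature-flag entry in `superset_config.py`, assuming `SSH_TUNNELING` remains a default feature flag in your Superset version:

```python
# superset_config.py -- enable the SSH tunneling feature flag
FEATURE_FLAGS = {
    "SSH_TUNNELING": True,
}
```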
## Domain Sharding

Chrome allows up to 6 open connections per domain at a time. When there are more than 6 slices in a
dashboard, many fetch requests are queued up and wait for the next available socket.
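Domain sharding is configured by listing the shard hostnames; a sketch, assuming the `SUPERSET_WEBSERVER_DOMAINS` config key and placeholder domains that all resolve to your Superset instance:

```python
# superset_config.py -- hypothetical shard domains pointing at Superset
SUPERSET_WEBSERVER_DOMAINS = [
    "shard1.mydomain.com",
    "shard2.mydomain.com",
    "shard3.mydomain.com",
]
```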
@@ -42,7 +71,7 @@ or add the following setting in your `superset_config.py` file if domain shards

- `SESSION_COOKIE_DOMAIN = '.mydomain.com'`

## Middleware

Superset allows you to add your own middleware. To add your own middleware, update the
`ADDITIONAL_MIDDLEWARE` key in your `superset_config.py`. `ADDITIONAL_MIDDLEWARE` should be a list
@@ -0,0 +1,6 @@

---
title: Setup SSH Tunneling
hide_title: true
sidebar_position: 8
version: 1
---
@@ -1,16 +1,16 @@

---
title: SQL Templating
hide_title: true
sidebar_position: 5
version: 1
---

# SQL Templating

## Jinja Templates

SQL Lab and Explore support [Jinja templating](https://jinja.palletsprojects.com/en/2.11.x/) in queries.
To enable templating, the `ENABLE_TEMPLATE_PROCESSING` [feature flag](/docs/configuration/configuring-superset#feature-flags) needs to be enabled in
`superset_config.py`. When templating is enabled, Python code can be embedded in virtual datasets and
in Custom SQL in the filter and metric controls in Explore. By default, the following variables are
made available in the Jinja context:
@@ -168,7 +168,7 @@ FEATURE_FLAGS = {

The available validators and names can be found in
[sql_validators](https://github.com/apache/superset/tree/master/superset/sql_validators).

## Available Macros

In this section, we'll walk through the pre-defined Jinja macros in Superset.