diff --git a/docs-v2/docs/Contributing/_category_.json b/docs-v2/docs/Contributing/_category_.json
new file mode 100644
index 0000000000000..ca96e44a2dcd9
--- /dev/null
+++ b/docs-v2/docs/Contributing/_category_.json
@@ -0,0 +1,4 @@
+{
+  "label": "Contributing",
+  "position": 6
+}
diff --git a/docs-v2/docs/Contributing/contributing-page.mdx b/docs-v2/docs/Contributing/contributing-page.mdx
new file mode 100644
index 0000000000000..70323de1836c1
--- /dev/null
+++ b/docs-v2/docs/Contributing/contributing-page.mdx
@@ -0,0 +1,21 @@
+---
+name: General Resources
+menu: Contributing
+route: /docs/contributing/contribution-guidelines
+index: 1
+version: 1
+---
+
+## Contributing to Superset
+
+Superset is an [Apache Software Foundation](https://www.apache.org/theapacheway/index.html) project.
+The core contributors (or committers) to Superset communicate primarily in the following channels (all of
+which you can join):
+
+- [Mailing list](https://lists.apache.org/list.html?dev@superset.apache.org)
+- [Apache Superset Slack community](https://join.slack.com/t/apache-superset/shared_invite/zt-uxbh5g36-AISUtHbzOXcu0BIj7kgUaw)
+- [GitHub issues and PRs](https://github.com/apache/superset/issues)
+
+More references:
+- [Comprehensive Tutorial for Contributing Code to Apache Superset](https://preset.io/blog/tutorial-contributing-code-to-apache-superset/)
+- [CONTRIBUTING Guide on GitHub](https://github.com/apache/superset/blob/master/CONTRIBUTING.md)
diff --git a/docs-v2/docs/Contributing/conventions-and-typing.mdx b/docs-v2/docs/Contributing/conventions-and-typing.mdx
new file mode 100644
index 0000000000000..7096b735ea508
--- /dev/null
+++ b/docs-v2/docs/Contributing/conventions-and-typing.mdx
@@ -0,0 +1,57 @@
+---
+title: Conventions and Typing
+hide_title: true
+sidebar_position: 7
+version: 1
+---
+
+## Conventions
+
+### Python
+
+Parameters in `config.py` (which are accessible via the Flask app.config dictionary) are assumed to always be defined and thus should be accessed directly via,
+
+```python
+blueprints = app.config["BLUEPRINTS"]
+```
+
+rather than,
+
+```python
+blueprints = app.config.get("BLUEPRINTS")
+```
+
+or similar, as the latter will cause typing issues. The former is of type `List[Callable]` whereas the latter is of type `Optional[List[Callable]]`.
+
+## Typing
+
+### Python
+
+To ensure clarity, consistency, and readability, _all_ new functions should use
+[type hints](https://docs.python.org/3/library/typing.html) and include a
+docstring.
+
+Note that per [PEP-484](https://www.python.org/dev/peps/pep-0484/#exceptions) no
+syntax for listing explicitly raised exceptions is proposed, and thus the
+recommendation is to put this information in a docstring, i.e.,
+
+```python
+import math
+from typing import Union
+
+
+def sqrt(x: Union[float, int]) -> Union[float, int]:
+    """
+    Return the square root of x.
+
+    :param x: A number
+    :returns: The square root of the given number
+    :raises ValueError: If the number is negative
+    """
+
+    return math.sqrt(x)
+```
+
+### TypeScript
+
+TypeScript is fully supported and is the recommended language for writing all new frontend components. When modifying existing functions/components, migrating to TypeScript is appreciated, but not required. Examples of migrating functions/components to TypeScript can be found in [#9162](https://github.com/apache/superset/pull/9162) and [#9180](https://github.com/apache/superset/pull/9180).
diff --git a/docs-v2/docs/Contributing/hooks-and-linting.mdx b/docs-v2/docs/Contributing/hooks-and-linting.mdx
new file mode 100644
index 0000000000000..b6d82420184c6
--- /dev/null
+++ b/docs-v2/docs/Contributing/hooks-and-linting.mdx
@@ -0,0 +1,61 @@
+---
+title: Pre-commit Hooks and Linting
+hide_title: true
+sidebar_position: 6
+version: 1
+---
+
+## Git Hooks
+
+Superset uses Git pre-commit hooks courtesy of [pre-commit](https://pre-commit.com/). To install them, run the following:
+
+```bash
+pip3 install -r requirements/integration.txt
+pre-commit install
+```
+
+A series of checks will now run when you make a git commit.
+
+Alternatively, it is possible to run pre-commit via tox:
+
+```bash
+tox -e pre-commit
+```
+
+Or by running pre-commit manually:
+
+```bash
+pre-commit run --all-files
+```
+
+## Linting
+
+### Python
+
+We use [Pylint](https://pylint.org/) for linting, which can be invoked via:
+
+```bash
+# for python
+tox -e pylint
+```
+
+In terms of best practices, please avoid blanket disablement of Pylint messages globally (via `.pylintrc`) or top-level within the file header, although there are a few exceptions. Disablement should occur inline, as it prevents masking issues and provides context as to why said message is disabled.
+
+Additionally, the Python code is auto-formatted using [Black](https://github.com/python/black), which
+is configured as a pre-commit hook. There are also numerous [editor integrations](https://black.readthedocs.io/en/stable/editor_integration.html).
+
+### TypeScript
+
+```bash
+cd superset-frontend
+npm ci
+npm run lint
+```
+
+If using the ESLint extension with VSCode, put the following in your workspace `settings.json` file:
+
+```json
+"eslint.workingDirectories": [
+  "superset-frontend"
+]
+```
diff --git a/docs-v2/docs/Contributing/local-backend.mdx b/docs-v2/docs/Contributing/local-backend.mdx
new file mode 100644
index 0000000000000..8b7bf14ba56fd
--- /dev/null
+++ b/docs-v2/docs/Contributing/local-backend.mdx
@@ -0,0 +1,106 @@
+---
+title: Running a Local Flask Backend
+hide_title: true
+sidebar_position: 5
+version: 1
+---
+
+### Flask server
+
+#### OS Dependencies
+
+Make sure your machine meets the [OS dependencies](https://superset.apache.org/docs/installation/installing-superset-from-scratch#os-dependencies) before following these steps.
+You also need to install MySQL or [MariaDB](https://mariadb.com/downloads).
+
+Ensure that you are using Python version 3.7 or 3.8, then proceed with:
+
+```bash
+# Create a virtual environment and activate it (recommended)
+python3 -m venv venv # setup a python3 virtualenv
+source venv/bin/activate
+
+# Install external dependencies
+pip install -r requirements/testing.txt
+
+# Install Superset in editable (development) mode
+pip install -e .
+
+# Initialize the database
+superset db upgrade
+
+# Create an admin user in your metadata database (use `admin` as username to be able to load the examples)
+superset fab create-admin
+
+# Create default roles and permissions
+superset init
+
+# Load some data to play with.
+# Note: you MUST have previously created an admin user with the username `admin` for this command to work.
+superset load-examples
+
+# Start the Flask dev web server from inside your virtualenv.
+# Note that your page may not have CSS at this point.
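+# (-p sets the port; --with-threads, --reload and --debugger are standard Flask
+# dev-server flags for threading, auto-reload on code changes, and the
+# interactive debugger, respectively.)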
+FLASK_ENV=development superset run -p 8088 --with-threads --reload --debugger
+```
+
+Or you can install via our Makefile:
+
+```bash
+# Create a virtual environment and activate it (recommended)
+$ python3 -m venv venv # setup a python3 virtualenv
+$ source venv/bin/activate
+
+# install pip packages + pre-commit
+$ make install
+
+# Install superset pip packages and setup env only
+$ make superset
+
+# Setup pre-commit only
+$ make pre-commit
+```
+
+**Note: the FLASK_APP env var should not need to be set, as it's currently controlled
+via `.flaskenv`, however if needed, it should be set to `superset.app:create_app()`**
+
+If you have made changes to the FAB-managed templates, which are not built the same way as the newer, React-powered front-end assets, you need to start the app without the `--with-threads` argument, like so:
+`FLASK_ENV=development superset run -p 8088 --reload --debugger`
+
+#### Dependencies
+
+If you add a new requirement or update an existing requirement (per the `install_requires` section in `setup.py`), you must recompile (freeze) the Python dependencies to ensure that the build is deterministic for CI, testing, etc. This can be achieved via,
+
+```bash
+$ python3 -m venv venv
+$ source venv/bin/activate
+$ python3 -m pip install -r requirements/integration.txt
+$ pip-compile-multi --no-upgrade
+```
+
+#### Logging to the browser console
+
+This feature is only available on Python 3. When debugging your application, you can have the server logs sent directly to the browser console using the [ConsoleLog](https://github.com/betodealmeida/consolelog) package. You need to mutate the app by adding the following to your `config.py` or `superset_config.py`:
+
+```python
+from console_log import ConsoleLog
+
+def FLASK_APP_MUTATOR(app):
+    app.wsgi_app = ConsoleLog(app.wsgi_app, app.logger)
+```
+
+Then make sure you run your WSGI server using the right worker type:
+
+```bash
+FLASK_ENV=development gunicorn "superset.app:create_app()" -k "geventwebsocket.gunicorn.workers.GeventWebSocketWorker" -b 127.0.0.1:8088 --reload
+```
+
+You can log anything to the browser console, including objects:
+
+```python
+from superset import app
+app.logger.error('An exception occurred!')
+app.logger.info(form_data)
+```
+
+### Frontend Assets
+
+See [Running Frontend Assets Locally](https://superset.apache.org/docs/installation/installing-superset-from-scratch#os-dependencies)
diff --git a/docs-v2/docs/Contributing/pull-request-guidelines.mdx b/docs-v2/docs/Contributing/pull-request-guidelines.mdx
new file mode 100644
index 0000000000000..f37efd785eb60
--- /dev/null
+++ b/docs-v2/docs/Contributing/pull-request-guidelines.mdx
@@ -0,0 +1,96 @@
+---
+title: Pull Request Guidelines
+hide_title: true
+sidebar_position: 3
+version: 1
+---
+
+## Pull Request Guidelines
+
+A philosophy we would like to strongly encourage is
+
+> Before creating a PR, create an issue.
+
+The purpose is to separate the problem from possible solutions.
+
+**Bug fixes:** If you’re only fixing a small bug, it’s fine to submit a pull request right away, but we highly recommend filing an issue detailing what you’re fixing. This is helpful in case we don’t accept that specific fix but want to keep track of the issue. Please keep in mind that the project maintainers reserve the right to accept or reject incoming PRs, so it is better to separate the issue and the code to fix it from each other. In some cases, project maintainers may request that you create a separate issue from the PR before proceeding.
+
+**Refactor:** For small refactors, it can be a standalone PR itself detailing what you are refactoring and why. If there are concerns, project maintainers may request you to create a `#SIP` for the PR before proceeding.
+
+**Feature/Large changes:** If you intend to change the public API, or make any non-trivial changes to the implementation, we require you to file a new issue as `#SIP` (Superset Improvement Proposal). This lets us reach an agreement on your proposal before you put significant effort into it. You are welcome to submit a PR along with the SIP (sometimes necessary for demonstration), but we will not review/merge the code until the SIP is approved.
+
+In general, small PRs are always easier to review than large PRs. The best practice is to break your work into smaller independent PRs and refer to the same issue. This will greatly reduce turnaround time.
+
+If you wish to share your work which is not ready to merge yet, create a [Draft PR](https://github.blog/2019-02-14-introducing-draft-pull-requests/). This will enable maintainers and the CI runner to prioritize mature PRs.
+
+Finally, never submit a PR that will put the master branch in a broken state. If the PR is part of multiple PRs to complete a large feature and cannot work on its own, you can create a feature branch and merge all related PRs into the feature branch before creating a PR from the feature branch to master.
+
+### Protocol
+
+#### Authoring
+
+- Fill in all sections of the PR template.
+- Title the PR with one of the following semantic prefixes (inspired by [Karma](http://karma-runner.github.io/0.10/dev/git-commit-msg.html)):
+
+  - `feat` (new feature)
+  - `fix` (bug fix)
+  - `docs` (changes to the documentation)
+  - `style` (formatting, missing semicolons, etc; no application logic change)
+  - `refactor` (refactoring code)
+  - `test` (adding missing tests, refactoring tests; no application logic change)
+  - `chore` (updating tasks etc; no application logic change)
+  - `perf` (performance-related change)
+  - `build` (build tooling, Docker configuration change)
+  - `ci` (test runner, GitHub Actions workflow changes)
+  - `other` (changes that don't correspond to the above -- should be rare!)
+  - Examples:
+    - `feat: export charts as ZIP files`
+    - `perf(api): improve API info performance`
+    - `fix(chart-api): cached-indicator always shows value is cached`
+
+- Add the prefix `[WIP]` to the title if not ready for review (WIP = work-in-progress). We recommend creating a PR with `[WIP]` first and removing it once you have passed the CI tests and read through your code changes at least once.
+- If you believe your PR contributes a potentially breaking change, put a `!` after the semantic prefix but before the colon in the PR title, like so: `feat!: Added foo functionality to bar`
+- **Screenshots/GIFs:** Changes to the user interface require before/after screenshots, or a GIF for interactions
+  - Recommended capture tools ([Kap](https://getkap.co/), [LICEcap](https://www.cockos.com/licecap/), [Skitch](https://download.cnet.com/Skitch/3000-13455_4-189876.html))
+  - If no screenshot is provided, the committers will mark the PR with the `need:screenshot` label and will not review it until a screenshot is provided.
+- **Dependencies:** Be careful about adding new dependencies and avoid unnecessary dependencies.
+  - For Python, include it in `setup.py` denoting any specific restrictions and in `requirements.txt` pinned to a specific version, which ensures that the application build is deterministic.
+  - For TypeScript/JavaScript, include new libraries in `package.json`
+- **Tests:** The pull request should include tests, either as doctests, unit tests, or both. Make sure to resolve all errors and test failures. See [Testing](#testing) for how to run tests.
+- **Documentation:** If the pull request adds functionality, the docs should be updated as part of the same PR.
+- **CI:** Reviewers will not review the code until all CI tests have passed. Sometimes there can be flaky tests. You can close and reopen the PR to re-run the CI tests. Please report if the issue persists. After the CI fix has been deployed to `master`, please rebase your PR.
+- **Code coverage:** Please ensure that code coverage does not decrease.
+- Remove `[WIP]` when ready for review. Please note that it may be merged soon after it is approved, so please make sure the PR is ready to merge and do not expect more time for post-approval edits.
+- If the PR was not ready for review and inactive for > 30 days, we will close it due to inactivity. The author is welcome to re-open and update it.
+
+#### Reviewing
+
+- Use a constructive tone when writing reviews.
+- If there are changes required, state clearly what needs to be done before the PR can be approved.
+- If you are asked to update your pull request with some changes, there's no need to create a new one. Push your changes to the same branch.
+- The committers reserve the right to reject any PR and in some cases may request the author to file an issue.
+
+#### Test Environments
+
+- Members of the Apache GitHub org can launch an ephemeral test environment directly on a pull request by creating a comment containing (only) the command `/testenv up`.
+  - Note that org membership must be public in order for this validation to function properly.
+- Feature flags may be set for a test environment by specifying the flag name (prefixed with `FEATURE_`) and value after the command.
+  - Format: `/testenv up FEATURE_<feature flag name>=true|false`
+  - Example: `/testenv up FEATURE_DASHBOARD_NATIVE_FILTERS=true`
+  - Multiple feature flags may be set in a single command, separated by whitespace
+- A comment will be created by the workflow script with the address and login information for the ephemeral environment.
+- Test environments may be created once the Docker build CI workflow for the PR has completed successfully.
+- Test environments do not currently update automatically when new commits are added to a pull request.
+- Test environments do not currently support async workers, though this is planned.
+- Running test environments will be shut down upon closing the pull request.
+
+#### Merging
+
+- At least one approval is required for merging a PR.
+- A PR is usually left open for at least 24 hours before merging.
+- After the PR is merged, [close the corresponding issue(s)](https://help.github.com/articles/closing-issues-using-keywords/).
+
+#### Post-merge Responsibility
+
+- Project maintainers may contact the PR author if new issues are introduced by the PR.
+- Project maintainers may revert your changes if a critical issue is found, such as breaking master branch CI.
diff --git a/docs-v2/docs/Contributing/style-guide.mdx b/docs-v2/docs/Contributing/style-guide.mdx
new file mode 100644
index 0000000000000..3c40eef6725db
--- /dev/null
+++ b/docs-v2/docs/Contributing/style-guide.mdx
@@ -0,0 +1,54 @@
+---
+title: Style Guide
+hide_title: true
+sidebar_position: 4
+version: 1
+---
+
+## Design Guidelines
+
+### Capitalization guidelines
+
+#### Sentence case
+
+Use sentence-case capitalization for everything in the UI (except the exceptions listed below \*\*).
+
+Sentence case is predominantly lowercase. Capitalize only the initial character of the first word, and other words that require capitalization, like:
+
+- **Proper nouns.** Objects in the product _are not_ considered proper nouns, e.g. dashboards, charts, saved queries, etc. Proprietary feature names, e.g. SQL Lab and Preset Manager, _are_ considered proper nouns
+- **Acronyms** (e.g. CSS, HTML)
+- When referring to **UI labels that are themselves capitalized** from sentence case (e.g. page titles - Dashboards page, Charts page, Saved queries page, etc.)
+- User input that is reflected in the UI, e.g. a user-named dashboard tab
+
+**Sentence case vs. Title case:**
+Title case: "A Dog Takes a Walk in Paris"
+Sentence case: "A dog takes a walk in Paris"
+
+**Why sentence case?**
+
+- It’s generally accepted as the quickest to read
+- It’s the easiest form to distinguish between common and proper nouns
+
+#### How to refer to UI elements
+
+When writing about a UI element, use the same capitalization as used in the UI.
+
+For example, if an input field is labeled “Name” then you refer to this as the “Name input field”. Similarly, if a button has the label “Save” in it, then it is correct to refer to the “Save button”.
+
+Where a product page is titled “Settings”, you refer to this in writing as follows:
+“Edit your personal information on the Settings page”.
+
+Often a product page will have the same title as the objects it contains. In this case, refer to the page as it appears in the UI, and the objects as common nouns:
+
+- Upload a dashboard on the Dashboards page
+- Go to Dashboards
+- View dashboard
+- View all dashboards
+- Upload CSS templates on the CSS templates page
+- Queries that you save will appear on the Saved queries page
+- Create custom queries in SQL Lab then create dashboards
+
+#### \*\*Exceptions to sentence case:
+
+- Input labels, buttons and UI tabs are all caps
+- User input values (e.g. column names, SQL Lab tab names) should be in their original case
diff --git a/docs-v2/docs/Contributing/testing-locally.mdx b/docs-v2/docs/Contributing/testing-locally.mdx
new file mode 100644
index 0000000000000..17a1c81086444
--- /dev/null
+++ b/docs-v2/docs/Contributing/testing-locally.mdx
@@ -0,0 +1,275 @@
+---
+title: Testing
+hide_title: true
+sidebar_position: 8
+version: 1
+---
+
+## Testing
+
+### Python Testing
+
+All python tests are carried out in [tox](https://tox.readthedocs.io/en/latest/index.html),
+a standardized testing framework.
+All python tests can be run with any of the tox [environments](https://tox.readthedocs.io/en/latest/example/basic.html#a-simple-tox-ini-default-environments), via,
+
+```bash
+tox -e <environment>
+```
+
+For example,
+
+```bash
+tox -e py38
+```
+
+Alternatively, you can run all tests in a single file via,
+
+```bash
+tox -e <environment> -- tests/test_file.py
+```
+
+or for a specific test via,
+
+```bash
+tox -e <environment> -- tests/test_file.py::TestClassName::test_method_name
+```
+
+Note that the test environment uses a temporary directory for defining the
+SQLite databases, which will be cleared each time before the group of test
+commands is invoked.
+
+There is also a utility script included in the Superset codebase to run python integration tests. The [readme can be
+found here](https://github.com/apache/superset/tree/master/scripts/tests).
+
+To run all integration tests, for example, run this script from the root directory:
+
+```bash
+scripts/tests/run.sh
+```
+
+You can run unit tests found in `./tests/unit_tests`, for example, with pytest. This is a simple way to run an isolated test that doesn't need any database setup:
+
+```bash
+pytest ./link_to_test.py
+```
+
+### Frontend Testing
+
+We use [Jest](https://jestjs.io/) and [Enzyme](https://airbnb.io/enzyme/) to test TypeScript/JavaScript. Tests can be run with:
+
+```bash
+cd superset-frontend
+npm run test
+```
+
+To run a single test file:
+
+```bash
+npm run test -- path/to/file.js
+```
+
+### Integration Testing
+
+We use [Cypress](https://www.cypress.io/) for integration tests. Tests can be run with `tox -e cypress`. To open Cypress and explore tests, first set up and run the test server:
+
+```bash
+export SUPERSET_CONFIG=tests.integration_tests.superset_test_config
+export SUPERSET_TESTENV=true
+export ENABLE_REACT_CRUD_VIEWS=true
+export CYPRESS_BASE_URL="http://localhost:8081"
+superset db upgrade
+superset load_test_users
+superset load-examples --load-test-data
+superset init
+superset run --port 8081
+```
+
+Run Cypress tests:
+
+```bash
+cd superset-frontend
+npm run build-instrumented
+
+cd cypress-base
+npm install
+
+# run tests via headless Chrome browser (requires Chrome 64+)
+npm run cypress-run-chrome
+
+# run tests from a specific file
+npm run cypress-run-chrome -- --spec cypress/integration/explore/link.test.ts
+
+# run specific file with video capture
+npm run cypress-run-chrome -- --spec cypress/integration/dashboard/index.test.js --config video=true
+
+# to open the cypress ui
+npm run cypress-debug
+
+# to point cypress to a url other than the default (http://localhost:8088) set the environment variable before running the script
+# e.g., CYPRESS_BASE_URL="http://localhost:9000"
+CYPRESS_BASE_URL=<your url> npm run cypress open
+```
+
+See [`superset-frontend/cypress_build.sh`](https://github.com/apache/superset/blob/master/superset-frontend/cypress_build.sh).
+
+As an alternative, you can use the docker-compose environment for testing:
+
+Make sure you have added the following line to your /etc/hosts file:
+`127.0.0.1 db`
+
+If you have already launched the Docker environment, please use the following command to ensure a fresh database instance:
+`docker-compose down -v`
+
+Launch the environment:
+
+`CYPRESS_CONFIG=true docker-compose up`
+
+It will serve the backend and frontend on port 8088.
+
+Run Cypress tests:
+
+```bash
+cd cypress-base
+npm install
+npm run cypress open
+```
+
+### Debugging Server App
+
+Follow these instructions to debug the Flask app running inside a docker container.
+
+First, add the following to the ./docker-compose.yaml file:
+
+```diff
+superset:
+  env_file: docker/.env
+  image: *superset-image
+  container_name: superset_app
+  command: ["/app/docker/docker-bootstrap.sh", "app"]
+  restart: unless-stopped
++ cap_add:
++   - SYS_PTRACE
+  ports:
+    - 8088:8088
++   - 5678:5678
+  user: "root"
+  depends_on: *superset-depends-on
+  volumes: *superset-volumes
+  environment:
+    CYPRESS_CONFIG: "${CYPRESS_CONFIG}"
+```
+
+Start Superset as usual:
+
+```bash
+docker-compose up
+```
+
+Install the required libraries and packages in the docker container.
+
+Enter the superset_app container:
+
+```bash
+docker exec -it superset_app /bin/bash
+root@39ce8cf9d6ab:/app#
+```
+
+Run the following commands inside the container:
+
+```bash
+apt update
+apt install -y gdb
+apt install -y net-tools
+pip install debugpy
+```
+
+Find the PID for the Flask process. Make sure to use the first PID: the Flask app will re-spawn a sub-process every time you change any of the python code, so it's important to attach to the first (parent) process.
+
+```bash
+ps -ef
+
+UID        PID  PPID  C STIME TTY          TIME CMD
+root         1     0  0 14:09 ?        00:00:00 bash /app/docker/docker-bootstrap.sh app
+root         6     1  4 14:09 ?        00:00:04 /usr/local/bin/python /usr/bin/flask run -p 8088 --with-threads --reload --debugger --host=0.0.0.0
+root        10     6  7 14:09 ?        00:00:07 /usr/local/bin/python /usr/bin/flask run -p 8088 --with-threads --reload --debugger --host=0.0.0.0
+```
+
+Inject debugpy into the running Flask process, in this case PID 6:
+
+```bash
+python3 -m debugpy --listen 0.0.0.0:5678 --pid 6
+```
+
+Verify that debugpy is listening on port 5678:
+
+```bash
+netstat -tunap
+
+Active Internet connections (servers and established)
+Proto Recv-Q Send-Q Local Address           Foreign Address         State       PID/Program name
+tcp        0      0 0.0.0.0:5678            0.0.0.0:*               LISTEN      462/python
+tcp        0      0 0.0.0.0:8088            0.0.0.0:*               LISTEN      6/python
+```
+
+You are now ready to attach a debugger to the process. Using VSCode, you can configure a launch configuration file .vscode/launch.json like so:
+
+```
+{
+    "version": "0.2.0",
+    "configurations": [
+        {
+            "name": "Attach to Superset App in Docker Container",
+            "type": "python",
+            "request": "attach",
+            "connect": {
+                "host": "127.0.0.1",
+                "port": 5678
+            },
+            "pathMappings": [
+                {
+                    "localRoot": "${workspaceFolder}",
+                    "remoteRoot": "/app"
+                }
+            ]
+        },
+    ]
+}
+```
+
+VSCode will not stop on breakpoints right away. We've attached to PID 6, however it does not yet know of any sub-processes. In order to "wake up" the debugger, you need to modify a python file. This will trigger Flask to reload the code and create a new sub-process. This new sub-process will be detected by VSCode and breakpoints will be activated.
+
+### Debugging Server App in Kubernetes Environment
+
+To debug Flask running in a pod inside a Kubernetes cluster, you'll need to make sure the pod runs as root and is granted the SYS_PTRACE capability. These settings should not be used in production environments.
+
+```
+  securityContext:
+    capabilities:
+      add: ["SYS_PTRACE"]
+```
+
+See [set capabilities for a container](https://kubernetes.io/docs/tasks/configure-pod-container/security-context/#set-capabilities-for-a-container) for more details.
+
+Once the pod is running as root and has the SYS_PTRACE capability, it will be able to debug the Flask app.
+
+You can follow the same instructions as for docker-compose. Enter the pod and install the required libraries and packages: gdb, net-tools, and debugpy.
+
+Often in a Kubernetes environment nodes are not addressable from outside the cluster.
+VSCode will thus be unable to remotely connect to port 5678 on a Kubernetes node. In order to do this, you need to create a tunnel that port-forwards 5678 to your local machine:
+
+```
+kubectl port-forward pod/superset-<pod id> 5678:5678
+```
+
+You can now launch your VSCode debugger with the same config as above. VSCode will connect to 127.0.0.1:5678, which is forwarded by kubectl to your remote Kubernetes pod.
+
+### Storybook
+
+Superset includes a [Storybook](https://storybook.js.org/) to preview the layout/styling of various Superset components, and variations thereof. To open and view the Storybook:
+
+```bash
+cd superset-frontend
+npm run storybook
+```
+
+When contributing new React components to Superset, please try to add a Story alongside the component's `jsx/tsx` file.
diff --git a/docs-v2/docs/Contributing/translations.mdx b/docs-v2/docs/Contributing/translations.mdx
new file mode 100644
index 0000000000000..bb57cdf745261
--- /dev/null
+++ b/docs-v2/docs/Contributing/translations.mdx
@@ -0,0 +1,103 @@
+---
+title: Translating
+hide_title: true
+sidebar_position: 9
+version: 1
+---
+
+## Translating
+
+We use [Babel](http://babel.pocoo.org/en/latest/) to translate Superset.
+In Python files, we import the magic `_` function using:
+
+```python
+from flask_babel import lazy_gettext as _
+```
+
+then wrap our translatable strings with it, e.g. `_('Translate me')`.
+During extraction, string literals passed to `_` will be added to the
+generated `.po` file for each language for later translation.
+
+At runtime, the `_` function will return the translation of the given
+string for the current language, or the given string itself
+if no translation is available.
+
+In TypeScript/JavaScript, the technique is similar:
+we import `t` (simple translation) and `tn` (translation containing a number):
+
+```javascript
+import { t, tn } from "@superset-ui/translation";
+```
+
+### Enabling language selection
+
+Add the `LANGUAGES` variable to your `superset_config.py`. Having more than one
+option inside will add a language selection dropdown to the UI on the right side
+of the navigation bar.
+
+```python
+LANGUAGES = {
+    'en': {'flag': 'us', 'name': 'English'},
+    'fr': {'flag': 'fr', 'name': 'French'},
+    'zh': {'flag': 'cn', 'name': 'Chinese'},
+}
+```
+
+### Extracting new strings for translation
+
+```bash
+pybabel extract -F superset/translations/babel.cfg -o superset/translations/messages.pot -k _ -k __ -k t -k tn -k tct .
+```
+
+This will update the template file `superset/translations/messages.pot` with current application strings. Do not forget to update
+this file with the appropriate license information.
+
+### Updating language files
+
+```bash
+pybabel update -i superset/translations/messages.pot -d superset/translations --ignore-obsolete
+```
+
+This will update language files with the new extracted strings.
+
+You can then translate the strings gathered in files located under
+`superset/translations`, where there's one per language. You can use [Poedit](https://poedit.net/features)
+to translate the `po` file more conveniently.
+There are some [tutorials in the wiki](https://wiki.lxde.org/en/Translate_*.po_files_with_Poedit).
+
+For JS translation, we need to convert the PO file into a JSON file, which requires a global install of the npm package po2json:
+
+```bash
+npm install -g po2json
+```
+
+To convert all PO files to formatted JSON files you can use the `po2json.sh` script.
+
+```bash
+./scripts/po2json.sh
+```
+
+If you get errors running `po2json`, you might be running the Ubuntu package with the same
+name, rather than the Node.js package (they have a different format for the arguments). If
+there is a conflict, you may need to update your `PATH` environment variable or fully qualify
+the executable path (e.g. `/usr/local/bin/po2json` instead of `po2json`).
+If you get a lot of `[null,***]` in `messages.json`, just delete all the `null,`.
+For example, `"year":["年"]` is correct while `"year":[null,"年"]` is incorrect.
+
+For the translations to take effect, we need to compile translation catalogs into binary MO files:
+
+```bash
+pybabel compile -d superset/translations
+```
+
+### Creating a new language dictionary
+
+To create a dictionary for a new language, run the following, where `LANGUAGE_CODE` is replaced with
+the language code for your target language, e.g. `es` (see [Flask AppBuilder i18n documentation](https://flask-appbuilder.readthedocs.io/en/latest/i18n.html) for more details):
+
+```bash
+pip install -r superset/translations/requirements.txt
+pybabel init -i superset/translations/messages.pot -d superset/translations -l LANGUAGE_CODE
+```
+
+Then, [extract strings for the new language](#extracting-new-strings-for-translation).
diff --git a/docs-v2/docs/Contributing/types-of-contributions.mdx b/docs-v2/docs/Contributing/types-of-contributions.mdx
new file mode 100644
index 0000000000000..a9fa907f553de
--- /dev/null
+++ b/docs-v2/docs/Contributing/types-of-contributions.mdx
@@ -0,0 +1,60 @@
+---
+title: Types of Contributions
+hide_title: true
+sidebar_position: 2
+version: 1
+---
+
+## Types of Contributions
+
+### Report Bug
+
+The best way to report a bug is to file an issue on GitHub. Please include:
+
+- Your operating system name and version.
+- Superset version.
+- Detailed steps to reproduce the bug.
+- Any details about your local setup that might be helpful in troubleshooting.
+
+When posting Python stack traces, please quote them using
+[Markdown blocks](https://help.github.com/articles/creating-and-highlighting-code-blocks/).
+
+### Submit Ideas or Feature Requests
+
+The best way is to file an issue on GitHub:
+
+- Explain in detail how it would work.
+- Keep the scope as narrow as possible, to make it easier to implement.
+- Remember that this is a volunteer-driven project, and that contributions are welcome :)
+
+For large features or major changes to the codebase, please create a **Superset Improvement Proposal (SIP)**. See the template in [SIP-0](https://github.com/apache/superset/issues/5602).
+
+### Fix Bugs
+
+Look through the GitHub issues. Issues tagged with `#bug` are
+open to whoever wants to implement them.
+
+### Implement Features
+
+Look through the GitHub issues. Issues tagged with
+`#feature` are open to whoever wants to implement them.
+
+### Improve Documentation
+
+Superset could always use better documentation,
+whether as part of the official Superset docs,
+in docstrings, `docs/*.rst`, or even on the web as blog posts or
+articles. See [Documentation](#documentation) for more details.
+
+### Add Translations
+
+If you are proficient in a non-English language, you can help translate
+text strings from Superset's UI. You can jump in to the existing
+language dictionaries at
+`superset/translations/<language_code>/LC_MESSAGES/messages.po`, or
+even create a dictionary for a new language altogether.
+See [Translating](#translating) for more details.
+
+### Ask Questions
+
+There is a dedicated [`apache-superset` tag](https://stackoverflow.com/questions/tagged/apache-superset) on [StackOverflow](https://stackoverflow.com/). Please use it when asking questions.
diff --git a/docs-v2/docs/Creating Charts and Dashboards/_category_.json b/docs-v2/docs/Creating Charts and Dashboards/_category_.json
new file mode 100644
index 0000000000000..dc440b94466f2
--- /dev/null
+++ b/docs-v2/docs/Creating Charts and Dashboards/_category_.json
@@ -0,0 +1,4 @@
+{
+  "label": "Creating Charts and Dashboards",
+  "position": 4
+}
diff --git a/docs-v2/docs/Creating Charts and Dashboards/creating-your-first-dashboard.mdx b/docs-v2/docs/Creating Charts and Dashboards/creating-your-first-dashboard.mdx
new file mode 100644
index 0000000000000..074458111fe14
--- /dev/null
+++ b/docs-v2/docs/Creating Charts and Dashboards/creating-your-first-dashboard.mdx
@@ -0,0 +1,191 @@
+---
+title: Creating Your First Dashboard
+hide_title: true
+sidebar_position: 1
+version: 1
+---
+
+import useBaseUrl from "@docusaurus/useBaseUrl";
+
+## Creating Your First Dashboard
+
+This section is focused on documentation for end-users who will be using Superset
+for the data analysis and exploration workflow (data analysts, business analysts, data
+scientists, etc). In addition to this site, [Preset.io](http://preset.io/) maintains an updated set of end-user
+documentation at [docs.preset.io](https://docs.preset.io/).
+
+This tutorial targets someone who wants to create charts and dashboards in Superset. We’ll show you
+how to connect Superset to a new database and configure a table in that database for analysis.
+You’ll also explore the data you’ve exposed and add a visualization to a dashboard so that you get a
+feel for the end-to-end user experience.
+
+### Connecting to a new database
+
+Superset itself doesn't have a storage layer to store your data but instead pairs with
+your existing SQL-speaking database or data store.
+
+First things first, we need to add the connection credentials to your database to be able
+to query and visualize data from it. If you're using Superset locally via
+[Docker compose](/docs/installation/installing-superset-using-docker-compose), you can
+skip this step because a Postgres database, named **examples**, is included and
+pre-configured in Superset for you.
+
+Under the **Data** menu, select the _Databases_ option:

+
+Next, click the green **+ Database** button in the top right corner:

+
+You can configure a number of advanced options in this window, but for this walkthrough you only
+need to specify two things (the database name and SQLAlchemy URI):
+
+As noted in the text below
+the URI, you should refer to the SQLAlchemy documentation on
+[creating new connection URIs](https://docs.sqlalchemy.org/en/12/core/engines.html#database-urls)
+for your target database.
+
+Click the **Test Connection** button to confirm things work end to end. If the connection looks good, save the configuration
+by clicking the **Add** button in the bottom right corner of the modal window:
+
+Congratulations, you've just added a new data source in Superset!
+
+### Registering a new table
+
+Now that you’ve configured a data source, you can select specific tables (called **Datasets** in Superset)
+that you want exposed in Superset for querying.
+
+Navigate to **Data ‣ Datasets** and select the **+ Dataset** button in the top right corner.
+
+A modal window should pop up in front of you. Select your **Database**,
+**Schema**, and **Table** using the dropdowns that appear. In the following example,
+we register the **cleaned_sales_data** table from the **examples** database.
+
+To finish, click the **Add** button in the bottom right corner. You should now see your dataset in the list of datasets.
+
+### Customizing column properties
+
+Now that you've registered your dataset, you can configure column properties
+for how the column should be treated in the Explore workflow:
+
+- Is the column temporal? (should it be used for slicing & dicing in time series charts?)
+- Should the column be filterable?
+- Is the column dimensional?
+- If it's a datetime column, how should Superset parse
+the datetime format? (using the [ISO-8601 string pattern](https://en.wikipedia.org/wiki/ISO_8601))
+
+### Superset semantic layer
+
+Superset has a thin semantic layer that adds many quality-of-life improvements for analysts.
+The Superset semantic layer can store 2 types of computed data:
+
+1. Virtual metrics: you can write SQL queries that aggregate values
+from multiple columns (e.g. `SUM(recovered) / SUM(confirmed)`) and make them
+available as columns (e.g. `recovery_rate`) for visualization in Explore.
+Aggregate functions are allowed and encouraged for metrics.
+
+In this view, you can also certify metrics for your team if you'd like.
+
+2. Virtual calculated columns: you can write SQL queries that
+customize the appearance and behavior
+of a specific column (e.g. `CAST(recovery_rate AS FLOAT)`).
+Aggregate functions aren't allowed in calculated columns.
+
+### Creating charts in Explore view
+
+Superset has 2 main interfaces for exploring data:
+
+- **Explore**: no-code viz builder. Select your dataset, select the chart,
+customize the appearance, and publish.
+- **SQL Lab**: SQL IDE for cleaning, joining, and preparing data for the Explore workflow
+
+We'll focus on the Explore view for creating charts right now.
+To start the Explore workflow from the **Datasets** tab, start by clicking the name
+of the dataset that will be powering your chart.

+
+You're now presented with a powerful workflow for exploring data and iterating on charts.
+
+- The **Dataset** view on the left-hand side has a list of columns and metrics,
+scoped to the current dataset you selected.
+- The **Data** preview below the chart area also gives you helpful data context.
+- Using the **Data** and **Customize** tabs, you can change the visualization type,
+select the temporal column, select the metric to group by, and customize
+the aesthetics of the chart.
+
+As you customize your chart using drop-down menus, make sure to click the **Run** button
+to get visual feedback.
+
+In the following screenshot, we craft a grouped Time-series Bar Chart to visualize
+our quarterly sales data by product line just by clicking options in drop-down menus.
+
+### Creating a slice and dashboard
+
+To save your chart, first click the **Save** button. You can either:
+
+- Save your chart and add it to an existing dashboard
+- Save your chart and add it to a new dashboard
+
+In the following screenshot, we save the chart to a new "Superset Duper Sales Dashboard":
+
+To publish, click **Save and go to Dashboard**.
+
+Behind the scenes, Superset will create a slice and store all the information needed
+to create your chart in its thin data layer
+(the query, chart type, options selected, name, etc).
+
+To resize the chart, start by clicking the pencil button in the top right corner.
+
+Then, click and drag the bottom right corner of the chart until the chart layout snaps
+into a position you like onto the underlying grid.
+
+Click **Save** to persist the changes.
+
+Congrats! You’ve successfully linked, analyzed, and visualized data in Superset. There is a wealth
+of other table configuration and visualization options, so please start exploring and creating
+slices and dashboards of your own.
+
+### Manage access to Dashboards
+
+Access to dashboards is managed via owners (users that have edit permissions to the dashboard).
+
+Access for non-owner users can be managed in two different ways:
+
+1. Dataset permissions - if you add permissions on datasets to the relevant role, it automatically grants implicit access to all dashboards that use those permitted datasets
+2. Dashboard roles - if you enable the **DASHBOARD_RBAC** feature flag, then you will be able to manage which roles can access the dashboard
+- Having dashboard access implicitly grants read access to the associated datasets, so
+all charts will load their data. If the feature flag is turned on but no roles are assigned
+to the dashboard, access will fall back to **Dataset permissions**
+
diff --git a/docs-v2/docs/Creating Charts and Dashboards/exploring-data.mdx b/docs-v2/docs/Creating Charts and Dashboards/exploring-data.mdx
new file mode 100644
index 0000000000000..65f7cae737996
--- /dev/null
+++ b/docs-v2/docs/Creating Charts and Dashboards/exploring-data.mdx
@@ -0,0 +1,354 @@
+---
+title: Exploring Data in Superset
+hide_title: true
+sidebar_position: 2
+version: 1
+---
+
+import useBaseUrl from "@docusaurus/useBaseUrl";
+
+## Exploring Data in Superset
+
+In this tutorial, we will introduce key concepts in Apache Superset through the exploration of a
+real dataset which contains the flights made by employees of a UK-based organization in 2011. The
+following information about each flight is given:
+
+- The traveller’s department. For the purposes of this tutorial, the departments have been renamed
+  Orange, Yellow and Purple.
+- The cost of the ticket.
+- The travel class (Economy, Premium Economy, Business and First Class).
+- Whether the ticket was a single or return.
+- The date of travel.
+- Information about the origin and destination.
+- The distance between the origin and destination, in kilometers (km).
+
+### Enabling Data Upload Functionality
+
+You may need to enable the functionality to upload a CSV or Excel file to your database. The following section
+explains how to enable this functionality for the examples database.
+
+In the top menu, select **Data ‣ Databases**. Find the **examples** database in the list and
+select the **Edit** button.
+
+In the resulting modal window, switch to the **Extra** tab and
+tick the checkbox for **Allow Data Upload**. End by clicking the **Save** button.
+
+### Loading CSV Data
+
+Download the CSV dataset to your computer from
+[Github](https://raw.githubusercontent.com/apache-superset/examples-data/master/tutorial_flights.csv).
+In the Superset menu, select **Data ‣ Upload a CSV**.
+
+Then, enter the **Table Name** as _tutorial_flights_ and select the CSV file from your computer.
+
+Next, enter the text _Travel Date_ into the **Parse Dates** field.
+
+Leaving all the other options in their default settings, select **Save** at the bottom of the page.
+
+### Table Visualization
+
+You should now see _tutorial_flights_ as a dataset in the **Datasets** tab. Click on the entry to
+launch an Explore workflow using this dataset.
+
+In this section, we'll create a table visualization
+to show the number of flights and cost per travel class.
+
+By default, Apache Superset only shows the last week of data. In our example, we want to visualize all
+of the data in the dataset. Click the **Time ‣ Time Range** section and change
+the **Range Type** to **No Filter**.
+
+Click **Apply** to save.
+
+Now, we want to specify the rows in our table by using the **Group by** option. Since in this
+example we want to understand different Travel Classes, we select **Travel Class** in this menu.
+
+Next, we can specify the metrics we would like to see in our table with the **Metrics** option.
+
+- `COUNT(*)`, which represents the number of rows in the table
+(in this case, the quantity of flights in each Travel Class)
+- `SUM(Cost)`, which represents the total cost spent by each Travel Class
+
+Finally, select **Run Query** to see the results of the table.
+
+To save the visualization, click on **Save** in the top left of the screen. In the following modal,
+
+- Select the **Save as**
+option and enter the chart name as Tutorial Table (you will be able to find it again through the
+**Charts** screen, accessible in the top menu).
+- Select **Add To Dashboard** and enter
+Tutorial Dashboard. Finally, select **Save & Go To Dashboard**.
+
+### Dashboard Basics
+
+Next, we are going to explore the dashboard interface. If you’ve followed the previous section, you
+should already have the dashboard open. Otherwise, you can navigate to the dashboard by selecting
+Dashboards on the top menu, then Tutorial dashboard from the list of dashboards.
+
+On this dashboard you should see the table you created in the previous section. Select **Edit
+dashboard** and then hover over the table. By selecting the bottom right hand corner of the table
+(the cursor will change too), you can resize it by dragging and dropping.
+
+Finally, save your changes by selecting Save changes in the top right.
+
+### Pivot Table
+
+In this section, we will extend our analysis using a more complex visualization, Pivot Table. By the
+end of this section, you will have created a table that shows the monthly spend on flights for the
+first six months, by department, by travel class.
+
+Create a new chart by selecting **+ ‣ Chart** from the top right corner. Choose
+tutorial_flights again as a datasource, then click on the visualization type to get to the
+visualization menu. Select the **Pivot Table** visualization (you can filter by entering text in the
+search box) and then **Create New Chart**.
+
+In the **Time** section, keep the Time Column as Travel Date (this is selected automatically as we
+only have one time column in our dataset). Then select Time Grain to be month, as having daily data
+would be too granular to see patterns from. Then select the time range to be the first six months of
+2011 by clicking on Last week in the Time Range section, then in Custom selecting a Start / end of 1st
+January 2011 and 30th June 2011 respectively, by either entering the dates directly or using the
+calendar widget (by selecting the month name and then the year, you can move more quickly to far
+away dates).
+
+Next, within the **Query** section, remove the default COUNT(\*) and add Cost, keeping the default
+SUM aggregate. Note that Apache Superset will indicate the type of the metric by the symbol in the
+left-hand column of the list (ABC for string, # for number, a clock face for time, etc.).
+
+In **Group by** select **Time**: this will automatically use the Time Column and Time Grain
+selections we defined in the Time section.
+
+Within **Columns**, select first Department and then Travel Class. All set – let’s **Run Query** to
+see some data!
+
+You should see months in the rows and Department and Travel Class in the columns. Publish this chart
+to the existing Tutorial Dashboard you created earlier.
+
+### Line Chart
+
+In this section, we are going to create a line chart to understand the average price of a ticket by
+month across the entire dataset.
+
+In the Time section, as before, keep the Time Column as Travel Date and Time Grain as month, but this
+time for the Time range select No filter, as we want to look at the entire dataset.
+
+Within Metrics, remove the default `COUNT(*)` metric and instead add `AVG(Cost)`, to show the mean value.
+
+Next, select **Run Query** to show the data on the chart.
+
+How does this look? Well, we can see that the average cost goes up in December. However, perhaps it
+doesn’t make sense to combine both single and return tickets, but rather show two separate lines for
+each ticket type.
+
+Let’s do this by selecting Ticket Single or Return in the Group by box, and then selecting **Run
+Query** again. Nice! We can see that on average single tickets are cheaper than returns and that the
+big spike in December is caused by return tickets.
+
+Our chart is looking pretty good already, but let’s customize some more by going to the Customize
+tab on the left-hand pane. Within this pane, try changing the Color Scheme, removing the range
+filter by selecting No in the Show Range Filter drop down, and adding some labels using X Axis Label
+and Y Axis Label.
+
+Once you’re done, publish the chart in your Tutorial Dashboard.
+
+### Markup
+
+In this section, we will add some text to our dashboard. If you’re not there already, you can navigate
+to the dashboard by selecting Dashboards on the top menu, then Tutorial dashboard from the list of
+dashboards.
+Get into edit mode by selecting **Edit dashboard**.
+
+Within the Insert components pane, drag and drop a Markdown box on the dashboard. Look for the blue
+lines which indicate the anchor where the box will go.
+
+Now, to edit the text, select the box. You can enter text in markdown format (see
+[this Markdown Cheatsheet](https://github.com/adam-p/markdown-here/wiki/Markdown-Cheatsheet) for
+more information about this format). You can toggle between Edit and Preview using the menu on the
+top of the box.
+
+To exit, select any other part of the dashboard. Finally, don’t forget to save your changes using
+**Save changes**.
+
+### Filter Box
+
+In this section, you will learn how to add a filter to your dashboard. Specifically, we will create
+a filter that allows us to look at those flights that depart from a particular country.
+
+A filter box visualization can be created like any other visualization by selecting **+ ‣ Chart**,
+and then _tutorial_flights_ as the datasource and Filter Box as the visualization type.
+
+First of all, in the **Time** section, remove the filter from the Time range selection by selecting
+No filter.
+
+Next, in **Filters Configurations**, first add a new filter by selecting the plus sign and then edit
+the newly created filter by selecting the pencil icon.
+
+For our use case, it makes the most sense to present a list of countries in alphabetical order. First,
+enter the column as Origin Country, keep all other options the same, and then select **Run
+Query**. This gives us a preview of our filter.
+
+Next, remove the date filter by unchecking the Date Filter checkbox.
+
+Finally, select **Save**, name the chart as Tutorial Filter, add the chart to our existing Tutorial
+Dashboard and then Save & go to dashboard. Once on the Dashboard, try using the filter to show only
+those flights that departed from the United Kingdom – you will see the filter is applied to all of
+the other visualizations on the dashboard.
+
+### Publishing Your Dashboard
+
+If you have followed all of the steps outlined in the previous section, you should have a dashboard
+that looks like the below. If you would like, you can rearrange the elements of the dashboard by
+selecting **Edit dashboard** and dragging and dropping.
+
+If you would like to make your dashboard available to other users, simply select Draft next to the
+title of your dashboard on the top left to change your dashboard to the Published state. You can
+also favorite this dashboard by selecting the star.
+
+### Annotations
+
+Annotations allow you to add additional context to your chart. In this section, we will add an
+annotation to the Tutorial Line Chart we made in a previous section. Specifically, we will add the
+dates when some flights were cancelled by the UK’s Civil Aviation Authority in response to the
+eruption of the Grímsvötn volcano in Iceland (23-25 May 2011).
+
+First, add an annotation layer by navigating to Manage ‣ Annotation Layers. Add a new annotation
+layer by selecting the green plus sign to add a new record. Enter the name Volcanic Eruptions and
+save. We can use this layer to refer to a number of different annotations.
+
+Next, add an annotation by navigating to Manage ‣ Annotations and then create a new annotation by
+selecting the green plus sign. Then, select the Volcanic Eruptions layer, add a short description
+Grímsvötn and the eruption dates (23-25 May 2011) before finally saving.
+
+Then, navigate to the line chart by going to Charts, then selecting Tutorial Line Chart from the
+list. Next, go to the Annotations and Layers section and select Add Annotation Layer. Within this
+dialogue:
+
+- Name the layer as Volcanic Eruptions
+- Change the Annotation Layer Type to Event
+- Set the Annotation Source as Superset annotation
+- Specify the Annotation Layer as Volcanic Eruptions
+
+Select **Apply** to see your annotation shown on the chart.
+
+If you wish, you can change how your annotation looks by changing the settings in the Display
+configuration section. Otherwise, select **OK** and finally **Save** to save your chart. If you keep
+the default selection to overwrite the chart, your annotation will be saved to the chart and also
+appear automatically in the Tutorial Dashboard.
+
+### Advanced Analytics
+
+In this section, we are going to explore the Advanced Analytics feature of Apache Superset that
+allows you to apply additional transformations to your data. The three types of transformation,
+each explored below, are moving averages (rolling windows), time comparisons, and resampling.
+
+**Setting up the base chart**
+
+In this section, we’re going to set up a base chart which we can then apply the different **Advanced
+Analytics** features to. Start off by creating a new chart using the same _tutorial_flights_
+datasource and the **Line Chart** visualization type. Within the Time section, set the Time Range from
+1st October 2011 to 31st October 2011.
+
+Next, in the query section, change the Metrics to the sum of Cost. Select **Run Query** to show the
+chart. You should see the total cost per day for October 2011.
+
+Finally, save the visualization as Tutorial Advanced Analytics Base, adding it to the Tutorial
+Dashboard.
+
+### Rolling Mean
+
+There is quite a lot of variation in the data, which makes it difficult to identify any trend. One
+approach we can take is to show instead a rolling average of the time series. To do this, in the
+**Moving Average** subsection of **Advanced Analytics**, select mean in the **Rolling** box and
+enter 7 into both Periods and Min Periods. The period is the length of the rolling period expressed
+as a multiple of the Time Grain. In our example, the Time Grain is day, so the rolling period is 7
+days, such that on the 7th October 2011 the value shown would correspond to the first seven days of
+October 2011. Lastly, by specifying Min Periods as 7, we ensure that our mean is always calculated
+on 7 days and we avoid any ramp-up period.
+
+After displaying the chart by selecting **Run Query**, you will see that the data is less variable
+and that the series starts later, as the ramp-up period is excluded.
+
+Save the chart as Tutorial Rolling Mean and add it to the Tutorial Dashboard.
+
+### Time Comparison
+
+In this section, we will compare values in our time series to the value a week before. Start off by
+opening the Tutorial Advanced Analytics Base chart, by going to **Charts** in the top menu and then
+selecting the visualization name in the list (alternatively, find the chart in the Tutorial
+Dashboard and select Explore chart from the menu for that visualization).
+
+Next, in the Time Comparison subsection of **Advanced Analytics**, enter the Time Shift by typing in
+“minus 1 week” (note this box accepts input in natural language). Run Query to see the new chart,
+which has an additional series with the same values, shifted a week back in time.
+
+Then, change the **Calculation type** to Absolute difference and select **Run Query**.
We can now see only one series again, this time showing the difference between the two series we saw previously.

Save the chart as Tutorial Time Comparison and add it to the Tutorial Dashboard.

### Resampling the data

In this section, we’ll resample the data so that rather than having daily data we have weekly data. As in the previous section, reopen the Tutorial Advanced Analytics Base chart.

Next, in the Python Functions subsection of **Advanced Analytics**, enter 7D, corresponding to seven days, in the Rule and median as the Method, and show the chart by selecting **Run Query**.

Note that we now have a single data point every 7 days. In our case, the value shown corresponds to the median value within the seven daily data points. For more information on the meaning of the various options in this section, refer to the [Pandas documentation](https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.resample.html).

Lastly, save your chart as Tutorial Resample and add it to the Tutorial Dashboard. Go to the Tutorial Dashboard to see the four charts side by side and compare the different outputs.
diff --git a/docs-v2/docs/contribution.mdx b/docs-v2/docs/contribution.mdx
deleted file mode 100644
index 58b6ed931142b..0000000000000
--- a/docs-v2/docs/contribution.mdx
+++ /dev/null
@@ -1,24 +0,0 @@
----
-title: Contribution Guide
-hide_title: true
-sidebar_position: 7
----
-
-## Contributing to Superset
-
-Superset is an [Apache Software foundation](https://www.apache.org/theapacheway/index.html) project.
-The core contributors (or committers) to Superset communicate primarily in the following channels (all of
-which you can join):
-
-- [Mailing list](https://lists.apache.org/list.html?dev@superset.apache.org)
-- [Apache Superset Slack community](https://join.slack.com/t/apache-superset/shared_invite/zt-uxbh5g36-AISUtHbzOXcu0BIj7kgUaw)
-- [Github issues and PR's](https://github.com/apache/superset/issues)
-
-If you're interested in contributing, we recommend reading the Community Contribution Guide
-[described in CONTRIBUTING.MD](https://github.com/apache/superset/blob/master/CONTRIBUTING.md)
-to get started.
Here are some helpful links from that page:
-
-- [Overview of types of contributions](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#types-of-contributions)
-- [Pull request guidelines](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#pull-request-guidelines)
-- [Managing Issues and PR's](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#managing-issues-and-prs)
-- [Setting up local environment for development](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#setup-local-environment-for-development)
diff --git a/docs-v2/docs/frequently-asked-questions.mdx b/docs-v2/docs/frequently-asked-questions.mdx
index 71bb058ec021a..c01b82208e478 100644
--- a/docs-v2/docs/frequently-asked-questions.mdx
+++ b/docs-v2/docs/frequently-asked-questions.mdx
@@ -1,7 +1,7 @@
 ---
 title: Frequently Asked Questions
 hide_title: true
-sidebar_position: 6
+sidebar_position: 7
 ---

 ## Frequently Asked Questions
diff --git a/docs-v2/docs/miscellaneous/chart-params.mdx b/docs-v2/docs/miscellaneous/chart-params.mdx
new file mode 100644
index 0000000000000..0bd94db22694b
--- /dev/null
+++ b/docs-v2/docs/miscellaneous/chart-params.mdx
@@ -0,0 +1,147 @@
---
title: Chart Parameters Reference
hide_title: true
sidebar_position: 4
version: 1
---

## Chart Parameters

Chart parameters are stored as a JSON-encoded string in the `slices.params` column and are often referenced throughout the code as form-data. Currently the form-data is neither versioned nor typed, and thus is somewhat free-form. Note that in the future there may be merit in using something like [JSON Schema](https://json-schema.org/) to both annotate and validate the JSON object, in addition to using a Mypy `TypedDict` (introduced in Python 3.8) for typing the form-data in the backend. This section serves as a potential primer for that work.

The following tables provide a non-exhaustive list of the various fields which can be present in the JSON object, grouped by the Explorer pane sections. These values were obtained by extracting the distinct fields from a legacy deployment consisting of tens of thousands of charts, and thus some fields may be missing whilst others may be deprecated.

Note that not all fields are correctly categorized. The fields vary based on visualization type and may appear in different sections depending on the type. Verified deprecated columns may indicate a missing migration and/or prior migrations which were unsuccessful, and thus future work may be required to clean up the form-data.
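To make the typing idea concrete, here is a minimal sketch of what such a `TypedDict` could look like, using the standard-library `TypedDict` available from Python 3.8. The field names are drawn from the tables below, but the class itself is purely illustrative and not an agreed-upon schema.

```python
import json
from typing import Any, Dict, List, TypedDict, Union  # TypedDict requires Python 3.8+


class FormData(TypedDict, total=False):
    """Illustrative typing for a small subset of form-data fields."""

    datasource: str        # e.g. "3__table" (<datasource_id>__<datasource_type>)
    viz_type: str          # the Visualization Type widget
    time_range: str        # the Time range widget
    granularity_sqla: str  # the SQLA Time Column widget
    groupby: List[str]     # column names
    metrics: List[Union[str, Dict[str, Any]]]  # metric names or AdhocMetric objects
    adhoc_filters: List[Dict[str, Any]]        # AdhocFilter objects
    row_limit: int         # the Row limit widget


# Decoding the JSON string stored in slices.params would yield such a dict:
form_data: FormData = json.loads(
    '{"datasource": "3__table", "viz_type": "line", "row_limit": 1000}'
)
```

Because `total=False` marks every key as optional, this mirrors the free-form nature of the stored blob while still giving Mypy something to check against.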
### Datasource & Chart Type

| Field             | Type     | Notes                                |
| ----------------- | -------- | ------------------------------------ |
| `database_name`   | _string_ | _Deprecated?_                        |
| `datasource`      | _string_ | `<datasource_id>__<datasource_type>` |
| `datasource_id`   | _string_ | _Deprecated?_ See `datasource`       |
| `datasource_name` | _string_ | _Deprecated?_                        |
| `datasource_type` | _string_ | _Deprecated?_ See `datasource`       |
| `viz_type`        | _string_ | The **Visualization Type** widget    |

### Time

| Field               | Type     | Notes                                 |
| ------------------- | -------- | ------------------------------------- |
| `druid_time_origin` | _string_ | The Druid **Origin** widget           |
| `granularity`       | _string_ | The Druid **Time Granularity** widget |
| `granularity_sqla`  | _string_ | The SQLA **Time Column** widget       |
| `time_grain_sqla`   | _string_ | The SQLA **Time Grain** widget        |
| `time_range`        | _string_ | The **Time range** widget             |

### GROUP BY

| Field                     | Type            | Notes             |
| ------------------------- | --------------- | ----------------- |
| `metrics`                 | _array(string)_ | See Query section |
| `order_asc`               | -               | See Query section |
| `row_limit`               | -               | See Query section |
| `timeseries_limit_metric` | -               | See Query section |

### NOT GROUPED BY

| Field           | Type            | Notes                   |
| --------------- | --------------- | ----------------------- |
| `order_by_cols` | _array(string)_ | The **Ordering** widget |
| `row_limit`     | -               | See Query section       |

### Y Axis 1

| Field           | Type | Notes                                              |
| --------------- | ---- | -------------------------------------------------- |
| `metric`        | -    | The **Left Axis Metric** widget. See Query section |
| `y_axis_format` | -    | See Y Axis section                                 |

### Y Axis 2

| Field      | Type | Notes                                               |
| ---------- | ---- | --------------------------------------------------- |
| `metric_2` | -    | The **Right Axis Metric** widget. See Query section |

### Query

| Field | Type | Notes |
| ----- | ---- | ----- |
| `adhoc_filters` | _array(object)_ | The **Filters** widget |
| `extra_filters` | _array(object)_ | Another pathway to the **Filters** widget. It is generally used to pass dashboard filter parameters to a chart. It can also be used, on an ad-hoc basis, to append additional filters to a chart that has been saved with its own filters when the chart is used as a standalone widget.<br/><br/>For implementation examples see: [utils_tests.py](https://github.com/apache/superset/blob/66a4c94a1ed542e69fe6399bab4c01d4540486cf/tests/utils_tests.py#L181).<br/>For insight into how Superset processes the contents of this parameter see: [exploreUtils/index.js](https://github.com/apache/superset/blob/93c7f5bb446ec6895d7702835f3157426955d5a9/superset-frontend/src/explore/exploreUtils/index.js#L159) |
| `columns` | _array(string)_ | The **Breakdowns** widget |
| `groupby` | _array(string)_ | The **Group by** or **Series** widget |
| `limit` | _number_ | The **Series Limit** widget |
| `metric`<br/>`metric_2`<br/>`metrics`<br/>`percent_metrics`<br/>`secondary_metric`<br/>`size`<br/>`x`<br/>`y` | _string_, _object_, _array(string)_, _array(object)_ | The metric(s) depending on the visualization type |
| `order_asc` | _boolean_ | The **Sort Descending** widget |
| `row_limit` | _number_ | The **Row limit** widget |
| `timeseries_limit_metric` | _object_ | The **Sort By** widget |

The `metric` (or equivalent) and `timeseries_limit_metric` fields are composed of either metric names or the JSON representation of the `AdhocMetric` TypeScript type. The `adhoc_filters` field is composed of the JSON representation of the `AdhocFilter` TypeScript type (which can comprise columns or metrics depending on whether it is a WHERE or HAVING clause). The `all_columns`, `all_columns_x`, `columns`, `groupby`, and `order_by_cols` fields all represent column names.

### Chart Options

| Field          | Type      | Notes                       |
| -------------- | --------- | --------------------------- |
| `color_picker` | _object_  | The **Fixed Color** widget  |
| `label_colors` | _object_  | The **Color Scheme** widget |
| `normalized`   | _boolean_ | The **Normalized** widget   |

### Y Axis

| Field            | Type     | Notes                        |
| ---------------- | -------- | ---------------------------- |
| `y_axis_2_label` | _N/A_    | _Deprecated?_                |
| `y_axis_format`  | _string_ | The **Y Axis Format** widget |
| `y_axis_zero`    | _N/A_    | _Deprecated?_                |

Note that `y_axis_format` is defined under various sections for some charts.

### Other

| Field          | Type     | Notes |
| -------------- | -------- | ----- |
| `color_scheme` | _string_ |       |

### Unclassified

| Field                         | Type  | Notes |
| ----------------------------- | ----- | ----- |
| `add_to_dash`                 | _N/A_ |       |
| `code`                        | _N/A_ |       |
| `collapsed_fieldsets`         | _N/A_ |       |
| `comparison type`             | _N/A_ |       |
| `country_fieldtype`           | _N/A_ |       |
| `default_filters`             | _N/A_ |       |
| `entity`                      | _N/A_ |       |
| `expanded_slices`             | _N/A_ |       |
| `filter_immune_slice_fields`  | _N/A_ |       |
| `filter_immune_slices`        | _N/A_ |       |
| `flt_col_0`                   | _N/A_ |       |
| `flt_col_1`                   | _N/A_ |       |
| `flt_eq_0`                    | _N/A_ |       |
| `flt_eq_1`                    | _N/A_ |       |
| `flt_op_0`                    | _N/A_ |       |
| `flt_op_1`                    | _N/A_ |       |
| `goto_dash`                   | _N/A_ |       |
| `import_time`                 | _N/A_ |       |
| `label`                       | _N/A_ |       |
| `linear_color_scheme`         | _N/A_ |       |
| `new_dashboard_name`          | _N/A_ |       |
| `new_slice_name`              | _N/A_ |       |
| `num_period_compare`          | _N/A_ |       |
| `period_ratio_type`           | _N/A_ |       |
| `perm`                        | _N/A_ |       |
| `rdo_save`                    | _N/A_ |       |
| `refresh_frequency`           | _N/A_ |       |
| `remote_id`                   | _N/A_ |       |
| `resample_fillmethod`         | _N/A_ |       |
| `resample_how`                | _N/A_ |       |
| `rose_area_proportion`        | _N/A_ |       |
| `save_to_dashboard_id`        | _N/A_ |       |
| `schema`                      | _N/A_ |       |
| `series`                      | _N/A_ |       |
| `show_bubbles`                | _N/A_ |       |
| `slice_name`                  | _N/A_ |       |
| `timed_refresh_immune_slices` | _N/A_ |       |
| `userid`                      | _N/A_ |       |
diff --git a/docs-v2/static/img/dashboard_card_view.jpg b/docs-v2/static/img/dashboard_card_view.jpg
new file mode 100644
index 0000000000000..5b32c67b38be0
Binary files /dev/null and b/docs-v2/static/img/dashboard_card_view.jpg differ
diff --git a/docs-v2/static/img/explore_ui.jpg b/docs-v2/static/img/explore_ui.jpg
new file mode 100644
index 0000000000000..8097337278a7b
Binary files /dev/null and b/docs-v2/static/img/explore_ui.jpg differ
diff --git a/docs-v2/static/img/tutorial/add-data-upload.png b/docs-v2/static/img/tutorial/add-data-upload.png
new file mode 100644
index 0000000000000..d72ad68dcecfe
Binary files /dev/null and b/docs-v2/static/img/tutorial/add-data-upload.png differ
diff --git a/docs-v2/static/img/tutorial/advanced_analytics_base.png
b/docs-v2/static/img/tutorial/advanced_analytics_base.png new file mode 100644 index 0000000000000..5c10beda5238c Binary files /dev/null and b/docs-v2/static/img/tutorial/advanced_analytics_base.png differ diff --git a/docs-v2/static/img/tutorial/annotation.png b/docs-v2/static/img/tutorial/annotation.png new file mode 100644 index 0000000000000..62ac1f7abb6e7 Binary files /dev/null and b/docs-v2/static/img/tutorial/annotation.png differ diff --git a/docs-v2/static/img/tutorial/annotation_settings.png b/docs-v2/static/img/tutorial/annotation_settings.png new file mode 100644 index 0000000000000..246948786d07a Binary files /dev/null and b/docs-v2/static/img/tutorial/annotation_settings.png differ diff --git a/docs-v2/static/img/tutorial/average_aggregate_for_cost.png b/docs-v2/static/img/tutorial/average_aggregate_for_cost.png new file mode 100644 index 0000000000000..0c0c068782e1f Binary files /dev/null and b/docs-v2/static/img/tutorial/average_aggregate_for_cost.png differ diff --git a/docs-v2/static/img/tutorial/blue_bar_insert_component.png b/docs-v2/static/img/tutorial/blue_bar_insert_component.png new file mode 100644 index 0000000000000..2cfb01d3325d6 Binary files /dev/null and b/docs-v2/static/img/tutorial/blue_bar_insert_component.png differ diff --git a/docs-v2/static/img/tutorial/create_pivot.png b/docs-v2/static/img/tutorial/create_pivot.png new file mode 100644 index 0000000000000..2a24ee25153ce Binary files /dev/null and b/docs-v2/static/img/tutorial/create_pivot.png differ diff --git a/docs-v2/static/img/tutorial/csv_to_database_configuration.png b/docs-v2/static/img/tutorial/csv_to_database_configuration.png new file mode 100644 index 0000000000000..79fabca28a095 Binary files /dev/null and b/docs-v2/static/img/tutorial/csv_to_database_configuration.png differ diff --git a/docs-v2/static/img/tutorial/dashboard.png b/docs-v2/static/img/tutorial/dashboard.png new file mode 100644 index 0000000000000..99a734df63d3c Binary files /dev/null and b/docs-v2/static/img/tutorial/dashboard.png differ diff --git a/docs-v2/static/img/tutorial/edit-record.png b/docs-v2/static/img/tutorial/edit-record.png new file mode 100644 index 0000000000000..4725bf7e06f8a Binary files /dev/null and b/docs-v2/static/img/tutorial/edit-record.png differ diff --git a/docs-v2/static/img/tutorial/edit_annotation.png b/docs-v2/static/img/tutorial/edit_annotation.png new file mode 100644 index 0000000000000..30d14ae8d8cef Binary files /dev/null and b/docs-v2/static/img/tutorial/edit_annotation.png differ diff --git a/docs-v2/static/img/tutorial/filter_on_origin_country.png b/docs-v2/static/img/tutorial/filter_on_origin_country.png new file mode 100644 index 0000000000000..ee7693bb6f803 Binary files /dev/null and b/docs-v2/static/img/tutorial/filter_on_origin_country.png differ diff --git a/docs-v2/static/img/tutorial/markdown.png b/docs-v2/static/img/tutorial/markdown.png new file mode 100644 index 0000000000000..f0345ae2bf2c9 Binary files /dev/null and b/docs-v2/static/img/tutorial/markdown.png differ diff --git a/docs-v2/static/img/tutorial/no_filter_on_time_filter.png b/docs-v2/static/img/tutorial/no_filter_on_time_filter.png new file mode 100644 index 0000000000000..6d2cc3f78e903 Binary files /dev/null and b/docs-v2/static/img/tutorial/no_filter_on_time_filter.png differ diff --git a/docs-v2/static/img/tutorial/parse_dates_column.png b/docs-v2/static/img/tutorial/parse_dates_column.png new file mode 100644 index 0000000000000..a9def08b75423 Binary files /dev/null and 
b/docs-v2/static/img/tutorial/parse_dates_column.png differ diff --git a/docs-v2/static/img/tutorial/publish_dashboard.png b/docs-v2/static/img/tutorial/publish_dashboard.png new file mode 100644 index 0000000000000..e18885ae137b2 Binary files /dev/null and b/docs-v2/static/img/tutorial/publish_dashboard.png differ diff --git a/docs-v2/static/img/tutorial/resample.png b/docs-v2/static/img/tutorial/resample.png new file mode 100644 index 0000000000000..a4fcd75f0dd2c Binary files /dev/null and b/docs-v2/static/img/tutorial/resample.png differ diff --git a/docs-v2/static/img/tutorial/resize_tutorial_table_on_dashboard.png b/docs-v2/static/img/tutorial/resize_tutorial_table_on_dashboard.png new file mode 100644 index 0000000000000..11ed34fb94293 Binary files /dev/null and b/docs-v2/static/img/tutorial/resize_tutorial_table_on_dashboard.png differ diff --git a/docs-v2/static/img/tutorial/rolling_mean.png b/docs-v2/static/img/tutorial/rolling_mean.png new file mode 100644 index 0000000000000..0e0faf303581a Binary files /dev/null and b/docs-v2/static/img/tutorial/rolling_mean.png differ diff --git a/docs-v2/static/img/tutorial/save_tutorial_table.png b/docs-v2/static/img/tutorial/save_tutorial_table.png new file mode 100644 index 0000000000000..e2294c4390209 Binary files /dev/null and b/docs-v2/static/img/tutorial/save_tutorial_table.png differ diff --git a/docs-v2/static/img/tutorial/select_dates_pivot_table.png b/docs-v2/static/img/tutorial/select_dates_pivot_table.png new file mode 100644 index 0000000000000..b7dd7fd862f74 Binary files /dev/null and b/docs-v2/static/img/tutorial/select_dates_pivot_table.png differ diff --git a/docs-v2/static/img/tutorial/sum_cost_column.png b/docs-v2/static/img/tutorial/sum_cost_column.png new file mode 100644 index 0000000000000..a9f37cf63c1bc Binary files /dev/null and b/docs-v2/static/img/tutorial/sum_cost_column.png differ diff --git a/docs-v2/static/img/tutorial/time_comparison_absolute_difference.png b/docs-v2/static/img/tutorial/time_comparison_absolute_difference.png new file mode 100644 index 0000000000000..b14043cd0abea Binary files /dev/null and b/docs-v2/static/img/tutorial/time_comparison_absolute_difference.png differ diff --git a/docs-v2/static/img/tutorial/time_comparison_two_series.png b/docs-v2/static/img/tutorial/time_comparison_two_series.png new file mode 100644 index 0000000000000..ae655936616f1 Binary files /dev/null and b/docs-v2/static/img/tutorial/time_comparison_two_series.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_01_sources_database.png b/docs-v2/static/img/tutorial/tutorial_01_sources_database.png new file mode 100644 index 0000000000000..33cbfcaa61a45 Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_01_sources_database.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_02_add_database.png b/docs-v2/static/img/tutorial/tutorial_02_add_database.png new file mode 100644 index 0000000000000..1a1ce19af58a6 Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_02_add_database.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_03_database_name.png b/docs-v2/static/img/tutorial/tutorial_03_database_name.png new file mode 100644 index 0000000000000..40ead2c66e034 Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_03_database_name.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_04_add_button.png b/docs-v2/static/img/tutorial/tutorial_04_add_button.png new file mode 100644 index 0000000000000..50f99b45df4d5 Binary files /dev/null and 
b/docs-v2/static/img/tutorial/tutorial_04_add_button.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_08_sources_tables.png b/docs-v2/static/img/tutorial/tutorial_08_sources_tables.png new file mode 100644 index 0000000000000..67252a706e9a8 Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_08_sources_tables.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_09_add_new_table.png b/docs-v2/static/img/tutorial/tutorial_09_add_new_table.png new file mode 100644 index 0000000000000..c470f8c7d186f Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_09_add_new_table.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_calculated_column.png b/docs-v2/static/img/tutorial/tutorial_calculated_column.png new file mode 100644 index 0000000000000..0a07daaa418da Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_calculated_column.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_chart_resize.png b/docs-v2/static/img/tutorial/tutorial_chart_resize.png new file mode 100644 index 0000000000000..4193a4b10caad Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_chart_resize.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_column_properties.png b/docs-v2/static/img/tutorial/tutorial_column_properties.png new file mode 100644 index 0000000000000..3a7194444ede8 Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_column_properties.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_dashboard_access.png b/docs-v2/static/img/tutorial/tutorial_dashboard_access.png new file mode 100644 index 0000000000000..f1ce5d6273a1d Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_dashboard_access.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_explore_run.jpg b/docs-v2/static/img/tutorial/tutorial_explore_run.jpg new file mode 100644 index 0000000000000..d57747153fc8c Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_explore_run.jpg differ diff --git a/docs-v2/static/img/tutorial/tutorial_explore_settings.jpg b/docs-v2/static/img/tutorial/tutorial_explore_settings.jpg new file mode 100644 index 0000000000000..5f877b409b10e Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_explore_settings.jpg differ diff --git a/docs-v2/static/img/tutorial/tutorial_first_dashboard.png b/docs-v2/static/img/tutorial/tutorial_first_dashboard.png new file mode 100644 index 0000000000000..57aeb1297e421 Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_first_dashboard.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_launch_explore.png b/docs-v2/static/img/tutorial/tutorial_launch_explore.png new file mode 100644 index 0000000000000..a49d024f10f34 Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_launch_explore.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_line_chart.png b/docs-v2/static/img/tutorial/tutorial_line_chart.png new file mode 100644 index 0000000000000..d66ef25b5cf67 Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_line_chart.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_pencil_edit.png b/docs-v2/static/img/tutorial/tutorial_pencil_edit.png new file mode 100644 index 0000000000000..3dc79800b0525 Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_pencil_edit.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_pivot_table.png b/docs-v2/static/img/tutorial/tutorial_pivot_table.png new file mode 100644 index 0000000000000..b80ea134a6ac9 
Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_pivot_table.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_save_slice.png b/docs-v2/static/img/tutorial/tutorial_save_slice.png new file mode 100644 index 0000000000000..89e267738388b Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_save_slice.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_sql_metric.png b/docs-v2/static/img/tutorial/tutorial_sql_metric.png new file mode 100644 index 0000000000000..bc687bcfd39a0 Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_sql_metric.png differ diff --git a/docs-v2/static/img/tutorial/tutorial_table.png b/docs-v2/static/img/tutorial/tutorial_table.png new file mode 100644 index 0000000000000..2c481a2835b35 Binary files /dev/null and b/docs-v2/static/img/tutorial/tutorial_table.png differ diff --git a/docs-v2/static/img/tutorial/upload_a_csv.png b/docs-v2/static/img/tutorial/upload_a_csv.png new file mode 100644 index 0000000000000..3c23b3d3be7c3 Binary files /dev/null and b/docs-v2/static/img/tutorial/upload_a_csv.png differ