diff --git a/.github/ISSUE_TEMPLATE/config.yml b/.github/ISSUE_TEMPLATE/config.yml index 25a814f751618..5f465d2648573 100644 --- a/.github/ISSUE_TEMPLATE/config.yml +++ b/.github/ISSUE_TEMPLATE/config.yml @@ -8,5 +8,5 @@ contact_links: url: https://github.com/apache/superset/discussions/new?category=q-a-help about: Open a community Q&A thread on GitHub Discussions - name: Slack - url: bit.ly/join-superset-slack - about: Join the Superset Community on Slack for other discussions/assistance + url: https://bit.ly/join-superset-slack + about: Join the Superset Community on Slack for other discussions and assistance diff --git a/.github/ISSUE_TEMPLATE/sip.md b/.github/ISSUE_TEMPLATE/sip.md index 8261b0f881a98..d0ca3ef1d940e 100644 --- a/.github/ISSUE_TEMPLATE/sip.md +++ b/.github/ISSUE_TEMPLATE/sip.md @@ -1,13 +1,13 @@ --- name: SIP -about: "Superset Improvement Proposal. See https://github.com/apache/superset/issues/5602 for details. The purpose of a Superset Improvement Proposal (SIP) is to introduce any major change into Apache Superset, such as a major new feature, subsystem, or piece of functionality, or any change that impacts the public interfaces of the project" +about: "Superset Improvement Proposal. See SIP-0 (https://github.com/apache/superset/issues/5602) for details. A SIP introduces any major change into Apache Superset's code or process." labels: sip title: "[SIP] Your Title Here (do not add SIP number)" assignees: "apache/superset-committers" --- *Please make sure you are familiar with the SIP process documented* -(here)[https://github.com/apache/superset/issues/5602]. The SIP will be numbered by a committer upon acceptance. +[here](https://github.com/apache/superset/issues/5602). The SIP will be numbered by a committer upon acceptance. ## [SIP] Proposal for ... 
diff --git a/.github/SECURITY.md b/.github/SECURITY.md index f35b9c48f0eec..086ff8c0cad08 100644 --- a/.github/SECURITY.md +++ b/.github/SECURITY.md @@ -12,8 +12,8 @@ Apache Software Foundation takes a rigorous standpoint in annihilating the security issues in its software projects. Apache Superset is highly sensitive and forthcoming to issues pertaining to its features and functionality. If you have any concern or believe you have found a vulnerability in Apache Superset, -please get in touch with the Apache Security Team privately at -e-mail address [security@apache.org](mailto:security@apache.org). +please get in touch with the Apache Superset Security Team privately at +e-mail address [security@superset.apache.org](mailto:security@superset.apache.org). More details can be found on the ASF website at [ASF vulnerability reporting process](https://apache.org/security/#reporting-a-vulnerability) diff --git a/CHANGELOG.md b/CHANGELOG.md index 170824f6f22db..7ff5192163300 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -19,8 +19,13 @@ under the License. ## Change Log +- [3.1.0](#310-tue-jan-9-150500-2024--0800) +- [3.0.3](#303-fri-dec-8-054009-2023--0800) +- [3.0.2](#302-mon-nov-20-073838-2023--0500) - [3.0.1](#301-tue-oct-13-103221-2023--0700) - [3.0.0](#300-thu-aug-24-133627-2023--0600) +- [2.1.3](#213-fri-dec-8-163651-2023--0700) +- [2.1.2](#212-wed-oct-18-165930-2023--0700) - [2.1.1](#211-sun-apr-23-154421-2023-0100) - [2.1.0](#210-thu-mar-16-211305-2023--0700) - [2.0.1](#201-fri-nov-4-103402-2022--0400) @@ -32,6 +37,489 @@ under the License. 
- [1.4.2](#142-sat-mar-19-000806-2022-0200) - [1.4.1](#141) +### 3.1.0 (Tue Jan 9 15:05:00 2024 -0800) + +**Database Migrations** + +- [#26160](https://github.com/apache/superset/pull/26160) fix: Migration order due to cherry which went astray (@john-bodley) +- [#24776](https://github.com/apache/superset/pull/24776) chore(sqlalchemy): Remove erroneous SQLAlchemy ORM session.merge operations (@john-bodley) +- [#25819](https://github.com/apache/superset/pull/25819) chore: Singularize tag models (@john-bodley) +- [#25911](https://github.com/apache/superset/pull/25911) chore: remove deprecated functions in SQLAlchemy (@gnought) +- [#25304](https://github.com/apache/superset/pull/25304) feat: Adds CLI commands to execute viz migrations (@michael-s-molina) +- [#25204](https://github.com/apache/superset/pull/25204) feat(datasource): Checkbox for always filtering main dttm in datasource (@Always-prog) +- [#24832](https://github.com/apache/superset/pull/24832) fix: Alembic migration head (@john-bodley) +- [#24701](https://github.com/apache/superset/pull/24701) feat(Tags): Allow users to favorite Tags on CRUD Listview page (@hughhhh) +- [#24755](https://github.com/apache/superset/pull/24755) feat: Add line width unit control in deckgl Polygon and Path (@kgabryje) +- [#24700](https://github.com/apache/superset/pull/24700) chore: Update pylint to 2.17.4 (@EugeneTorap) + +**Features** + +- [#26031](https://github.com/apache/superset/pull/26031) feat(deckgl-map): use an arbitrary Mapbox style URL (#26027) (@francois-travais) +- [#26136](https://github.com/apache/superset/pull/26136) feat: Adds legacy time support for Waterfall chart (@michael-s-molina) +- [#26123](https://github.com/apache/superset/pull/26123) feat(helm): Add option to deploy extra containers to remaining deployments (@bluemalkin) +- [#24714](https://github.com/apache/superset/pull/24714) feat: Add
Apache Doris support (@liujiwen-up) +- [#26033](https://github.com/apache/superset/pull/26033) feat: Add Bubble chart migration logic (@michael-s-molina) +- [#25921](https://github.com/apache/superset/pull/25921) feat(metadb): handle decimals (@betodealmeida) +- [#24539](https://github.com/apache/superset/pull/24539) feat(sqllab): non-blocking persistence mode (@justinpark) +- [#25861](https://github.com/apache/superset/pull/25861) feat(sqllab): Show duration as separate column in Query History view (@sebastianliebscher) +- [#25809](https://github.com/apache/superset/pull/25809) feat(sqllab): TRINO_EXPAND_ROWS: expand columns from ROWs (@giftig) +- [#25952](https://github.com/apache/superset/pull/25952) feat: Add Area chart migration and tweaks the Timeseries chart migration (@michael-s-molina) +- [#25950](https://github.com/apache/superset/pull/25950) feat(explore): dataset macro: dttm filter context (@giftig) +- [#23973](https://github.com/apache/superset/pull/23973) feat: Adds Line chart migration logic (@michael-s-molina) +- [#20323](https://github.com/apache/superset/pull/20323) feat: safer insert RLS (@betodealmeida) +- [#25882](https://github.com/apache/superset/pull/25882) feat: method for dynamic `allows_alias_in_select` (@betodealmeida) +- [#25855](https://github.com/apache/superset/pull/25855) feat(sqllab): Dynamic query limit dropdown (@giftig) +- [#25344](https://github.com/apache/superset/pull/25344) feat(sqllab): Format sql (@justinpark) +- [#25557](https://github.com/apache/superset/pull/25557) feat: Improves the Waterfall chart (@michael-s-molina) +- [#23308](https://github.com/apache/superset/pull/23308) feat: support databend for superset (@hantmac) +- [#25795](https://github.com/apache/superset/pull/25795) feat: support server-side sessions (@dpgaspar) +- [#25783](https://github.com/apache/superset/pull/25783) 
feat(helm): Add option to deploy extra containers to init job (@bluemalkin) +- [#25696](https://github.com/apache/superset/pull/25696) feat(Export as PDF - rasterized): Adding rasterized pdf functionality to dashboard (@fisjac) +- [#25676](https://github.com/apache/superset/pull/25676) feat: add France's regions to country map visualization (@dmeaux) +- [#25569](https://github.com/apache/superset/pull/25569) feat: add database and schema names to dataset option (@soniagtm) +- [#25666](https://github.com/apache/superset/pull/25666) feat: Funnel/tooltip-customization (@CorbinBullard) +- [#25683](https://github.com/apache/superset/pull/25683) feat: Add week time grain for Elasticsearch datasets (@mikelv92) +- [#25423](https://github.com/apache/superset/pull/25423) feat(sqllab): ResultTable extension (@justinpark) +- [#25542](https://github.com/apache/superset/pull/25542) feat(sqllab): Add keyboard shortcut helper (@justinpark) +- [#25565](https://github.com/apache/superset/pull/25565) feat: migrate to docker compose v2 (@mdeshmu) +- [#24154](https://github.com/apache/superset/pull/24154) feat: Add Deck.gl Contour Layer (@Mattc1221) +- [#17906](https://github.com/apache/superset/pull/17906) feat(plugin-chart-echarts): Echarts Waterfall (@stephenLYZ) +- [#22107](https://github.com/apache/superset/pull/22107) feat: Adds the ECharts Bubble chart (@mayurnewase) +- [#25151](https://github.com/apache/superset/pull/25151) feat(sqllab): SPA migration (@justinpark) +- [#25247](https://github.com/apache/superset/pull/25247) feat: Implement using Playwright for taking screenshots in reports (@kgabryje) +- [#25303](https://github.com/apache/superset/pull/25303) feat: generic marshmallow error component (@betodealmeida) +- [#25377](https://github.com/apache/superset/pull/25377) feat(docker): Use docker buildx and Add ARM builds for dockerize and websocket 
(@alekseyolg) +- [#25343](https://github.com/apache/superset/pull/25343) feat: Adds Sunburst chart migration logic (@michael-s-molina) +- [#25345](https://github.com/apache/superset/pull/25345) feat(sqllab): extra logging when chart is downloaded (@zephyring) +- [#25280](https://github.com/apache/superset/pull/25280) feat(helm): Support HPA for supersetNode and supersetWorker (@tenkian4) +- [#25309](https://github.com/apache/superset/pull/25309) feat(tag): fast follow for Tags flatten api + update client with generator + some bug fixes (@hughhhh) +- [#24964](https://github.com/apache/superset/pull/24964) feat: Tags ListView Page (@hughhhh) +- [#24787](https://github.com/apache/superset/pull/24787) feat(sqllab): Show sql in the current result (@justinpark) +- [#25105](https://github.com/apache/superset/pull/25105) feat: removing renderCard from Tags/index.tsc to remove cardview from Tags ListView (@fisjac) +- [#25089](https://github.com/apache/superset/pull/25089) feat(docker): refactor docker images (@alekseyolg) +- [#24839](https://github.com/apache/superset/pull/24839) feat: Update Tags CRUD API (@hughhhh) +- [#25065](https://github.com/apache/superset/pull/25065) feat: adding Scarf pixels to gather telemetry on readme and website (@rusackas) +- [#14225](https://github.com/apache/superset/pull/14225) feat: a native SQLAlchemy dialect for Superset (@betodealmeida) +- [#25001](https://github.com/apache/superset/pull/25001) feat: Moves Profile to Single Page App (SPA) (@michael-s-molina) +- [#24983](https://github.com/apache/superset/pull/24983) feat(sqllab): Add /sqllab endpoint to the v1 api (@justinpark) +- [#24918](https://github.com/apache/superset/pull/24918) feat: command to test DB engine specs (@betodealmeida) +- [#24921](https://github.com/apache/superset/pull/24921) feat(gsheets): file upload (@betodealmeida) +- 
[#24934](https://github.com/apache/superset/pull/24934) feat: add MotherDuck DB engine spec (@betodealmeida) +- [#24909](https://github.com/apache/superset/pull/24909) feat: improve SQLite DB engine spec (@betodealmeida) +- [#24870](https://github.com/apache/superset/pull/24870) feat(chart): Added Central Asia countries to countries map (@Zoynels) +- [#24702](https://github.com/apache/superset/pull/24702) feat: add empty state for Tags (@hughhhh) +- [#24768](https://github.com/apache/superset/pull/24768) feat: add pandas performance dependencies (@sebastianliebscher) +- [#24618](https://github.com/apache/superset/pull/24618) feat(csv-upload): Configurable max filesize (@giftig) +- [#24580](https://github.com/apache/superset/pull/24580) feat(database): Database Filtering via custom configuration (@Antonio-RiveroMartnez) + +**Fixes** + +- [#26429](https://github.com/apache/superset/pull/26429) fix(post-processing): handle missing values in cumulative operator (@villebro) +- [#26424](https://github.com/apache/superset/pull/26424) fix(translations): Clear all (@capping) +- [#26404](https://github.com/apache/superset/pull/26404) fix(plugin-chart-echarts): support forced categorical x-axis (@villebro) +- [#26415](https://github.com/apache/superset/pull/26415) fix: In chart gallery thumbnail is rendered in case of no example in #16707 (@sivasathyaseeelan) +- [#26393](https://github.com/apache/superset/pull/26393) fix(chart): Resolve incorrect column customization when switching metrics in table chart (@soniagtm) +- [#26405](https://github.com/apache/superset/pull/26405) fix(sqllab): Bump duckdb-engine version to 0.9.5 (@guenp) +- [#26313](https://github.com/apache/superset/pull/26313) fix(dashboard): narrow empty drop area (@justinpark) +- [#26410](https://github.com/apache/superset/pull/26410) fix(dashboard): Chart menu disable is fixed on 
chart-fullscreen in issue #25992 (@sivasathyaseeelan) +- [#26362](https://github.com/apache/superset/pull/26362) fix: Reactivates native filters E2E tests (@michael-s-molina) +- [#26398](https://github.com/apache/superset/pull/26398) fix(embed): an error occurred while rendering the visualization: error: Item with key ... is not registered. (@rowdyroad) +- [#26353](https://github.com/apache/superset/pull/26353) fix(SelectControl): select zero value (@rekilina) +- [#26380](https://github.com/apache/superset/pull/26380) fix: Removes non-existent columns in the 2018 FCC Survey dataset (@michael-s-molina) +- [#26302](https://github.com/apache/superset/pull/26302) fix: Invalid references in the basic template (@michael-s-molina) +- [#26379](https://github.com/apache/superset/pull/26379) fix: Duplicated plugin registration (@michael-s-molina) +- [#26378](https://github.com/apache/superset/pull/26378) fix(databend): databend time grain expression (@hantmac) +- [#26151](https://github.com/apache/superset/pull/26151) fix(chart): Set max row limit + removed the option to use an empty row limit value (@CorbinBullard) +- [#26312](https://github.com/apache/superset/pull/26312) fix(Embedded): Avoid creating a filter key for guest users (@Vitor-Avila) +- [#26333](https://github.com/apache/superset/pull/26333) fix(logging): Add logging of change_dashboard_filter event for native dashboard filters (@john-bodley) +- [#26326](https://github.com/apache/superset/pull/26326) fix(accessibility): Enable tabbing on sort header of table chart (@arunthirumani) +- [#26324](https://github.com/apache/superset/pull/26324) fix(tagging): adding tags containing a “:” to dashboards (@lilykuang) +- [#26340](https://github.com/apache/superset/pull/26340) fix(dashboard): Don't switch to first tab when directPathToChild changes (@kgabryje) +- 
[#26283](https://github.com/apache/superset/pull/26283) fix(redshift): convert_dttm method for redshift dataset and tests (@gaurav7261) +- [#26281](https://github.com/apache/superset/pull/26281) fix(sql lab): Use quote_schema instead of quote method to format schema name (@guenp) +- [#25967](https://github.com/apache/superset/pull/25967) fix(typings): model_id is a multiple option (@gnought) +- [#26284](https://github.com/apache/superset/pull/26284) fix: Revert "fix(sqllab): flaky json explore modal due to over-rendering (#26156)" (@justinpark) +- [#26279](https://github.com/apache/superset/pull/26279) fix: Cannot expand initially hidden SQL Lab tab (@michael-s-molina) +- [#26269](https://github.com/apache/superset/pull/26269) fix(plugin-chart-echarts): use scale for truncating x-axis (@villebro) +- [#26264](https://github.com/apache/superset/pull/26264) fix: Stacked charts with numerical columns (@michael-s-molina) +- [#26243](https://github.com/apache/superset/pull/26243) fix(plugin-chart-echarts): undefined bounds for bubble chart (@villebro) +- [#26224](https://github.com/apache/superset/pull/26224) fix: Use page.locator in Playwright reports (@kgabryje) +- [#26156](https://github.com/apache/superset/pull/26156) fix(sqllab): flaky json explore modal due to over-rendering (@justinpark) +- [#25533](https://github.com/apache/superset/pull/25533) fix(menu): Styling active menu in SPA navigation (@justinpark) +- [#25977](https://github.com/apache/superset/pull/25977) fix(sqllab): table preview has gone (@justinpark) +- [#26066](https://github.com/apache/superset/pull/26066) fix: move driver import to method (@giftig) +- [#25934](https://github.com/apache/superset/pull/25934) fix(tag): update state to clear form on success (@hughhhh) +- [#25941](https://github.com/apache/superset/pull/25941) fix(sqllab): Allow router navigation to explore 
(@justinpark) +- [#25875](https://github.com/apache/superset/pull/25875) fix(typo): replace 'datasouce_id' with 'datasource_id' in openapi.json (@nero5700) +- [#25856](https://github.com/apache/superset/pull/25856) fix(tagging): change key from name to id for tagToSelectOption (@lilykuang) +- [#25831](https://github.com/apache/superset/pull/25831) fix: add validation on tag name to have name + onDelete refresh list view (@hughhhh) +- [#25851](https://github.com/apache/superset/pull/25851) fix: databend png pic (@hantmac) +- [#25803](https://github.com/apache/superset/pull/25803) fix(helm): Fix init extra containers (@bluemalkin) +- [#25739](https://github.com/apache/superset/pull/25739) fix(README): mismatched picture tags (@andy-clapson) +- [#25727](https://github.com/apache/superset/pull/25727) fix(metadb): handle durations (@betodealmeida) +- [#25718](https://github.com/apache/superset/pull/25718) fix(driver): bumping DuckDB to 0.9.2 (@rusackas) +- [#25603](https://github.com/apache/superset/pull/25603) fix(tags): +n tags for listview (@hughhhh) +- [#25578](https://github.com/apache/superset/pull/25578) fix(tags): Polish + Better messaging for skipped tags with bad permissions (@hughhhh) +- [#25582](https://github.com/apache/superset/pull/25582) fix(sqllab): Allow opening of SQL Lab in new browser tab (@justinpark) +- [#25615](https://github.com/apache/superset/pull/25615) fix(test-db): engine params (@betodealmeida) +- [#25532](https://github.com/apache/superset/pull/25532) fix: Breaking change in MachineAuthProvider constructor (@kgabryje) +- [#25547](https://github.com/apache/superset/pull/25547) fix: Make `host.docker.internal` available on linux (@sebastianliebscher) +- [#25536](https://github.com/apache/superset/pull/25536) fix: Tags Page ListView size to 10 (@hughhhh) +- [#25525](https://github.com/apache/superset/pull/25525) 
fix(test-db): removed attribute (@betodealmeida) +- [#25473](https://github.com/apache/superset/pull/25473) fix(tags): Update loading + pagination for Tags Page (@hughhhh) +- [#25470](https://github.com/apache/superset/pull/25470) fix(tags): fix clears delete on Tags Modal (@hughhhh) +- [#25496](https://github.com/apache/superset/pull/25496) fix: Tags Polish II (@hughhhh) +- [#24927](https://github.com/apache/superset/pull/24927) fix(Indian Map Changes): fixed-Indian-map-border (@Yaswanth-Perumalla) +- [#25403](https://github.com/apache/superset/pull/25403) fix: Tags Page Polish (@hughhhh) +- [#25306](https://github.com/apache/superset/pull/25306) fix(sqllab): misplaced limit warning alert (@justinpark) +- [#25361](https://github.com/apache/superset/pull/25361) fix: update helm chart app version (@hugosjoberg) +- [#25308](https://github.com/apache/superset/pull/25308) fix(sqllab): invalid persisted tab state (@justinpark) +- [#25216](https://github.com/apache/superset/pull/25216) fix(docs): Fixing a typo in README.md (@yousoph) +- [#25152](https://github.com/apache/superset/pull/25152) fix(sqllab): invalid reducer key name (@justinpark) +- [#25124](https://github.com/apache/superset/pull/25124) fix: Partially reverts #25007 (@michael-s-molina) +- [#25067](https://github.com/apache/superset/pull/25067) fix: small fixes for the meta DB (@betodealmeida) +- [#24963](https://github.com/apache/superset/pull/24963) fix(gsheets): add column names on file upload (@betodealmeida) +- [#24955](https://github.com/apache/superset/pull/24955) fix: timezone issue in Pandas 2 (@betodealmeida) +- [#24952](https://github.com/apache/superset/pull/24952) fix: `to_datetime` in Pandas 2 (@betodealmeida) +- [#24871](https://github.com/apache/superset/pull/24871) fix: Ignores hot update files when generating the manifest (@michael-s-molina) +- 
[#24868](https://github.com/apache/superset/pull/24868) fix: Ignores ResizeObserver errors in development mode (@michael-s-molina) + +**Others** + +- [#25770](https://github.com/apache/superset/pull/25770) chore: Add example charts for deck.gl (@willie-hung) +- [#26317](https://github.com/apache/superset/pull/26317) chore: Adds a tooltip for the alert's SQL input (@michael-s-molina) +- [#26297](https://github.com/apache/superset/pull/26297) chore: Add downloadAsImage types, change filter selector (@kgabryje) +- [#26315](https://github.com/apache/superset/pull/26315) chore: Use WEBDRIVER_OPTION_ARGS with Playwright (@kgabryje) +- [#26310](https://github.com/apache/superset/pull/26310) chore: Disables minor ticks by default (@michael-s-molina) +- [#26287](https://github.com/apache/superset/pull/26287) chore: update changelog for 2.1.3 (@eschutho) +- [#26251](https://github.com/apache/superset/pull/26251) chore: improve CSP add base uri restriction (@dpgaspar) +- [#26082](https://github.com/apache/superset/pull/26082) chore: lock the databend-sqlalchemy version (@hantmac) +- [#26212](https://github.com/apache/superset/pull/26212) chore: Moves xAxisLabelRotation to shared controls (@michael-s-molina) +- [#26188](https://github.com/apache/superset/pull/26188) chore: Lower giveup log level for retried functions to warning (@jfrag1) +- [#25961](https://github.com/apache/superset/pull/25961) chore: harmonize and clean up list views (@villebro) +- [#26147](https://github.com/apache/superset/pull/26147) chore: Rename SET_ACTIVE_TABS action, add a new action (@kgabryje) +- [#25996](https://github.com/apache/superset/pull/25996) chore(tags): Allow for lookup via ids vs. 
name in the API (@hughhhh) +- [#26058](https://github.com/apache/superset/pull/26058) chore: Adds the 3.1.0 Release Notes (@michael-s-molina) +- [#26000](https://github.com/apache/superset/pull/26000) docs(databases): Update pinot.mdx to incorporate username and password based connection. (@raamri) +- [#26075](https://github.com/apache/superset/pull/26075) chore: Adds 3.0.2 data to CHANGELOG.md (@michael-s-molina) +- [#25850](https://github.com/apache/superset/pull/25850) chore(command): Organize Commands according to SIP-92 (@john-bodley) +- [#26073](https://github.com/apache/superset/pull/26073) chore: Updates Announce template to include CHANGELOG.md and UPDATING.md files (@michael-s-molina) +- [#26064](https://github.com/apache/superset/pull/26064) build(deps-dev): bump @types/node from 20.9.3 to 20.9.4 in /superset-websocket (@dependabot[bot]) +- [#26063](https://github.com/apache/superset/pull/26063) build(deps): bump @types/lodash from 4.14.201 to 4.14.202 in /superset-websocket (@dependabot[bot]) +- [#25844](https://github.com/apache/superset/pull/25844) chore: Allow only iterables for BaseDAO.delete() (@john-bodley) +- [#25917](https://github.com/apache/superset/pull/25917) docs: update security policy and contributing (@dpgaspar) +- [#24773](https://github.com/apache/superset/pull/24773) chore(connector): Cleanup base models and views according to SIP-92 (@john-bodley) +- [#26039](https://github.com/apache/superset/pull/26039) docs(intro): fix a single broken link (BugHerd #97) (@sfirke) +- [#26049](https://github.com/apache/superset/pull/26049) build(deps-dev): bump @types/node from 20.9.1 to 20.9.3 in /superset-websocket (@dependabot[bot]) +- [#26048](https://github.com/apache/superset/pull/26048) build(deps-dev): bump @types/ws from 8.5.9 to 8.5.10 in /superset-websocket (@dependabot[bot]) +- 
[#26043](https://github.com/apache/superset/pull/26043) chore: bump shillelagh (@betodealmeida) +- [#26004](https://github.com/apache/superset/pull/26004) chore: Allow external extensions to include their own package.json files (@kgabryje) +- [#26044](https://github.com/apache/superset/pull/26044) docs(BH#109): Athena URI spec fix (@rusackas) +- [#26025](https://github.com/apache/superset/pull/26025) build(deps-dev): bump eslint from 8.53.0 to 8.54.0 in /superset-websocket (@dependabot[bot]) +- [#26013](https://github.com/apache/superset/pull/26013) chore: cleanup unused code in pandas 2.0+ (@gnought) +- [#26012](https://github.com/apache/superset/pull/26012) build(deps-dev): bump @types/node from 20.9.0 to 20.9.1 in /superset-websocket (@dependabot[bot]) +- [#26009](https://github.com/apache/superset/pull/26009) chore: Remove unnecessary autoflush from tagging and key/value workflows (@john-bodley) +- [#25551](https://github.com/apache/superset/pull/25551) docs: handling "System limit for number of file watchers reached" error (@nitish-samsung-jha) +- [#25986](https://github.com/apache/superset/pull/25986) chore: Remove more redundant code in utils/core (@sebastianliebscher) +- [#24485](https://github.com/apache/superset/pull/24485) style: Transition of Navbar from dark to light and vice-versa is now smooth (@git-init-priyanshu) +- [#25059](https://github.com/apache/superset/pull/25059) docs: add Tentacle to users list (@jdclarke5) +- [#25968](https://github.com/apache/superset/pull/25968) chore: Add entry point for SliceHeader frontend extension (@kgabryje) +- [#25891](https://github.com/apache/superset/pull/25891) chore: support different JWT CSRF cookie names (@dpgaspar) +- [#25953](https://github.com/apache/superset/pull/25953) build(deps-dev): bump axios from 0.25.0 to 1.6.0 in /superset-embedded-sdk (@dependabot[bot]) +- 
[#25927](https://github.com/apache/superset/pull/25927) build(deps-dev): bump @types/jsonwebtoken from 9.0.4 to 9.0.5 in /superset-websocket (@dependabot[bot]) +- [#25929](https://github.com/apache/superset/pull/25929) build(deps-dev): bump @types/uuid from 9.0.6 to 9.0.7 in /superset-websocket (@dependabot[bot]) +- [#25958](https://github.com/apache/superset/pull/25958) test: Reduce flaky integration tests triggered by `test_get_tag` (@sebastianliebscher) +- [#25948](https://github.com/apache/superset/pull/25948) chore: Simplify views/base (@sebastianliebscher) +- [#25951](https://github.com/apache/superset/pull/25951) build(deps): bump axios from 1.4.0 to 1.6.1 in /superset-frontend (@dependabot[bot]) +- [#25881](https://github.com/apache/superset/pull/25881) chore(issue template): attempting to fix two entries/links (@rusackas) +- [#25926](https://github.com/apache/superset/pull/25926) chore: removing unused chartMetadata field (@rusackas) +- [#25928](https://github.com/apache/superset/pull/25928) build(deps-dev): bump @types/node from 20.8.10 to 20.9.0 in /superset-websocket (@dependabot[bot]) +- [#25885](https://github.com/apache/superset/pull/25885) docs: Remove Python 3.8 from CONTRIBUTING.md (@koushik-rout-samsung) +- [#25900](https://github.com/apache/superset/pull/25900) chore: Simplify utils/cache by using default argument values (@sebastianliebscher) +- [#25912](https://github.com/apache/superset/pull/25912) chore: remove unused functions in utils/core (@sebastianliebscher) +- [#25907](https://github.com/apache/superset/pull/25907) build(deps): bump @types/lodash from 4.14.200 to 4.14.201 in /superset-websocket (@dependabot[bot]) +- [#25906](https://github.com/apache/superset/pull/25906) build(deps-dev): bump @types/ws from 8.5.7 to 8.5.9 in /superset-websocket (@dependabot[bot]) +- [#25905](https://github.com/apache/superset/pull/25905) 
build(deps-dev): bump @types/cookie from 0.5.3 to 0.5.4 in /superset-websocket (@dependabot[bot]) +- [#25262](https://github.com/apache/superset/pull/25262) chore: add more migration tests (@eschutho) +- [#25886](https://github.com/apache/superset/pull/25886) build(deps): bump cookie from 0.5.0 to 0.6.0 in /superset-websocket (@dependabot[bot]) +- [#25714](https://github.com/apache/superset/pull/25714) chore: Update INTHEWILD.md (@codek) +- [#25867](https://github.com/apache/superset/pull/25867) build(deps-dev): bump eslint from 8.52.0 to 8.53.0 in /superset-websocket (@dependabot[bot]) +- [#25852](https://github.com/apache/superset/pull/25852) chore: Updates Databend image extension reference in README.md (@michael-s-molina) +- [#25531](https://github.com/apache/superset/pull/25531) docs: Update location of `async_query_manager.py` (@emmanuel-ferdman) +- [#25817](https://github.com/apache/superset/pull/25817) chore(docker-compose): more host network specifiers (@giftig) +- [#25812](https://github.com/apache/superset/pull/25812) chore: Removes border of the color picker control (@michael-s-molina) +- [#25826](https://github.com/apache/superset/pull/25826) chore(websocket): Adding support for redis username in websocket server (@craig-rueda) +- [#25822](https://github.com/apache/superset/pull/25822) chore: Update sip.md to have a better call to action (@rusackas) +- [#25823](https://github.com/apache/superset/pull/25823) chore(issues): config.yaml added with feature request link to open a discussion (@rusackas) +- [#25816](https://github.com/apache/superset/pull/25816) build(deps-dev): bump @types/node from 20.8.7 to 20.8.10 in /superset-websocket (@dependabot[bot]) +- [#25530](https://github.com/apache/superset/pull/25530) docs: Add Cyberhaven to Users list (@ghost) +- [#25314](https://github.com/apache/superset/pull/25314) chore(celery): Cleanup config 
and async query specifications (@john-bodley) +- [#25778](https://github.com/apache/superset/pull/25778) build(deps): bump browserify-sign from 4.2.1 to 4.2.2 in /superset-frontend (@dependabot[bot]) +- [#24046](https://github.com/apache/superset/pull/24046) chore(security): Make get_database_perm/get_dataset_perm return optional (@john-bodley) +- [#25765](https://github.com/apache/superset/pull/25765) chore: Add config options for Playwright wait_until and default timeout (@kgabryje) +- [#25721](https://github.com/apache/superset/pull/25721) style(readme): reformatted (@bipinct) +- [#25737](https://github.com/apache/superset/pull/25737) chore: bump pymssql version (@gnought) +- [#25521](https://github.com/apache/superset/pull/25521) chore(websocket): [WIP] Making JWT algos configurable (@craig-rueda) +- [#25735](https://github.com/apache/superset/pull/25735) build(deps-dev): bump eslint from 8.51.0 to 8.52.0 in /superset-websocket (@dependabot[bot]) +- [#25726](https://github.com/apache/superset/pull/25726) chore: updated base DAO find_by_id to return generic type (@zephyring) +- [#25717](https://github.com/apache/superset/pull/25717) refactor: use DATE_TRUNC for Elasticsearch time grain (@mikelv92) +- [#25709](https://github.com/apache/superset/pull/25709) chore: helm chart: bump appVersion to 3.0.1 (@mdavidsen) +- [#25577](https://github.com/apache/superset/pull/25577) chore: Change the format for sha512 sum for releases (@sebastianliebscher) +- [#25689](https://github.com/apache/superset/pull/25689) build(deps-dev): bump @types/cookie from 0.5.1 to 0.5.3 in /superset-websocket (@dependabot[bot]) +- [#25701](https://github.com/apache/superset/pull/25701) build(deps-dev): bump @types/uuid from 9.0.4 to 9.0.6 in /superset-websocket (@dependabot[bot]) +- [#25710](https://github.com/apache/superset/pull/25710) docs(README): Fix typo (@RahulK4102) +- 
[#25700](https://github.com/apache/superset/pull/25700) build(deps-dev): bump @types/node from 20.8.6 to 20.8.7 in /superset-websocket (@dependabot[bot]) +- [#25322](https://github.com/apache/superset/pull/25322) chore: add latest docker tag (@eschutho) +- [#25691](https://github.com/apache/superset/pull/25691) chore: Adds 3.0.1 data to CHANGELOG.md (@michael-s-molina) +- [#25688](https://github.com/apache/superset/pull/25688) build(deps-dev): bump @types/jsonwebtoken from 9.0.3 to 9.0.4 in /superset-websocket (@dependabot[bot]) +- [#25543](https://github.com/apache/superset/pull/25543) chore: Cleanup hostNamesConfig.js (@john-bodley) +- [#25654](https://github.com/apache/superset/pull/25654) docs: make project-specific security page more prominent (@raboof) +- [#25667](https://github.com/apache/superset/pull/25667) chore: sync lock files (@villebro) +- [#25661](https://github.com/apache/superset/pull/25661) build(deps-dev): bump @babel/traverse from 7.16.0 to 7.23.2 in /superset-websocket (@dependabot[bot]) +- [#25653](https://github.com/apache/superset/pull/25653) build(deps-dev): bump @types/node from 20.8.5 to 20.8.6 in /superset-websocket (@dependabot[bot]) +- [#25645](https://github.com/apache/superset/pull/25645) chore: bump pip-tools (@villebro) +- [#25537](https://github.com/apache/superset/pull/25537) docs: invert logo color for dark theme in README (@Sea-n) +- [#25629](https://github.com/apache/superset/pull/25629) chore: adding resource links to readme (@rusackas) +- [#25638](https://github.com/apache/superset/pull/25638) build(ci): Provide diff for pre-commit failures (@jsoref) +- [#25632](https://github.com/apache/superset/pull/25632) build(deps-dev): bump @types/node from 20.8.4 to 20.8.5 in /superset-websocket (@dependabot[bot]) +- [#25455](https://github.com/apache/superset/pull/25455) chore(helm): spelling: initialize (@jsoref) 
+- [#25567](https://github.com/apache/superset/pull/25567) docs: BugHerd Tasks 88, 89, 90, 91 (@mdeshmu) +- [#25602](https://github.com/apache/superset/pull/25602) chore(feature?): Bump `scarf-js` to 1.3.0 to get more telemetry data (@rusackas) +- [#25502](https://github.com/apache/superset/pull/25502) build(deps): bump postcss from 8.3.11 to 8.4.31 in /docs (@dependabot[bot]) +- [#19056](https://github.com/apache/superset/pull/19056) docs: Add timezone information (@john-bodley) +- [#25340](https://github.com/apache/superset/pull/25340) refactor: Issue #25040; Refactored sync_role_definition function in order to reduce number of query. (@suicide11) +- [#25606](https://github.com/apache/superset/pull/25606) build(deps-dev): bump @types/ws from 8.5.6 to 8.5.7 in /superset-websocket (@dependabot[bot]) +- [#25585](https://github.com/apache/superset/pull/25585) build(deps): bump winston from 3.10.0 to 3.11.0 in /superset-websocket (@dependabot[bot]) +- [#25584](https://github.com/apache/superset/pull/25584) build(deps-dev): bump @types/node from 20.8.2 to 20.8.4 in /superset-websocket (@dependabot[bot]) +- [#25566](https://github.com/apache/superset/pull/25566) chore: Update pylint to 2.17.7 (@EugeneTorap) +- [#25574](https://github.com/apache/superset/pull/25574) build(deps-dev): bump eslint from 8.49.0 to 8.51.0 in /superset-websocket (@dependabot[bot]) +- [#25228](https://github.com/apache/superset/pull/25228) chore(sqllab): Typescript for SqlEditor component (@justinpark) +- [#25507](https://github.com/apache/superset/pull/25507) chore(tags): don't allow users to create new tags from property dropdowns (@hughhhh) +- [#25504](https://github.com/apache/superset/pull/25504) chore(tags): move tags column in dashboard and chart list (@lilykuang) +- [#24481](https://github.com/apache/superset/pull/24481) docs: fix for domain sharding results in failed requests 
with "Missing Authorization Header" (@ved-kashyap-samsung) +- [#25508](https://github.com/apache/superset/pull/25508) build(deps): bump ws and @types/ws in /superset-websocket (@dependabot[bot]) +- [#25498](https://github.com/apache/superset/pull/25498) build(deps-dev): bump @types/node from 20.6.0 to 20.8.2 in /superset-websocket (@dependabot[bot]) +- [#25480](https://github.com/apache/superset/pull/25480) docs: define localhost for docker (@mdeshmu) +- [#25479](https://github.com/apache/superset/pull/25479) docs: update docker compose instructions (@mdeshmu) +- [#25482](https://github.com/apache/superset/pull/25482) docs: add a FAQ about asset recovery from UI (@mdeshmu) +- [#25477](https://github.com/apache/superset/pull/25477) docs: add https & ldap instructions (@mdeshmu) +- [#25466](https://github.com/apache/superset/pull/25466) chore(async): Initial Refactoring of Global Async Queries (@craig-rueda) +- [#25120](https://github.com/apache/superset/pull/25120) build(deps-dev): bump prettier from 3.0.2 to 3.0.3 in /superset-websocket (@dependabot[bot]) +- [#25325](https://github.com/apache/superset/pull/25325) build(deps-dev): bump @types/jsonwebtoken from 9.0.2 to 9.0.3 in /superset-websocket (@dependabot[bot]) +- [#25435](https://github.com/apache/superset/pull/25435) docs(FAQ): remove reference to filter box, add Q&A re: usage analytics (@sfirke) +- [#25438](https://github.com/apache/superset/pull/25438) chore: Update Explore tooltip copy (@yousoph) +- [#25465](https://github.com/apache/superset/pull/25465) chore(misc): Typos in config.py (@JZ6) +- [#25457](https://github.com/apache/superset/pull/25457) chore(backend): Spelling (@jsoref) +- [#25456](https://github.com/apache/superset/pull/25456) chore(misc): Spelling (@jsoref) +- [#25453](https://github.com/apache/superset/pull/25453) chore(docs): Spelling (@jsoref) +- 
[#25441](https://github.com/apache/superset/pull/25441) build(deps): bump get-func-name from 2.0.0 to 2.0.2 in /superset-frontend/cypress-base (@dependabot[bot]) +- [#25276](https://github.com/apache/superset/pull/25276) chore: cryptography version bump (@lilykuang) +- [#25332](https://github.com/apache/superset/pull/25332) docs: update docker-compose (@nytai) +- [#25362](https://github.com/apache/superset/pull/25362) chore: upgrade node to most recent 16.x (@villebro) +- [#25360](https://github.com/apache/superset/pull/25360) chore: Adds 3.0 data to CHANGELOG and UPDATING (@michael-s-molina) +- [#25346](https://github.com/apache/superset/pull/25346) chore(async): Making create app configurable (@craig-rueda) +- [#24928](https://github.com/apache/superset/pull/24928) docs: jwks_uri addition to OAUTH provider (@kravi21) +- [#25312](https://github.com/apache/superset/pull/25312) docs: add snowflake-sqlalchemy in ./docker/requirements-local.txt (@janhavitripurwar) +- [#25324](https://github.com/apache/superset/pull/25324) docs: add ReadyTech to INTHEWILD.md (@jbat) +- [#25313](https://github.com/apache/superset/pull/25313) chore: bump gunicorn to v21 (@villebro) +- [#25311](https://github.com/apache/superset/pull/25311) build(deps-dev): bump @types/uuid from 9.0.3 to 9.0.4 in /superset-websocket (@dependabot[bot]) +- [#25274](https://github.com/apache/superset/pull/25274) chore(sqllab): Migrate tests to typescript (@justinpark) +- [#25291](https://github.com/apache/superset/pull/25291) chore: changing one word (disablement -> disabling) (@rusackas) +- [#25287](https://github.com/apache/superset/pull/25287) build(docker): bump geckodriver and firefox to latest (@alekseyolg) +- [#25293](https://github.com/apache/superset/pull/25293) build(deps): bump ws from 8.13.0 to 8.14.1 in /superset-websocket (@dependabot[bot]) +- 
[#25296](https://github.com/apache/superset/pull/25296) docs: rewrite superset docker localhost prose (@jsoref) +- [#25279](https://github.com/apache/superset/pull/25279) build(deps): bump uuid from 9.0.0 to 9.0.1 in /superset-websocket (@dependabot[bot]) +- [#25263](https://github.com/apache/superset/pull/25263) build(deps-dev): bump eslint from 8.48.0 to 8.49.0 in /superset-websocket (@dependabot[bot]) +- [#25253](https://github.com/apache/superset/pull/25253) build(deps-dev): bump @types/node from 20.5.7 to 20.6.0 in /superset-websocket (@dependabot[bot]) +- [#20631](https://github.com/apache/superset/pull/20631) refactor: Remove obsolete HiveEngineSpec.fetch_logs method (@john-bodley) +- [#25226](https://github.com/apache/superset/pull/25226) chore(read_csv): remove deprecated argument (@betodealmeida) +- [#25177](https://github.com/apache/superset/pull/25177) chore: Convert deckgl class components to functional (@kgabryje) +- [#24992](https://github.com/apache/superset/pull/24992) docs(FAQ): add answer re: necessary specs, copy-edit existing answer (@sfirke) +- [#25165](https://github.com/apache/superset/pull/25165) chore: back port 2.1.1 doc changes (@eschutho) +- [#25206](https://github.com/apache/superset/pull/25206) docs: add CVEs for 2.1.1 (@dpgaspar) +- [#25200](https://github.com/apache/superset/pull/25200) docs: fix wrong type in PREFERRED_DATABASES example (@cmontemuino) +- [#25160](https://github.com/apache/superset/pull/25160) chore: fix broken link to Celery worker docs (@wAVeckx) +- [#25142](https://github.com/apache/superset/pull/25142) build(deps-dev): bump @types/uuid from 9.0.2 to 9.0.3 in /superset-websocket (@dependabot[bot]) +- [#25141](https://github.com/apache/superset/pull/25141) build(deps): bump jsonwebtoken from 9.0.1 to 9.0.2 in /superset-websocket (@dependabot[bot]) +- 
[#25140](https://github.com/apache/superset/pull/25140) build(deps): bump jsonwebtoken from 9.0.1 to 9.0.2 in /superset-websocket/utils/client-ws-app (@dependabot[bot]) +- [#25088](https://github.com/apache/superset/pull/25088) chore: consolidate sqllab store into SPA store (@justinpark) +- [#25121](https://github.com/apache/superset/pull/25121) chore: move TypedDict from typing_extensions to typing (@sebastianliebscher) +- [#24896](https://github.com/apache/superset/pull/24896) chore: use contextlib.surpress instead of passing on error (@sebastianliebscher) +- [#25098](https://github.com/apache/superset/pull/25098) build(deps-dev): bump eslint from 8.47.0 to 8.48.0 in /superset-websocket (@dependabot[bot]) +- [#25097](https://github.com/apache/superset/pull/25097) build(deps-dev): bump @types/node from 20.5.6 to 20.5.7 in /superset-websocket (@dependabot[bot]) +- [#24933](https://github.com/apache/superset/pull/24933) chore: Refactor deck.gl plugins to Typescript (@kgabryje) +- [#24980](https://github.com/apache/superset/pull/24980) chore: Update docs for docker-compose installation (@hughhhh) +- [#24771](https://github.com/apache/superset/pull/24771) docs(docker-compose): add missing parenthesis (@sfirke) +- [#25082](https://github.com/apache/superset/pull/25082) build(deps-dev): bump @types/node from 20.5.1 to 20.5.6 in /superset-websocket (@dependabot[bot]) +- [#25080](https://github.com/apache/superset/pull/25080) chore(reports): add metrics to report schedule and log prune (@villebro) +- [#25047](https://github.com/apache/superset/pull/25047) chore(sqllab): typescript for getInitialState (@justinpark) +- [#25030](https://github.com/apache/superset/pull/25030) build(deps): Bump PyHive (@mdeshmu) +- [#24872](https://github.com/apache/superset/pull/24872) test(cypress): Fail Cypress on Console errors (@rusackas) +- 
[#25046](https://github.com/apache/superset/pull/25046) chore: Organizes the files of the ReportModal feature (@michael-s-molina) +- [#25045](https://github.com/apache/superset/pull/25045) chore(tests): Adding missing `__init__.py` files to various test packages (@craig-rueda) +- [#25038](https://github.com/apache/superset/pull/25038) build(deps-dev): bump @types/node from 20.5.0 to 20.5.1 in /superset-websocket (@dependabot[bot]) +- [#25010](https://github.com/apache/superset/pull/25010) chore(sqllab): Relocate user in SqlLab to root (@justinpark) +- [#25034](https://github.com/apache/superset/pull/25034) docs: fix line break in Apache Druid page (@giuliotal) +- [#24994](https://github.com/apache/superset/pull/24994) chore: rename `get_iterable` (@betodealmeida) +- [#25007](https://github.com/apache/superset/pull/25007) chore: Removes Saved Query old code (@michael-s-molina) +- [#24894](https://github.com/apache/superset/pull/24894) chore: Update DAOs to use singular deletion method instead of bulk (@jfrag1) +- [#25005](https://github.com/apache/superset/pull/25005) chore: Removes src/modules top folder (@michael-s-molina) +- [#24998](https://github.com/apache/superset/pull/24998) build(deps-dev): bump prettier from 3.0.1 to 3.0.2 in /superset-websocket (@dependabot[bot]) +- [#24941](https://github.com/apache/superset/pull/24941) chore(dashboard import/export): include additional fields to export/import commands (@Vitor-Avila) +- [#24967](https://github.com/apache/superset/pull/24967) chore(dao): Remove redundant convenience methods (@john-bodley) +- [#24973](https://github.com/apache/superset/pull/24973) build(deps-dev): bump @types/node from 20.4.9 to 20.5.0 in /superset-websocket (@dependabot[bot]) +- [#24972](https://github.com/apache/superset/pull/24972) build(deps-dev): bump eslint from 8.46.0 to 8.47.0 in /superset-websocket (@dependabot[bot]) +-
[#24936](https://github.com/apache/superset/pull/24936) chore(sqllab): Relocate get bootstrap data logic (@justinpark) +- [#24467](https://github.com/apache/superset/pull/24467) chore(dao): Replace save/overwrite with create/update respectively (@john-bodley) +- [#24962](https://github.com/apache/superset/pull/24962) docs: Add wattbewerb to users list (@hbruch) +- [#24961](https://github.com/apache/superset/pull/24961) chore: Add Automattic to the list of users and contributors (@Khrol) +- [#24958](https://github.com/apache/superset/pull/24958) build(deps): bump tough-cookie and @cypress/request in /superset-frontend/cypress-base (@dependabot[bot]) +- [#24920](https://github.com/apache/superset/pull/24920) docs: Fixing Superset typo in docker-compose local installation guide (@TannerBarcelos) +- [#24924](https://github.com/apache/superset/pull/24924) build(deps-dev): bump @types/node from 20.4.8 to 20.4.9 in /superset-websocket (@dependabot[bot]) +- [#24915](https://github.com/apache/superset/pull/24915) docs: fix tip box in "Installing From Scratch" page (@giuliotal) +- [#24878](https://github.com/apache/superset/pull/24878) build(deps-dev): bump prettier from 2.8.8 to 3.0.1 in /superset-websocket (@dependabot[bot]) +- [#24900](https://github.com/apache/superset/pull/24900) build(deps-dev): bump eslint-config-prettier from 8.10.0 to 9.0.0 in /superset-websocket (@dependabot[bot]) +- [#24901](https://github.com/apache/superset/pull/24901) build(deps-dev): bump @types/node from 20.4.7 to 20.4.8 in /superset-websocket (@dependabot[bot]) +- [#24888](https://github.com/apache/superset/pull/24888) build(deps-dev): bump @types/node from 20.4.6 to 20.4.7 in /superset-websocket (@dependabot[bot]) +- [#24880](https://github.com/apache/superset/pull/24880) build(deps-dev): bump @types/node from 20.4.5 to 20.4.6 in /superset-websocket (@dependabot[bot]) +- 
[#24879](https://github.com/apache/superset/pull/24879) build(deps-dev): bump eslint-config-prettier from 8.8.0 to 8.10.0 in /superset-websocket (@dependabot[bot]) +- [#24873](https://github.com/apache/superset/pull/24873) docs(native-filters): Remove outdated statement (@john-bodley) +- [#24657](https://github.com/apache/superset/pull/24657) chore: Bump cryptography (@suryadev99) +- [#24842](https://github.com/apache/superset/pull/24842) build(deps-dev): bump eslint from 8.45.0 to 8.46.0 in /superset-websocket (@dependabot[bot]) +- [#24838](https://github.com/apache/superset/pull/24838) chore(api): clean up API spec (@sebastianliebscher) +- [#24834](https://github.com/apache/superset/pull/24834) docs(Kubernetes): Fix typos, clarify language re: Scarf (@sfirke) +- [#24819](https://github.com/apache/superset/pull/24819) chore: remove get_columns_description duplication (@betodealmeida) +- [#24817](https://github.com/apache/superset/pull/24817) docs: Adding a couple links to contributing page (@rusackas) +- [#24820](https://github.com/apache/superset/pull/24820) docs: fixing stack overflow link (@rusackas) +- [#24809](https://github.com/apache/superset/pull/24809) build(deps-dev): bump @types/node from 20.4.4 to 20.4.5 in /superset-websocket (@dependabot[bot]) +- [#19959](https://github.com/apache/superset/pull/19959) docs(K8s): Add instructions for loading the examples (@charris-msft) +- [#24147](https://github.com/apache/superset/pull/24147) chore: bump postgresql in docker-compose and github workflows (@sebastianliebscher) +- [#24779](https://github.com/apache/superset/pull/24779) build(deps-dev): bump @types/node from 20.4.2 to 20.4.4 in /superset-websocket (@dependabot[bot]) +- [#24751](https://github.com/apache/superset/pull/24751) docs: update AWS Athena and Redshift docs (@mdeshmu) +- [#24461](https://github.com/apache/superset/pull/24461) 
docs(docker-compose): note the risk of running a Docker Postgres volume in production (@sfirke) +- [#24705](https://github.com/apache/superset/pull/24705) chore(deps): bump pandas >=2.0 (@sebastianliebscher) +- [#24732](https://github.com/apache/superset/pull/24732) build(deps-dev): bump word-wrap from 1.2.3 to 1.2.4 in /superset-websocket (@dependabot[bot]) +- [#24715](https://github.com/apache/superset/pull/24715) chore: update deprecated arguments in schema (@sebastianliebscher) +- [#24733](https://github.com/apache/superset/pull/24733) build(deps): bump word-wrap from 1.2.3 to 1.2.4 in /superset-frontend/cypress-base (@dependabot[bot]) +- [#24735](https://github.com/apache/superset/pull/24735) build(deps-dev): bump word-wrap from 1.2.3 to 1.2.4 in /superset-frontend (@dependabot[bot]) +- [#24734](https://github.com/apache/superset/pull/24734) build(deps-dev): bump word-wrap from 1.2.3 to 1.2.4 in /superset-embedded-sdk (@dependabot[bot]) +- [#24712](https://github.com/apache/superset/pull/24712) build(deps-dev): bump eslint from 8.44.0 to 8.45.0 in /superset-websocket (@dependabot[bot]) +- [#24669](https://github.com/apache/superset/pull/24669) chore: remove obsolete fetchExploreJson function (@john-bodley) +- [#24682](https://github.com/apache/superset/pull/24682) build(deps-dev): bump @types/node from 20.4.1 to 20.4.2 in /superset-websocket (@dependabot[bot]) +- [#24674](https://github.com/apache/superset/pull/24674) docs: Fix typo in Rockset docs (@gadhagod) +- [#24672](https://github.com/apache/superset/pull/24672) build(deps-dev): bump @types/node from 20.4.0 to 20.4.1 in /superset-websocket (@dependabot[bot]) +- [#24673](https://github.com/apache/superset/pull/24673) build(deps-dev): bump @typescript-eslint/parser from 5.61.0 to 5.62.0 in /superset-websocket (@dependabot[bot]) +- [#24649](https://github.com/apache/superset/pull/24649) chore: Update 
Rockset—switching out rockset for rockset-sqlalchemy (@gadhagod) +- [#24653](https://github.com/apache/superset/pull/24653) build(deps): bump semver from 5.7.1 to 5.7.2 in /superset-frontend (@dependabot[bot]) +- [#24654](https://github.com/apache/superset/pull/24654) build(deps): bump semver from 6.3.0 to 6.3.1 in /superset-websocket (@dependabot[bot]) +- [#24655](https://github.com/apache/superset/pull/24655) build(deps): bump semver from 6.3.0 to 6.3.1 in /superset-frontend/cypress-base (@dependabot[bot]) +- [#24656](https://github.com/apache/superset/pull/24656) build(deps): bump trim and @superset-ui/core in /superset-frontend/cypress-base (@dependabot[bot]) +- [#24659](https://github.com/apache/superset/pull/24659) build(deps): bump winston from 3.9.0 to 3.10.0 in /superset-websocket (@dependabot[bot]) +- [#24626](https://github.com/apache/superset/pull/24626) chore: Re-enable some GitHub action workflows in draft mode (@john-bodley) +- [#24633](https://github.com/apache/superset/pull/24633) docs(databases): correct the way of using use environment variables (@duyet) +- [#24648](https://github.com/apache/superset/pull/24648) chore: update UI dev libs and fix warnings & vulnerabilities (@EugeneTorap) +- [#24651](https://github.com/apache/superset/pull/24651) build(deps): bump semver from 5.7.1 to 5.7.2 in /docs (@dependabot[bot]) +- [#24634](https://github.com/apache/superset/pull/24634) build(deps): bump tough-cookie from 4.0.0 to 4.1.3 in /superset-embedded-sdk (@dependabot[bot]) +- [#24614](https://github.com/apache/superset/pull/24614) build(deps): bump jsonwebtoken from 9.0.0 to 9.0.1 in /superset-websocket (@dependabot[bot]) +- [#23987](https://github.com/apache/superset/pull/23987) docs(frontend): Fixed typo in command (@ved-kashyap-samsung) +- [#23992](https://github.com/apache/superset/pull/23992) docs: correct databricks pip package name 
(@devonkinghorn) +- [#24632](https://github.com/apache/superset/pull/24632) build(deps): bump tough-cookie from 4.0.0 to 4.1.3 in /superset-websocket (@dependabot[bot]) +- [#24585](https://github.com/apache/superset/pull/24585) build(deps-dev): bump @typescript-eslint/eslint-plugin from 5.60.1 to 5.61.0 in /superset-websocket (@dependabot[bot]) +- [#24576](https://github.com/apache/superset/pull/24576) build(deps-dev): bump eslint from 8.43.0 to 8.44.0 in /superset-websocket (@dependabot[bot]) +- [#24601](https://github.com/apache/superset/pull/24601) build(deps-dev): bump @types/node from 20.3.2 to 20.4.0 in /superset-websocket (@dependabot[bot]) +- [#24584](https://github.com/apache/superset/pull/24584) build(deps-dev): bump @typescript-eslint/parser from 5.60.1 to 5.61.0 in /superset-websocket (@dependabot[bot]) +- [#24600](https://github.com/apache/superset/pull/24600) build(deps): bump jsonwebtoken from 9.0.0 to 9.0.1 in /superset-websocket/utils/client-ws-app (@dependabot[bot]) +- [#24570](https://github.com/apache/superset/pull/24570) docs(helm): reference the correct chart (@muniter) +- [#24564](https://github.com/apache/superset/pull/24564) docs: add notice not to use gevent worker with bigquery datasource (@okayhooni) +- [#24578](https://github.com/apache/superset/pull/24578) refactor: pkg_resources -> importlib.resources (@cwegener) +- [#24523](https://github.com/apache/superset/pull/24523) build(deps-dev): bump @typescript-eslint/eslint-plugin from 5.60.0 to 5.60.1 in /superset-websocket (@dependabot[bot]) + +### 3.0.3 (Fri Dec 8 05:40:09 2023 -0800) + +**Fixes** + +- [#26215](https://github.com/apache/superset/pull/26215) fix(plugin-chart-echarts): support truncated numeric x-axis (@villebro) +- [#26199](https://github.com/apache/superset/pull/26199) fix(chart-filter): Avoid column denormalization if not enabled (@Vitor-Avila) +- 
[#26211](https://github.com/apache/superset/pull/26211) fix: support custom links in markdown (@villebro) +- [#26189](https://github.com/apache/superset/pull/26189) fix(dashboard): title formatting (@nytai) +- [#26207](https://github.com/apache/superset/pull/26207) fix: Includes 90° x-axis label rotation (@michael-s-molina) +- [#26157](https://github.com/apache/superset/pull/26157) fix(init-job): Fix envFrom for init job in helm chart (@sumagoudb) +- [#25878](https://github.com/apache/superset/pull/25878) fix(embedded): Hide sensitive payload data from guest users (@jfrag1) +- [#25894](https://github.com/apache/superset/pull/25894) fix(Alerts/Reports): allow use of ";" separator in slack recipient entry (@rtexelm) +- [#26116](https://github.com/apache/superset/pull/26116) fix(database-import): Support importing a DB connection with a version set (@Vitor-Avila) +- [#26154](https://github.com/apache/superset/pull/26154) fix: set label on adhoc column should persist (@betodealmeida) +- [#26140](https://github.com/apache/superset/pull/26140) fix(annotations): time grain column (@betodealmeida) +- [#23916](https://github.com/apache/superset/pull/23916) fix: remove default secret key from helm (@dpgaspar) +- [#26120](https://github.com/apache/superset/pull/26120) fix: alias column when fetching values (@betodealmeida) +- [#26106](https://github.com/apache/superset/pull/26106) fix: flaky test_explore_json_async test v2 (@villebro) +- [#26091](https://github.com/apache/superset/pull/26091) fix: bump node-fetch to 2.6.7 (@dpgaspar) +- [#26087](https://github.com/apache/superset/pull/26087) fix(plugin-chart-echarts): support numerical x-axis (@villebro) +- [#26059](https://github.com/apache/superset/pull/26059) fix: Flaky test_explore_json_async test (@michael-s-molina) +- [#26023](https://github.com/apache/superset/pull/26023) fix: Prevent cached 
bootstrap data from leaking between users w/ same first/last name (@jfrag1) +- [#26060](https://github.com/apache/superset/pull/26060) fix: Optimize fetching samples logic (@john-bodley) +- [#26010](https://github.com/apache/superset/pull/26010) fix: Remove annotation Fuzzy to get french translation (@aehanno) +- [#26005](https://github.com/apache/superset/pull/26005) fix(security): restore default value of SESSION_COOKIE_SECURE to False (@sfirke) +- [#25883](https://github.com/apache/superset/pull/25883) fix(horizontal filter bar filter labels): Increase max-width to 96px (@rtexelm) + +**Others** + +- [#26208](https://github.com/apache/superset/pull/26208) chore: Adds note about numerical x-axis (@michael-s-molina) +- [#26158](https://github.com/apache/superset/pull/26158) chore: Clean up the examples dashboards (@michael-s-molina) +- [#25931](https://github.com/apache/superset/pull/25931) chore(deps): bump pillow deps (@gnought) + +### 3.0.2 (Mon Nov 20 07:38:38 2023 -0500) + +**Fixes** + +- [#26037](https://github.com/apache/superset/pull/26037) fix: update FAB to 4.3.10, Azure user info fix (@dpgaspar) +- [#25901](https://github.com/apache/superset/pull/25901) fix(native filters): rendering performance improvement by reduce overrendering (@justinpark) +- [#25985](https://github.com/apache/superset/pull/25985) fix(explore): redandant force param (@justinpark) +- [#25993](https://github.com/apache/superset/pull/25993) fix: Make Select component fire onChange listener when a selection is pasted in (@jfrag1) +- [#25997](https://github.com/apache/superset/pull/25997) fix(rls): Update text from tables to datasets in RLS modal (@yousoph) +- [#25703](https://github.com/apache/superset/pull/25703) fix(helm): Restart all related deployments when bootstrap script changed (@josedev-union) +- [#25973](https://github.com/apache/superset/pull/25973) fix: naming 
denomalized to denormalized in helpers.py (@hughhhh) +- [#25919](https://github.com/apache/superset/pull/25919) fix: always denorm column value before querying values (@hughhhh) +- [#25947](https://github.com/apache/superset/pull/25947) fix: update flask-caching to avoid breaking redis cache, solves #25339 (@ggbaro) +- [#25903](https://github.com/apache/superset/pull/25903) fix(sqllab): invalid sanitization on comparison symbol (@justinpark) +- [#25857](https://github.com/apache/superset/pull/25857) fix(table): Double percenting ad-hoc percentage metrics (@john-bodley) +- [#25872](https://github.com/apache/superset/pull/25872) fix(trino): allow impersonate_user flag to be imported (@FGrobelny) +- [#25897](https://github.com/apache/superset/pull/25897) fix: trino cursor (@betodealmeida) +- [#25898](https://github.com/apache/superset/pull/25898) fix: database version field (@betodealmeida) +- [#25877](https://github.com/apache/superset/pull/25877) fix: Saving Mixed Chart with dashboard filter applied breaks adhoc_filter_b (@kgabryje) +- [#25842](https://github.com/apache/superset/pull/25842) fix(charts): Time grain is None when dataset uses Jinja (@Antonio-RiveroMartnez) +- [#25843](https://github.com/apache/superset/pull/25843) fix: remove `update_charts_owners` (@betodealmeida) +- [#25707](https://github.com/apache/superset/pull/25707) fix(table chart): Show Cell Bars correctly #25625 (@SA-Ark) +- [#25429](https://github.com/apache/superset/pull/25429) fix: the temporal x-axis results in a none time_range. 
(@mapledan) +- [#25853](https://github.com/apache/superset/pull/25853) fix: Fires onChange when clearing all values of single select (@michael-s-molina) +- [#25814](https://github.com/apache/superset/pull/25814) fix(sqllab): infinite fetching status after results are landed (@justinpark) +- [#25768](https://github.com/apache/superset/pull/25768) fix(SQL field in edit dataset modal): display full sql query (@rtexelm) +- [#25804](https://github.com/apache/superset/pull/25804) fix: Resolve issue #24195 (@john-bodley) +- [#25801](https://github.com/apache/superset/pull/25801) fix: Revert "fix: Apply normalization to all dttm columns (#25147)" (@john-bodley) +- [#25779](https://github.com/apache/superset/pull/25779) fix: DB-specific quoting in Jinja macro (@betodealmeida) +- [#25640](https://github.com/apache/superset/pull/25640) fix: allow for backward compatible errors (@eschutho) +- [#25741](https://github.com/apache/superset/pull/25741) fix(sqllab): slow pop datasource query (@justinpark) +- [#25756](https://github.com/apache/superset/pull/25756) fix: dataset update uniqueness (@betodealmeida) +- [#25753](https://github.com/apache/superset/pull/25753) fix: Revert "fix(Charts): Set max row limit + removed the option to use an empty row limit value" (@geido) +- [#25732](https://github.com/apache/superset/pull/25732) fix(horizontal filter label): show full tooltip with ellipsis (@rtexelm) +- [#25712](https://github.com/apache/superset/pull/25712) fix: bump to FAB 4.3.9 remove CSP exception (@dpgaspar) +- [#24709](https://github.com/apache/superset/pull/24709) fix(chore): dashboard requests to database equal the number of slices it has (@Always-prog) +- [#25679](https://github.com/apache/superset/pull/25679) fix: remove unnecessary redirect (@Khrol) +- [#25680](https://github.com/apache/superset/pull/25680) fix(sqllab): reinstate "Force trino client 
async execution" (@giftig) +- [#25657](https://github.com/apache/superset/pull/25657) fix(dremio): Fixes issue with Dremio SQL generation for Charts with Series Limit (@OskarNS) +- [#23638](https://github.com/apache/superset/pull/23638) fix: warning of nth-child (@justinpark) +- [#25658](https://github.com/apache/superset/pull/25658) fix: improve upload ZIP file validation (@dpgaspar) +- [#25495](https://github.com/apache/superset/pull/25495) fix(header navlinks): link navlinks to path prefix (@fisjac) +- [#25112](https://github.com/apache/superset/pull/25112) fix: permalink save/overwrites in explore (@hughhhh) +- [#25493](https://github.com/apache/superset/pull/25493) fix(import): Make sure query context is overwritten for overwriting imports (@jfrag1) +- [#25553](https://github.com/apache/superset/pull/25553) fix: avoid 500 errors with SQLLAB_BACKEND_PERSISTENCE (@Khrol) +- [#25626](https://github.com/apache/superset/pull/25626) fix(sqllab): template validation error within comments (@justinpark) +- [#25523](https://github.com/apache/superset/pull/25523) fix(sqllab): Mistitled for new tab after rename (@justinpark) + +**Others** + +- [#25995](https://github.com/apache/superset/pull/25995) chore: Optimize fetching samples logic (@john-bodley) +- [#23619](https://github.com/apache/superset/pull/23619) chore(colors): Updating Airbnb brand colors (@john-bodley) + ### 3.0.1 (Tue Oct 13 10:32:21 2023 -0700) **Database Migrations** @@ -849,6 +1337,47 @@ under the License. 
- [#23158](https://github.com/apache/superset/pull/23158) chore: Bump cryptography to 39.0.1 (@EugeneTorap) - [#23108](https://github.com/apache/superset/pull/23108) chore: Remove yarn.lock from the root folder (@EugeneTorap) +### 2.1.3 (Fri Dec 8 16:36:51 2023 -0700) + +**Database Migrations** + +**Features** + +**Fixes** + +- [#25658](https://github.com/apache/superset/pull/25658) fix: improve upload ZIP file validation (@dpgaspar) +- [#25779](https://github.com/apache/superset/pull/25779) fix: DB-specific quoting in Jinja macro (@betodealmeida) +- [#25843](https://github.com/apache/superset/pull/25843) fix: remove `update_charts_owners` (@betodealmeida) + +**Others** + +- [#23862](https://github.com/apache/superset/pull/23862) chore: Use nh3 lib instead of bleach (@EugeneTorap) +- [#23965](https://github.com/apache/superset/pull/23965) chore: bump werkzeug and Flask (@dpgaspar) +- [#24033](https://github.com/apache/superset/pull/24033) chore: Update mypy and fix stubs issue (@EugeneTorap) +- [#24045](https://github.com/apache/superset/pull/24045) chore: Bump sqlparse to 0.4.4 (@EugeneTorap) +- [#24324](https://github.com/apache/superset/pull/24324) chore: rate limit requests (@betodealmeida) + +### 2.1.2 (Wed Oct 18 16:59:30 2023 -0700) + +**Database Migrations** + +**Features** + +**Fixes** + +- [#25150](https://github.com/apache/superset/pull/25150) fix: Chart series limit doesn't work for some databases (@KSPT-taylorjohn) +- [#25014](https://github.com/apache/superset/pull/25014) fix: CTE queries with non-SELECT statements (@dpgaspar) +- [#24849](https://github.com/apache/superset/pull/24849) fix: validation errors appearing after ssh tunnel switch (@hughhhh) +- [#24196](https://github.com/apache/superset/pull/24196) fix: SSH Tunnel creation with dynamic form (@hughhhh) +- [#24821](https://github.com/apache/superset/pull/24821) fix: Allow 
chart import to update the dataset an existing chart points to (@jfrag1) +- [#24317](https://github.com/apache/superset/pull/24317) fix: update order of build for testing a release (@eschutho) + +**Others** + +- [#24826](https://github.com/apache/superset/pull/24826) chore: remove CssTemplate and Annotation access from gamma role (@lilykuang) +- [#23680](https://github.com/apache/superset/pull/23680) chore: bump wtforms and add missing flask-limiter (@dpgaspar) +- [#24758](https://github.com/apache/superset/pull/24758) chore(view_api): return application/json as content-type for api/v1/form_data endpoint (@zephyring) + ### 2.1.1 (Sun Apr 23 15:44:21 2023 +0100) **Database Migrations** diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index d9e480ee95e5e..a955f123dbf59 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -180,6 +180,51 @@ See [Translating](#translating) for more details. There is a dedicated [`apache-superset` tag](https://stackoverflow.com/questions/tagged/apache-superset) on [StackOverflow](https://stackoverflow.com/). Please use it when asking questions. +## Types of Contributors + +Following the project governance model of the Apache Software Foundation (ASF), Apache Superset has a specific set of contributor roles: + +### PMC Member + +A Project Management Committee (PMC) member is a person who has been elected by the PMC to help manage the project. PMC members are responsible for the overall health of the project, including community development, release management, and project governance. PMC members are also responsible for the technical direction of the project. + +For more information about Apache Project PMCs, please refer to https://www.apache.org/foundation/governance/pmcs.html + +### Committer + +A committer is a person who has been elected by the PMC to have write access (commit access) to the code repository. They can modify the code, documentation, and website and accept contributions from others. 
+ +The official list of committers and PMC members can be found [here](https://projects.apache.org/committee.html?superset). + +### Contributor + +A contributor is a person who has contributed to the project in any way, including but not limited to code, tests, documentation, issues, and discussions. + +> You can also review the Superset project's guidelines for PMC member promotion here: https://github.com/apache/superset/wiki/Guidelines-for-promoting-Superset-Committers-to-the-Superset-PMC + +### Security Team + +The security team is a selected subset of PMC members, committers, and non-committers who are responsible for handling security issues. + +New members of the security team are selected by the PMC members in a vote. You can request to be added to the team by sending a message to private@superset.apache.org. However, because the team should stay small and focused on solving security issues, requests are evaluated on a case-by-case basis and membership is limited to actively security-focused contributors. + +This security team must follow the [ASF vulnerability handling process](https://apache.org/security/committers.html#asf-project-security-for-committers). + +Each new security issue is tracked as a JIRA ticket on the [ASF's JIRA Superset security project](https://issues.apache.org/jira/secure/RapidBoard.jspa?rapidView=588&projectKey=SUPERSETSEC). + +Security team members must: + +- Have an [ICLA](https://www.apache.org/licenses/contributor-agreements.html) signed with the Apache Software Foundation. +- Not reveal information about pending and unfixed security issues to anyone (including their employers) unless specifically authorised by the security team members, e.g., if the security team agrees that diagnosing and solving an issue requires the involvement of external experts.
+ +A release manager, the contributor overseeing the release of a specific version of Apache Superset, is by default a member of the security team. However, they are not expected to be active in assessing, discussing, and fixing security issues. + +Security team members should also follow these general expectations: + +- Actively participate in assessing, discussing, fixing, and releasing security issues in Superset. +- Avoid discussing security fixes in public forums. Pull request (PR) descriptions should not contain any information about security issues. The corresponding JIRA ticket should contain a link to the PR. +- Security team members who contribute to a fix may be listed as remediation developers in the CVE report, along with their job affiliation (if they choose to include it). + ## Pull Request Guidelines A philosophy we would like to strongly encourage is @@ -424,7 +469,7 @@ Commits to `master` trigger a rebuild and redeploy of the documentation site. Su Make sure your machine meets the [OS dependencies](https://superset.apache.org/docs/installation/installing-superset-from-scratch#os-dependencies) before following these steps. You also need to install MySQL or [MariaDB](https://mariadb.com/downloads). -Ensure that you are using Python version 3.8, 3.9, 3.10 or 3.11, then proceed with: +Ensure that you are using Python version 3.9, 3.10 or 3.11, then proceed with: ```bash # Create a virtual environment and activate it (recommended) ``` @@ -610,6 +655,31 @@ Then put this: export NODE_OPTIONS=--no-experimental-fetch ``` +If, while using the above commands, you encounter an error related to the limit of file watchers: + +```bash +Error: ENOSPC: System limit for number of file watchers reached +``` +The error is thrown because the number of files monitored by the system has reached the limit. +You can address this error by increasing the number of inotify watchers.
+ + +The current value of max watches can be checked with: +```bash +cat /proc/sys/fs/inotify/max_user_watches +``` +Edit the file /etc/sysctl.conf to increase this value. +The appropriate value depends on the system memory [(see this StackOverflow answer for more context)](https://stackoverflow.com/questions/535768/what-is-a-reasonable-amount-of-inotify-watches-with-linux). + +Open the file in an editor and add a line at the bottom specifying the max watches value. +```bash +fs.inotify.max_user_watches=524288 +``` +Save the file and exit the editor. +To confirm that the change succeeded, run the following command to load the updated value of max_user_watches from sysctl.conf: +```bash +sudo sysctl -p +``` #### Webpack dev server The dev server by default starts at `http://localhost:9000` and proxies the backend requests to `http://localhost:8088`. diff --git a/README.md b/README.md index 757c0fb50364e..3588d9941913b 100644 --- a/README.md +++ b/README.md @@ -130,6 +130,7 @@ Here are some of the major database solutions that are supported: <img src="superset-frontend/src/assets/images/yugabyte.png" alt="yugabyte" border="0" width="200" height="80"/> <img src="superset-frontend/src/assets/images/databend.png" alt="databend" border="0" width="200" height="80"/> <img src="superset-frontend/src/assets/images/starrocks.png" alt="starrocks" border="0" width="200" height="80"/> + <img src="superset-frontend/src/assets/images/doris.png" alt="doris" border="0" width="200" height="80"/> </p> **A more comprehensive list of supported databases** along with the configuration instructions can be found [here](https://superset.apache.org/docs/databases/installing-database-drivers). diff --git a/RELEASING/README.md b/RELEASING/README.md index 8b23dafbf1eea..b007a891700b9 100644 --- a/RELEASING/README.md +++ b/RELEASING/README.md @@ -30,6 +30,7 @@ partaking in the process should join the channel.
## Release notes for recent releases +- [3.1](release-notes-3-1/README.md) - [2.0](release-notes-2-0/README.md) - [1.5](release-notes-1-5/README.md) - [1.4](release-notes-1-4/README.md) diff --git a/RELEASING/email_templates/announce.j2 b/RELEASING/email_templates/announce.j2 index 4eb89701be7ed..5e2318f79219a 100644 --- a/RELEASING/email_templates/announce.j2 +++ b/RELEASING/email_templates/announce.j2 @@ -35,6 +35,12 @@ The PyPI package: https://pypi.org/project/apache-superset/ +The Change Log for the release: +https://github.com/apache/{{ project_module }}/blob/{{ version }}/CHANGELOG.md + +The Updating instructions for the release: +https://github.com/apache/{{ project_module }}/blob/{{ version }}/UPDATING.md + If you have any usage questions or have problems when upgrading or find any issues with enhancements included in this release, please don't hesitate to let us know by sending feedback to this mailing diff --git a/RELEASING/release-notes-3-1/README.md b/RELEASING/release-notes-3-1/README.md new file mode 100644 index 0000000000000..97635139b1379 --- /dev/null +++ b/RELEASING/release-notes-3-1/README.md @@ -0,0 +1,166 @@ +<!-- +Licensed to the Apache Software Foundation (ASF) under one +or more contributor license agreements. See the NOTICE file +distributed with this work for additional information +regarding copyright ownership. The ASF licenses this file +to you under the Apache License, Version 2.0 (the +"License"); you may not use this file except in compliance +with the License. You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + +Unless required by applicable law or agreed to in writing, +software distributed under the License is distributed on an +"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +KIND, either express or implied. See the License for the +specific language governing permissions and limitations +under the License. 
+--> + +# Release Notes for Superset 3.1.0 + +Superset 3.1.0 brings a range of new features and quality-of-life improvements. This release is a minor version, meaning it includes no breaking changes, ensuring a seamless transition for our users. Here are some of the highlights of this release. + +### Waterfall chart + +The new [Waterfall chart](https://github.com/apache/superset/pull/25557) visualization provides a visual representation of how a value changes over time or across different categories. It is especially helpful for showing the cumulative effect of positive and negative changes from a starting value. Superset's Waterfall chart supports Breakdowns, which can be used to analyze the contribution of different dimensions or factors to a specific metric. By breaking down the data into various categories or dimensions, you can identify the individual components that contribute to the overall variation or change in the metric. + +The chart example below displays the total sales grouped by year and broken down by product line. + +![Waterfall](media/waterfall_chart.png) + +### Bubble Chart ECharts version + +The new ECharts [Bubble chart](https://github.com/apache/superset/pull/22107) offers feature parity with the previous NVD3 version, which is slated for removal in the next major release. This work is part of the [ECharts migration effort](https://github.com/apache/superset/issues/10418) to increase the consistency and quality of our plugins. We'll soon add a migration to the new plugin, which you'll be able to execute using the new CLI command. + +![Bubble](media/bubble_chart.png) + +### Improved Dataset selectors + +The [dataset selectors](https://github.com/apache/superset/pull/25569) have been improved to also display the database and schema names, which will help users locate the correct dataset, particularly when multiple tables or datasets share the same name.
+ +![Dataset](media/dataset_selector.png) + +### SQL Lab improvements + +SQL Lab received many user experience and performance improvements in this release. We’ll continue to improve the capabilities of SQL Lab with feedback from the community. + +Now users can [automatically format](https://github.com/apache/superset/pull/25344) their SQL queries using the `Ctrl+Shift+F` shortcut or the Format SQL menu option available in the SQL configuration panel. Another improvement is that the results panel now shows the [executed query](https://github.com/apache/superset/pull/24787), which is very helpful when your SQL Lab editor has multiple queries. + +![SQL Formatting](media/sql_formatting.png) + +In the SQL panel configurations, there's a menu option to show the [keyboard shortcuts](https://github.com/apache/superset/pull/25542) a user has access to. + +![Keyboard Shortcuts](media/keyboard_shortcuts.png) + +SQL Lab has launched a non-blocking persistence mode, as outlined in [SIP-93](https://github.com/apache/superset/issues/21385). This enhancement ensures that your SQL editor content is preserved, even if your internet or service goes offline. Moreover, it improves user interaction by saving changes in a non-blocking way, similar to how Google Docs does. + +Finally, the [SQL Lab module was moved to the Single Page Application](https://github.com/apache/superset/pull/25151) context. This means that both navigation and load times for that module are significantly faster than in previous versions (particularly when navigating to and from this page from other pages in Superset). This also reduces the number of requests to the server and pays down some of our technical debt. Try it out! The difference is quite impressive! + +### Country Map improvements + +The Country Map visualization received some improvements in this release.
The community added [France's regions](https://github.com/apache/superset/pull/25676) in addition to its departments, as well as several [Central Asian countries](https://github.com/apache/superset/pull/24870). + +<table> + <tr> + <td width="33%">France's regions</td> + <td width="33%">Kazakhstan</td> + <td width="33%">Kyrgyzstan</td> + </tr> + <tr> + <td width="33%"><img src="media/france.png" width="100%"/></td> + <td width="33%"><img src="media/kazakhstan.png" width="100%"></td> + <td width="33%"><img src="media/kyrgyzstan.png" width="100%"></td> + </tr> + <tr> + <td width="33%">Tajikistan</td> + <td width="33%">Turkmenistan</td> + <td width="33%">Uzbekistan</td> + </tr> + <tr> + <td width="33%"><img src="media/tajikistan.png" width="100%"/></td> + <td width="33%"><img src="media/turkmenistan.png" width="100%"></td> + <td width="33%"><img src="media/uzbekistan.png" width="100%"></td> + </tr> +</table> + +### Deck.gl ContourLayer + +We [added](https://github.com/apache/superset/pull/24154) the Deck.gl [ContourLayer](https://deck.gl/docs/api-reference/aggregation-layers/contour-layer), which aggregates data into Isolines or Isobands for a given threshold and cell size. Expanding the range of available [Deck.gl](https://deck.gl/) visualization layers gives users more options to choose from when creating visualizations, allowing them to tailor charts to their specific needs and explore their data in different ways. + +![Contour](media/contour.png) + +### New Databases + +Superset has added support for two new databases: + +- [Databend](https://databend.rs/), an open-source, elastic, and workload-aware cloud data warehouse built in Rust. You can see the PR [here](https://github.com/apache/superset/pull/23308), and the updated documentation [here](https://superset.apache.org/docs/databases/databend).
+- [Apache Doris](https://doris.apache.org/), which is based on the MySQL protocol and introduces the concept of Multi Catalog. You can see the PR [here](https://github.com/apache/superset/pull/24714/) and the updated documentation [here](https://superset.apache.org/docs/databases/doris). + +<table> + <tr> + <td width="50%"><img src="media/databend.png" width="100%"/></td> + <td width="50%"><img src="media/doris.png" width="100%"></td> + </tr> +</table> + +### CLI command to execute viz migrations + +A new [CLI command](https://github.com/apache/superset/pull/25304) called `viz-migrations` was added to allow users to migrate charts of a specific type. This command is particularly helpful for migrating visualizations to their latest version while disabling their legacy versions with the `VIZ_TYPE_DENYLIST` configuration. The main advantage of this command is that you can migrate your visualizations without needing to wait for a major release, where we generally remove the legacy plugins. + +Currently, you can use the command to migrate Area, Bubble, Line, and Sunburst chart types, but we'll add more as the ECharts migrations continue. Note that migrations for deprecated charts may be forced in upcoming major versions when the code is removed. Running migrations earlier will allow you to de-risk future upgrades while improving user experience. + +```bash +Usage: superset viz-migrations [OPTIONS] COMMAND [ARGS]... + + Migrates a viz from one type to another. + +Commands: + downgrade Downgrades a viz to the previous version. + upgrade Upgrade a viz to the latest version. +``` + +Note: When migrating dashboards from one Superset instance to another (using import/export features or the Superset CLI), or restoring a backup of prior charts and dashboards, Superset will apply the existing migrations that are used during version upgrades.
This will ensure that your charts and dashboards use the latest chart versions that Superset officially supports. + +### Database engine spec improvements + +Many database engine improvements were added in this release. Some highlights: + +- [feat: improve SQLite DB engine spec](https://github.com/apache/superset/pull/24909) +- [feat: add MotherDuck DB engine spec](https://github.com/apache/superset/pull/24934) +- [feat: Add week time grain for Elasticsearch datasets](https://github.com/apache/superset/pull/25683) +- [feat: method for dynamic allows_alias_in_select](https://github.com/apache/superset/pull/25882) + +We even added a new [CLI command](https://github.com/apache/superset/pull/24918) to test DB engine specs, SQLAlchemy dialects, and database connections. + +```bash +Usage: superset test-db [OPTIONS] SQLALCHEMY_URI + + Run a series of tests against an analytical database. + + This command tests: + 1. The Superset DB engine spec. + 2. The SQLAlchemy dialect. + 3. The database connectivity and performance. + + It's useful for people developing DB engine specs and/or SQLAlchemy + dialects, and also to test new versions of DB API 2.0 drivers. + +Options: + -c, --connect-args TEXT Connect args as JSON or YAML + --help Show this message and exit. +``` + +### Playwright as an alternative to Selenium + +Per [SIP-98](https://github.com/apache/superset/issues/24948), we [introduced Playwright](https://github.com/apache/superset/pull/25247) for rendering charts in Superset reports. [Playwright](https://playwright.dev/) is an open-source library for automating web browsers, similar to Selenium but with better support for modern browser features and improved performance. By using Playwright, we aim to provide a more stable and accurate chart rendering experience in Superset reports, especially for [Deck.gl](https://deck.gl/) charts.
+ +Since configuring Playwright requires installing additional dependencies, we put the new flow behind a feature flag called `PLAYWRIGHT_REPORTS_AND_THUMBNAILS` to prevent breaking changes in existing deployments. Users who don't enable the feature flag will be unaffected by the changes. + +### Pandas upgraded to v2 + +We [upgraded Pandas to v2](https://github.com/apache/superset/pull/24705) and [added performance dependencies](https://github.com/apache/superset/pull/24768) to provide speed improvements, especially when working with large data sets. For the full list of changes, check the [Pandas 2.0.0 Release Notes](https://pandas.pydata.org/docs/dev/whatsnew/v2.0.0.html). + +### Tags + +Tags have evolved a lot since 3.0, with many PRs that further improved the feature. During this phase, the community also made [great suggestions](https://github.com/apache/superset/discussions/25918) to make sure the feature is scalable, adheres to our security model, and offers a consistent design. We're still working on this feedback and new improvements will follow. For that reason, we're keeping the feature in beta behind the `TAGGING_SYSTEM` feature flag.
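Both beta flows above are gated by feature flags. As a rough sketch, enabling them in `superset_config.py` might look like the following; the flag names come from the notes above, but treat the snippet as illustrative rather than a definitive configuration:

```python
# superset_config.py -- illustrative sketch only.
# Both features discussed above are off by default and opt-in via flags.
FEATURE_FLAGS = {
    "PLAYWRIGHT_REPORTS_AND_THUMBNAILS": True,  # Playwright-based report rendering
    "TAGGING_SYSTEM": True,                     # beta tagging feature
}
```

Deployments that leave either key out of `FEATURE_FLAGS` keep the previous behavior.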
diff --git a/RELEASING/release-notes-3-1/media/bubble_chart.png b/RELEASING/release-notes-3-1/media/bubble_chart.png new file mode 100644 index 0000000000000..505913ed2cf6a Binary files /dev/null and b/RELEASING/release-notes-3-1/media/bubble_chart.png differ diff --git a/RELEASING/release-notes-3-1/media/contour.png b/RELEASING/release-notes-3-1/media/contour.png new file mode 100644 index 0000000000000..16a16d7b24c15 Binary files /dev/null and b/RELEASING/release-notes-3-1/media/contour.png differ diff --git a/RELEASING/release-notes-3-1/media/databend.png b/RELEASING/release-notes-3-1/media/databend.png new file mode 100644 index 0000000000000..60ae9ea8e2876 Binary files /dev/null and b/RELEASING/release-notes-3-1/media/databend.png differ diff --git a/RELEASING/release-notes-3-1/media/dataset_selector.png b/RELEASING/release-notes-3-1/media/dataset_selector.png new file mode 100644 index 0000000000000..d18c3315be7ce Binary files /dev/null and b/RELEASING/release-notes-3-1/media/dataset_selector.png differ diff --git a/RELEASING/release-notes-3-1/media/doris.png b/RELEASING/release-notes-3-1/media/doris.png new file mode 100644 index 0000000000000..f3d2fc40dcc2f Binary files /dev/null and b/RELEASING/release-notes-3-1/media/doris.png differ diff --git a/RELEASING/release-notes-3-1/media/france.png b/RELEASING/release-notes-3-1/media/france.png new file mode 100644 index 0000000000000..8deed333a3561 Binary files /dev/null and b/RELEASING/release-notes-3-1/media/france.png differ diff --git a/RELEASING/release-notes-3-1/media/kazakhstan.png b/RELEASING/release-notes-3-1/media/kazakhstan.png new file mode 100644 index 0000000000000..a73c3efa8881e Binary files /dev/null and b/RELEASING/release-notes-3-1/media/kazakhstan.png differ diff --git a/RELEASING/release-notes-3-1/media/keyboard_shortcuts.png b/RELEASING/release-notes-3-1/media/keyboard_shortcuts.png new file mode 100644 index 0000000000000..60f147d11ebae Binary files /dev/null and 
b/RELEASING/release-notes-3-1/media/keyboard_shortcuts.png differ diff --git a/RELEASING/release-notes-3-1/media/kyrgyzstan.png b/RELEASING/release-notes-3-1/media/kyrgyzstan.png new file mode 100644 index 0000000000000..13a791c3ef30f Binary files /dev/null and b/RELEASING/release-notes-3-1/media/kyrgyzstan.png differ diff --git a/RELEASING/release-notes-3-1/media/sql_formatting.png b/RELEASING/release-notes-3-1/media/sql_formatting.png new file mode 100644 index 0000000000000..a4a4e57fca017 Binary files /dev/null and b/RELEASING/release-notes-3-1/media/sql_formatting.png differ diff --git a/RELEASING/release-notes-3-1/media/tajikistan.png b/RELEASING/release-notes-3-1/media/tajikistan.png new file mode 100644 index 0000000000000..0114ef9068e24 Binary files /dev/null and b/RELEASING/release-notes-3-1/media/tajikistan.png differ diff --git a/RELEASING/release-notes-3-1/media/turkmenistan.png b/RELEASING/release-notes-3-1/media/turkmenistan.png new file mode 100644 index 0000000000000..b4999d880fb19 Binary files /dev/null and b/RELEASING/release-notes-3-1/media/turkmenistan.png differ diff --git a/RELEASING/release-notes-3-1/media/uzbekistan.png b/RELEASING/release-notes-3-1/media/uzbekistan.png new file mode 100644 index 0000000000000..d1c1230eeb008 Binary files /dev/null and b/RELEASING/release-notes-3-1/media/uzbekistan.png differ diff --git a/RELEASING/release-notes-3-1/media/waterfall_chart.png b/RELEASING/release-notes-3-1/media/waterfall_chart.png new file mode 100644 index 0000000000000..0fd61e5175ba2 Binary files /dev/null and b/RELEASING/release-notes-3-1/media/waterfall_chart.png differ diff --git a/RESOURCES/INTHEWILD.md b/RESOURCES/INTHEWILD.md index 155cbe83b4c7e..a049f2d7f47c6 100644 --- a/RESOURCES/INTHEWILD.md +++ b/RESOURCES/INTHEWILD.md @@ -111,6 +111,7 @@ Join our growing community! 
- [Steamroot](https://streamroot.io/) - [TechAudit](https://www.techaudit.info) [@ETselikov] - [Tenable](https://www.tenable.com) [@dflionis] +- [Tentacle](https://public.tentaclecmi.com) [@jdclarke5] - [timbr.ai](https://timbr.ai/) [@semantiDan] - [Tobii](http://www.tobii.com/) [@dwa] - [Tooploox](https://www.tooploox.com/) [@jakubczaplicki] diff --git a/UPDATING.md b/UPDATING.md index 542938c35b7a5..96a15349a9091 100644 --- a/UPDATING.md +++ b/UPDATING.md @@ -22,18 +22,18 @@ under the License. This file documents any backwards-incompatible changes in Superset and assists people when migrating to a new version. -## Next +## 3.1.0 - [24657](https://github.com/apache/superset/pull/24657): Bumps the cryptography package to augment the OpenSSL security vulnerability. -### Breaking Changes - -### Potential Downtime - ### Other - [24982](https://github.com/apache/superset/pull/24982): By default, physical datasets on Oracle-like dialects like Snowflake will now use denormalized column names. However, existing datasets won't be affected. To change this behavior, the "Advanced" section on the dataset modal has a "Normalize column names" flag which can be changed to change this behavior. +## 3.0.3 + +- [26034](https://github.com/apache/superset/issues/26034): Fixes a problem where numeric x-axes were being treated as categorical values. As a consequence of that, the way labels are displayed might change given that ECharts has a different treatment for numerical and categorical values. To revert to the old behavior, users need to manually convert numerical columns to text so that they are treated as categories. Check https://github.com/apache/superset/issues/26159 for more details. + ## 3.0.0 - [25053](https://github.com/apache/superset/pull/25053): Extends the `ab_user.email` column from 64 to 320 characters which has an associated unique key constraint. 
This will be problematic for MySQL metadata databases which use the InnoDB storage engine with the `innodb_large_prefix` parameter disabled as the key prefix limit is 767 bytes. Enabling said parameter and ensuring that the table uses either the `DYNAMIC` or `COMPRESSED` row format should remedy the problem. See [here](https://dev.mysql.com/doc/refman/5.7/en/innodb-limits.html) for more details. diff --git a/docs/docs/databases/doris.mdx b/docs/docs/databases/doris.mdx new file mode 100644 index 0000000000000..62c16afeb3e1a --- /dev/null +++ b/docs/docs/databases/doris.mdx @@ -0,0 +1,26 @@ +--- +title: Apache Doris +hide_title: true +sidebar_position: 5 +version: 1 +--- + +## Doris + +The [sqlalchemy-doris](https://pypi.org/project/pydoris/) library is the recommended way to connect to Apache Doris through SQLAlchemy. + +You'll need the following setting values to form the connection string: + +- **User**: User Name +- **Password**: Password +- **Host**: Doris FE Host +- **Port**: Doris FE port +- **Catalog**: Catalog Name +- **Database**: Database Name + + +Here's what the connection string looks like: + +``` +doris://<User>:<Password>@<Host>:<Port>/<Catalog>.<Database> +``` diff --git a/docs/docs/databases/installing-database-drivers.mdx b/docs/docs/databases/installing-database-drivers.mdx index b4be939c3b5a8..f11b4ec5eb722 100644 --- a/docs/docs/databases/installing-database-drivers.mdx +++ b/docs/docs/databases/installing-database-drivers.mdx @@ -22,47 +22,48 @@ as well as the packages needed to connect to the databases you want to access th Some of the recommended packages are shown below. Please refer to [setup.py](https://github.com/apache/superset/blob/master/setup.py) for the versions that are compatible with Superset. 
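The connection strings shown here are SQLAlchemy URIs, so username and password components that contain special characters (`@`, `/`, `:`, etc.) must be percent-encoded before being embedded in the URI. A small standard-library sketch, using entirely hypothetical credentials and host names for illustration:

```python
from urllib.parse import quote_plus

# Hypothetical credentials; the password contains characters that would
# otherwise break the user:password@host parsing of a raw URI.
user = "analyst"
password = "p@ss/word"
host, port = "doris-fe.internal", 9030
catalog, database = "internal", "sales"

# Percent-encode only the credential components, not the structural separators.
uri = f"doris://{quote_plus(user)}:{quote_plus(password)}@{host}:{port}/{catalog}.{database}"
print(uri)  # doris://analyst:p%40ss%2Fword@doris-fe.internal:9030/internal.sales
```

The same encoding applies to any of the URIs in the table below, regardless of dialect.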
-| Database | PyPI package | Connection String | -| --------------------------------------------------------- | ---------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------- | -| [Amazon Athena](/docs/databases/athena) | `pip install pyathena[pandas]` , `pip install PyAthenaJDBC` | `awsathena+rest://{aws_access_key_id}:{aws_secret_access_key}@athena.{region_name}.amazonaws.com/{ ` | -| [Amazon DynamoDB](/docs/databases/dynamodb) | `pip install pydynamodb` | `dynamodb://{access_key_id}:{secret_access_key}@dynamodb.{region_name}.amazonaws.com?connector=superset` | -| [Amazon Redshift](/docs/databases/redshift) | `pip install sqlalchemy-redshift` | ` redshift+psycopg2://<userName>:<DBPassword>@<AWS End Point>:5439/<Database Name>` | -| [Apache Drill](/docs/databases/drill) | `pip install sqlalchemy-drill` | `drill+sadrill:// For JDBC drill+jdbc://` | -| [Apache Druid](/docs/databases/druid) | `pip install pydruid` | `druid://<User>:<password>@<Host>:<Port-default-9088>/druid/v2/sql` | -| [Apache Hive](/docs/databases/hive) | `pip install pyhive` | `hive://hive@{hostname}:{port}/{database}` | -| [Apache Impala](/docs/databases/impala) | `pip install impyla` | `impala://{hostname}:{port}/{database}` | -| [Apache Kylin](/docs/databases/kylin) | `pip install kylinpy` | `kylin://<username>:<password>@<hostname>:<port>/<project>?<param1>=<value1>&<param2>=<value2>` | -| [Apache Pinot](/docs/databases/pinot) | `pip install pinotdb` | `pinot://BROKER:5436/query?server=http://CONTROLLER:5983/` | -| [Apache Solr](/docs/databases/solr) | `pip install sqlalchemy-solr` | `solr://{username}:{password}@{hostname}:{port}/{server_path}/{collection}` | -| [Apache Spark SQL](/docs/databases/spark-sql) | `pip install pyhive` | `hive://hive@{hostname}:{port}/{database}` | -| [Ascend.io](/docs/databases/ascend) | `pip install impyla` | 
`ascend://{username}:{password}@{hostname}:{port}/{database}?auth_mechanism=PLAIN;use_ssl=true` | -| [Azure MS SQL](/docs/databases/sql-server) | `pip install pymssql` | `mssql+pymssql://UserName@presetSQL:TestPassword@presetSQL.database.windows.net:1433/TestSchema` | -| [Big Query](/docs/databases/bigquery) | `pip install sqlalchemy-bigquery` | `bigquery://{project_id}` | -| [ClickHouse](/docs/databases/clickhouse) | `pip install clickhouse-connect` | `clickhousedb://{username}:{password}@{hostname}:{port}/{database}` | -| [CockroachDB](/docs/databases/cockroachdb) | `pip install cockroachdb` | `cockroachdb://root@{hostname}:{port}/{database}?sslmode=disable` | -| [Dremio](/docs/databases/dremio) | `pip install sqlalchemy_dremio` | `dremio://user:pwd@host:31010/` | -| [Elasticsearch](/docs/databases/elasticsearch) | `pip install elasticsearch-dbapi` | `elasticsearch+http://{user}:{password}@{host}:9200/` | -| [Exasol](/docs/databases/exasol) | `pip install sqlalchemy-exasol` | `exa+pyodbc://{username}:{password}@{hostname}:{port}/my_schema?CONNECTIONLCALL=en_US.UTF-8&driver=EXAODBC` | -| [Google Sheets](/docs/databases/google-sheets) | `pip install shillelagh[gsheetsapi]` | `gsheets://` | -| [Firebolt](/docs/databases/firebolt) | `pip install firebolt-sqlalchemy` | `firebolt://{username}:{password}@{database} or firebolt://{username}:{password}@{database}/{engine_name}` | -| [Hologres](/docs/databases/hologres) | `pip install psycopg2` | `postgresql+psycopg2://<UserName>:<DBPassword>@<Database Host>/<Database Name>` | -| [IBM Db2](/docs/databases/ibm-db2) | `pip install ibm_db_sa` | `db2+ibm_db://` | -| [IBM Netezza Performance Server](/docs/databases/netezza) | `pip install nzalchemy` | `netezza+nzpy://<UserName>:<DBPassword>@<Database Host>/<Database Name>` | -| [MySQL](/docs/databases/mysql) | `pip install mysqlclient` | `mysql://<UserName>:<DBPassword>@<Database Host>/<Database Name>` | -| [Oracle](/docs/databases/oracle) | `pip install cx_Oracle` | 
`oracle://` | -| [PostgreSQL](/docs/databases/postgres) | `pip install psycopg2` | `postgresql://<UserName>:<DBPassword>@<Database Host>/<Database Name>` | -| [Presto](/docs/databases/presto) | `pip install pyhive` | `presto://` | -| [Rockset](/docs/databases/rockset) | `pip install rockset-sqlalchemy` | `rockset://<api_key>:@<api_server>` | -| [SAP Hana](/docs/databases/hana) | `pip install hdbcli sqlalchemy-hana or pip install apache-superset[hana]` | `hana://{username}:{password}@{host}:{port}` | -| [StarRocks](/docs/databases/starrocks) | `pip install starrocks` | `starrocks://<User>:<Password>@<Host>:<Port>/<Catalog>.<Database>` | -| [Snowflake](/docs/databases/snowflake) | `pip install snowflake-sqlalchemy` | `snowflake://{user}:{password}@{account}.{region}/{database}?role={role}&warehouse={warehouse}` | -| SQLite | No additional library needed | `sqlite://path/to/file.db?check_same_thread=false` | -| [SQL Server](/docs/databases/sql-server) | `pip install pymssql` | `mssql+pymssql://` | -| [Teradata](/docs/databases/teradata) | `pip install teradatasqlalchemy` | `teradatasql://{user}:{password}@{host}` | -| [TimescaleDB](/docs/databases/timescaledb) | `pip install psycopg2` | `postgresql://<UserName>:<DBPassword>@<Database Host>:<Port>/<Database Name>` | -| [Trino](/docs/databases/trino) | `pip install trino` | `trino://{username}:{password}@{hostname}:{port}/{catalog}` | -| [Vertica](/docs/databases/vertica) | `pip install sqlalchemy-vertica-python` | `vertica+vertica_python://<UserName>:<DBPassword>@<Database Host>/<Database Name>` | -| [YugabyteDB](/docs/databases/yugabytedb) | `pip install psycopg2` | `postgresql://<UserName>:<DBPassword>@<Database Host>/<Database Name>` | +| Database | PyPI package | Connection String | +| --------------------------------------------------------- | ---------------------------------------------------------------------------------- | 
------------------------------------------------------------------------------------------------------------------------------------------------------ | +| [Amazon Athena](/docs/databases/athena) | `pip install pyathena[pandas]` , `pip install PyAthenaJDBC` | `awsathena+rest://{aws_access_key_id}:{aws_secret_access_key}@athena.{region_name}.amazonaws.com/{schema_name}?s3_staging_dir={s3_staging_dir}&... ` | +| [Apache Doris](/docs/databases/doris) | `pip install pydoris` | `doris://<User>:<Password>@<Host>:<Port>/<Catalog>.<Database>` | +| [Amazon DynamoDB](/docs/databases/dynamodb) | `pip install pydynamodb` | `dynamodb://{access_key_id}:{secret_access_key}@dynamodb.{region_name}.amazonaws.com?connector=superset` | +| [Amazon Redshift](/docs/databases/redshift) | `pip install sqlalchemy-redshift` | ` redshift+psycopg2://<userName>:<DBPassword>@<AWS End Point>:5439/<Database Name>` | +| [Apache Drill](/docs/databases/drill) | `pip install sqlalchemy-drill` | `drill+sadrill:// For JDBC drill+jdbc://` | +| [Apache Druid](/docs/databases/druid) | `pip install pydruid` | `druid://<User>:<password>@<Host>:<Port-default-9088>/druid/v2/sql` | +| [Apache Hive](/docs/databases/hive) | `pip install pyhive` | `hive://hive@{hostname}:{port}/{database}` | +| [Apache Impala](/docs/databases/impala) | `pip install impyla` | `impala://{hostname}:{port}/{database}` | +| [Apache Kylin](/docs/databases/kylin) | `pip install kylinpy` | `kylin://<username>:<password>@<hostname>:<port>/<project>?<param1>=<value1>&<param2>=<value2>` | +| [Apache Pinot](/docs/databases/pinot) | `pip install pinotdb` | `pinot://BROKER:5436/query?server=http://CONTROLLER:5983/` | +| [Apache Solr](/docs/databases/solr) | `pip install sqlalchemy-solr` | `solr://{username}:{password}@{hostname}:{port}/{server_path}/{collection}` | +| [Apache Spark SQL](/docs/databases/spark-sql) | `pip install pyhive` | `hive://hive@{hostname}:{port}/{database}` | +| [Ascend.io](/docs/databases/ascend) | `pip install impyla` | 
`ascend://{username}:{password}@{hostname}:{port}/{database}?auth_mechanism=PLAIN;use_ssl=true` | +| [Azure MS SQL](/docs/databases/sql-server) | `pip install pymssql` | `mssql+pymssql://UserName@presetSQL:TestPassword@presetSQL.database.windows.net:1433/TestSchema` | +| [Big Query](/docs/databases/bigquery) | `pip install sqlalchemy-bigquery` | `bigquery://{project_id}` | +| [ClickHouse](/docs/databases/clickhouse) | `pip install clickhouse-connect` | `clickhousedb://{username}:{password}@{hostname}:{port}/{database}` | +| [CockroachDB](/docs/databases/cockroachdb) | `pip install cockroachdb` | `cockroachdb://root@{hostname}:{port}/{database}?sslmode=disable` | +| [Dremio](/docs/databases/dremio) | `pip install sqlalchemy_dremio` | `dremio://user:pwd@host:31010/` | +| [Elasticsearch](/docs/databases/elasticsearch) | `pip install elasticsearch-dbapi` | `elasticsearch+http://{user}:{password}@{host}:9200/` | +| [Exasol](/docs/databases/exasol) | `pip install sqlalchemy-exasol` | `exa+pyodbc://{username}:{password}@{hostname}:{port}/my_schema?CONNECTIONLCALL=en_US.UTF-8&driver=EXAODBC` | +| [Google Sheets](/docs/databases/google-sheets) | `pip install shillelagh[gsheetsapi]` | `gsheets://` | +| [Firebolt](/docs/databases/firebolt) | `pip install firebolt-sqlalchemy` | `firebolt://{username}:{password}@{database} or firebolt://{username}:{password}@{database}/{engine_name}` | +| [Hologres](/docs/databases/hologres) | `pip install psycopg2` | `postgresql+psycopg2://<UserName>:<DBPassword>@<Database Host>/<Database Name>` | +| [IBM Db2](/docs/databases/ibm-db2) | `pip install ibm_db_sa` | `db2+ibm_db://` | +| [IBM Netezza Performance Server](/docs/databases/netezza) | `pip install nzalchemy` | `netezza+nzpy://<UserName>:<DBPassword>@<Database Host>/<Database Name>` | +| [MySQL](/docs/databases/mysql) | `pip install mysqlclient` | `mysql://<UserName>:<DBPassword>@<Database Host>/<Database Name>` | +| [Oracle](/docs/databases/oracle) | `pip install cx_Oracle` | 
`oracle://` | +| [PostgreSQL](/docs/databases/postgres) | `pip install psycopg2` | `postgresql://<UserName>:<DBPassword>@<Database Host>/<Database Name>` | +| [Presto](/docs/databases/presto) | `pip install pyhive` | `presto://` | +| [Rockset](/docs/databases/rockset) | `pip install rockset-sqlalchemy` | `rockset://<api_key>:@<api_server>` | +| [SAP Hana](/docs/databases/hana) | `pip install hdbcli sqlalchemy-hana or pip install apache-superset[hana]` | `hana://{username}:{password}@{host}:{port}` | +| [StarRocks](/docs/databases/starrocks) | `pip install starrocks` | `starrocks://<User>:<Password>@<Host>:<Port>/<Catalog>.<Database>` | +| [Snowflake](/docs/databases/snowflake) | `pip install snowflake-sqlalchemy` | `snowflake://{user}:{password}@{account}.{region}/{database}?role={role}&warehouse={warehouse}` | +| SQLite | No additional library needed | `sqlite://path/to/file.db?check_same_thread=false` | +| [SQL Server](/docs/databases/sql-server) | `pip install pymssql` | `mssql+pymssql://` | +| [Teradata](/docs/databases/teradata) | `pip install teradatasqlalchemy` | `teradatasql://{user}:{password}@{host}` | +| [TimescaleDB](/docs/databases/timescaledb) | `pip install psycopg2` | `postgresql://<UserName>:<DBPassword>@<Database Host>:<Port>/<Database Name>` | +| [Trino](/docs/databases/trino) | `pip install trino` | `trino://{username}:{password}@{hostname}:{port}/{catalog}` | +| [Vertica](/docs/databases/vertica) | `pip install sqlalchemy-vertica-python` | `vertica+vertica_python://<UserName>:<DBPassword>@<Database Host>/<Database Name>` | +| [YugabyteDB](/docs/databases/yugabytedb) | `pip install psycopg2` | `postgresql://<UserName>:<DBPassword>@<Database Host>/<Database Name>` | --- Note that many other databases are supported, the main criteria being the existence of a functional diff --git a/docs/docs/databases/pinot.mdx b/docs/docs/databases/pinot.mdx index 8d5b8c2062e9b..e6add897ba5de 100644 --- a/docs/docs/databases/pinot.mdx +++ 
b/docs/docs/databases/pinot.mdx @@ -14,3 +14,9 @@ The expected connection string is formatted as follows: ``` pinot+http://<pinot-broker-host>:<pinot-broker-port>/query?controller=http://<pinot-controller-host>:<pinot-controller-port>/`` ``` + +The expected connection string using username and password is formatted as follows: + +``` +pinot://<username>:<password>@<pinot-broker-host>:<pinot-broker-port>/query/sql?controller=http://<pinot-controller-host>:<pinot-controller-port>/&verify_ssl=true +``` diff --git a/docs/docs/intro.mdx b/docs/docs/intro.mdx index 0f0315fc0503a..cd93aa6be6177 100644 --- a/docs/docs/intro.mdx +++ b/docs/docs/intro.mdx @@ -15,7 +15,7 @@ Here are a **few different ways you can get started with Superset**: - Install Superset [from scratch](https://superset.apache.org/docs/installation/installing-superset-from-scratch/) - Deploy Superset locally with one command - [using Docker Compose](installation/installing-superset-using-docker-compose) + [using Docker Compose](https://superset.apache.org/docs/installation/installing-superset-using-docker-compose) - Deploy Superset [with Kubernetes](https://superset.apache.org/docs/installation/running-on-kubernetes) - Run a [Docker image](https://hub.docker.com/r/apache/superset) from Dockerhub - Download Superset [from Pypi here](https://pypi.org/project/apache-superset/) diff --git a/docs/src/resources/data.js b/docs/src/resources/data.js index a07be552673ef..42cf835a495b4 100644 --- a/docs/src/resources/data.js +++ b/docs/src/resources/data.js @@ -117,4 +117,9 @@ export const Databases = [ href: 'https://www.microsoft.com/en-us/sql-server', imgName: 'msql.png', }, + { + title: 'Apache Doris', + href: 'https://doris.apache.org/', + imgName: 'doris.png', + }, ]; diff --git a/docs/src/styles/main.less b/docs/src/styles/main.less index 80dee90ecabd5..d10047fdea72d 100644 --- a/docs/src/styles/main.less +++ b/docs/src/styles/main.less @@ -117,6 +117,7 @@ a > span > svg { font-size: 14px; font-weight:
400; background-color: #fff; + transition: all 0.5s; .get-started-button { border-radius: 10px; diff --git a/docs/static/img/databases/doris.png b/docs/static/img/databases/doris.png new file mode 100644 index 0000000000000..4d88f2a36cf72 Binary files /dev/null and b/docs/static/img/databases/doris.png differ diff --git a/helm/superset/Chart.yaml b/helm/superset/Chart.yaml index 60e2510eb9397..cbca942569349 100644 --- a/helm/superset/Chart.yaml +++ b/helm/superset/Chart.yaml @@ -29,7 +29,7 @@ maintainers: - name: craig-rueda email: craig@craigrueda.com url: https://github.com/craig-rueda -version: 0.10.14 +version: 0.11.2 dependencies: - name: postgresql version: 12.1.6 diff --git a/helm/superset/README.md b/helm/superset/README.md index d32ee985fe70f..1eaf4928c158a 100644 --- a/helm/superset/README.md +++ b/helm/superset/README.md @@ -23,7 +23,7 @@ NOTE: This file is generated by helm-docs: https://github.com/norwoodj/helm-docs # superset -![Version: 0.10.14](https://img.shields.io/badge/Version-0.10.14-informational?style=flat-square) +![Version: 0.11.2](https://img.shields.io/badge/Version-0.11.2-informational?style=flat-square) Apache Superset is a modern, enterprise-ready business intelligence web application @@ -40,6 +40,12 @@ helm repo add superset http://apache.github.io/superset/ helm install my-superset superset/superset ``` +Make sure you set your own `SECRET_KEY` to something unique and secret. This secret key is used by Flask for +securely signing the session cookie and will be used to encrypt sensitive data in Superset's metadata database. +It should be a long, random string of bytes.
+ +In Helm, this can be set via `extraSecretEnv.SUPERSET_SECRET_KEY` or `configOverrides.secrets`. + ## Requirements | Repository | Name | Version | @@ -124,6 +130,7 @@ helm install my-superset superset/superset | supersetCeleryBeat.containerSecurityContext | object | `{}` | | | supersetCeleryBeat.deploymentAnnotations | object | `{}` | Annotations to be added to supersetCeleryBeat deployment | | supersetCeleryBeat.enabled | bool | `false` | This is only required if you intend to use alerts and reports | +| supersetCeleryBeat.extraContainers | list | `[]` | Launch additional containers into supersetCeleryBeat pods | | supersetCeleryBeat.forceReload | bool | `false` | If true, forces deployment to reload on each upgrade | | supersetCeleryBeat.initContainers | list | a container waiting for postgres | List of init containers | | supersetCeleryBeat.podAnnotations | object | `{}` | Annotations to be added to supersetCeleryBeat pods | @@ -136,6 +143,7 @@ helm install my-superset superset/superset | supersetCeleryFlower.containerSecurityContext | object | `{}` | | | supersetCeleryFlower.deploymentAnnotations | object | `{}` | Annotations to be added to supersetCeleryFlower deployment | | supersetCeleryFlower.enabled | bool | `false` | Enables a Celery flower deployment (management UI to monitor celery jobs) WARNING: on superset 1.x, this requires a Superset image that has `flower<1.0.0` installed (which is NOT the case of the default images) flower>=1.0.0 requires Celery 5+ which Superset 1.5 does not support | +| supersetCeleryFlower.extraContainers | list | `[]` | Launch additional containers into supersetCeleryFlower pods | | supersetCeleryFlower.initContainers | list | a container waiting for postgres and redis | List of init containers | | supersetCeleryFlower.livenessProbe.failureThreshold | int | `3` | | | supersetCeleryFlower.livenessProbe.httpGet.path | string | `"/api/workers"` | | @@ -223,6 +231,7 @@ helm install my-superset superset/superset |
supersetWebsockets.containerSecurityContext | object | `{}` | | | supersetWebsockets.deploymentAnnotations | object | `{}` | | | supersetWebsockets.enabled | bool | `false` | This is only required if you intend to use `GLOBAL_ASYNC_QUERIES` in `ws` mode see https://github.com/apache/superset/blob/master/CONTRIBUTING.md#async-chart-queries | +| supersetWebsockets.extraContainers | list | `[]` | Launch additional containers into supersetWebsockets pods | | supersetWebsockets.image.pullPolicy | string | `"IfNotPresent"` | | | supersetWebsockets.image.repository | string | `"oneacrefund/superset-websocket"` | There is no official image (yet), this one is community-supported | | supersetWebsockets.image.tag | string | `"latest"` | | diff --git a/helm/superset/README.md.gotmpl b/helm/superset/README.md.gotmpl index c17a7e31a7372..facb955e31d27 100644 --- a/helm/superset/README.md.gotmpl +++ b/helm/superset/README.md.gotmpl @@ -39,6 +39,12 @@ helm repo add superset http://apache.github.io/superset/ helm install my-superset superset/superset ``` +Make sure you set your own `SECRET_KEY` to something unique and secret. This secret key is used by Flask for +securely signing the session cookie and will be used to encrypt sensitive data in Superset's metadata database. +It should be a long, random string of bytes. + +In Helm, this can be set via `extraSecretEnv.SUPERSET_SECRET_KEY` or `configOverrides.secrets`. + {{ template "chart.requirementsSection" . }} {{ template "chart.valuesSection" . 
}} diff --git a/helm/superset/templates/_helpers.tpl b/helm/superset/templates/_helpers.tpl index 40b769054e66e..26d68ce6038e6 100644 --- a/helm/superset/templates/_helpers.tpl +++ b/helm/superset/templates/_helpers.tpl @@ -82,7 +82,6 @@ DATA_CACHE_CONFIG = CACHE_CONFIG SQLALCHEMY_DATABASE_URI = f"postgresql+psycopg2://{env('DB_USER')}:{env('DB_PASS')}@{env('DB_HOST')}:{env('DB_PORT')}/{env('DB_NAME')}" SQLALCHEMY_TRACK_MODIFICATIONS = True -SECRET_KEY = env('SECRET_KEY', 'thisISaSECRET_1234') class CeleryConfig: imports = ("superset.sql_lab", ) diff --git a/helm/superset/templates/deployment-beat.yaml b/helm/superset/templates/deployment-beat.yaml index 43754efb06147..30d1eff61ac71 100644 --- a/helm/superset/templates/deployment-beat.yaml +++ b/helm/superset/templates/deployment-beat.yaml @@ -42,6 +42,7 @@ spec: metadata: annotations: checksum/superset_config.py: {{ include "superset-config" . | sha256sum }} + checksum/superset_bootstrap.sh: {{ tpl .Values.bootstrapScript . | sha256sum }} checksum/connections: {{ .Values.supersetNode.connections | toYaml | sha256sum }} checksum/extraConfigs: {{ .Values.extraConfigs | toYaml | sha256sum }} checksum/extraSecrets: {{ .Values.extraSecrets | toYaml | sha256sum }} @@ -119,6 +120,9 @@ spec: {{- else }} {{- toYaml .Values.resources | nindent 12 }} {{- end }} + {{- if .Values.supersetCeleryBeat.extraContainers }} + {{- toYaml .Values.supersetCeleryBeat.extraContainers | nindent 8 }} + {{- end }} {{- with .Values.nodeSelector }} nodeSelector: {{- toYaml . 
| nindent 8 }} {{- end }} diff --git a/helm/superset/templates/deployment-flower.yaml b/helm/superset/templates/deployment-flower.yaml index 2213ffa353fd7..e4b05a17e9a1f 100644 --- a/helm/superset/templates/deployment-flower.yaml +++ b/helm/superset/templates/deployment-flower.yaml @@ -115,6 +115,9 @@ spec: {{- else }} {{- toYaml .Values.resources | nindent 12 }} {{- end }} + {{- if .Values.supersetCeleryFlower.extraContainers }} + {{- toYaml .Values.supersetCeleryFlower.extraContainers | nindent 8 }} + {{- end }} {{- with .Values.nodeSelector }} nodeSelector: {{- toYaml . | nindent 8 }} {{- end }} diff --git a/helm/superset/templates/deployment-worker.yaml b/helm/superset/templates/deployment-worker.yaml index d84e7e9561103..2710ff40fe444 100644 --- a/helm/superset/templates/deployment-worker.yaml +++ b/helm/superset/templates/deployment-worker.yaml @@ -48,6 +48,7 @@ spec: metadata: annotations: checksum/superset_config.py: {{ include "superset-config" . | sha256sum }} + checksum/superset_bootstrap.sh: {{ tpl .Values.bootstrapScript . | sha256sum }} checksum/connections: {{ .Values.supersetNode.connections | toYaml | sha256sum }} checksum/extraConfigs: {{ .Values.extraConfigs | toYaml | sha256sum }} checksum/extraSecrets: {{ .Values.extraSecrets | toYaml | sha256sum }} diff --git a/helm/superset/templates/deployment-ws.yaml b/helm/superset/templates/deployment-ws.yaml index 6bc9faac672a2..7612900b07d34 100644 --- a/helm/superset/templates/deployment-ws.yaml +++ b/helm/superset/templates/deployment-ws.yaml @@ -114,6 +114,9 @@ spec: {{- if .Values.supersetWebsockets.livenessProbe }} livenessProbe: {{- .Values.supersetWebsockets.livenessProbe | toYaml | nindent 12 }} {{- end }} + {{- if .Values.supersetWebsockets.extraContainers }} + {{- toYaml .Values.supersetWebsockets.extraContainers | nindent 8 }} + {{- end }} {{- with .Values.nodeSelector }} nodeSelector: {{- toYaml . 
| nindent 8 }} {{- end }} diff --git a/helm/superset/templates/init-job.yaml b/helm/superset/templates/init-job.yaml index 5b39d20e10c46..43839c0d9538c 100644 --- a/helm/superset/templates/init-job.yaml +++ b/helm/superset/templates/init-job.yaml @@ -63,7 +63,7 @@ spec: name: {{ tpl .Values.envFromSecret . }} {{- range .Values.envFromSecrets }} - secretRef: - name: {{ tpl . $ }} + name: {{ tpl . $ | quote }} {{- end }} imagePullPolicy: {{ .Values.image.pullPolicy }} {{- if .Values.init.containerSecurityContext }} diff --git a/helm/superset/values.yaml b/helm/superset/values.yaml index 67f685bf18fd5..26d454742055c 100644 --- a/helm/superset/values.yaml +++ b/helm/superset/values.yaml @@ -93,6 +93,8 @@ extraSecretEnv: {} # # Google API Keys: https://console.cloud.google.com/apis/credentials # GOOGLE_KEY: ... # GOOGLE_SECRET: ... + # # Generate your own secret key for encryption. Use openssl rand -base64 42 to generate a good key + # SUPERSET_SECRET_KEY: 'CHANGE_ME_TO_A_COMPLEX_RANDOM_SECRET' # -- Extra files to mount on `/app/pythonpath` extraConfigs: {} @@ -441,6 +443,8 @@ supersetCeleryBeat: - /bin/sh - -c - dockerize -wait "tcp://$DB_HOST:$DB_PORT" -wait "tcp://$REDIS_HOST:$REDIS_PORT" -timeout 120s + # -- Launch additional containers into supersetCeleryBeat pods + extraContainers: [] # -- Annotations to be added to supersetCeleryBeat deployment deploymentAnnotations: {} # -- Affinity to be added to supersetCeleryBeat deployment @@ -522,6 +526,8 @@ supersetCeleryFlower: - /bin/sh - -c - dockerize -wait "tcp://$DB_HOST:$DB_PORT" -wait "tcp://$REDIS_HOST:$REDIS_PORT" -timeout 120s + # -- Launch additional containers into supersetCeleryFlower pods + extraContainers: [] # -- Annotations to be added to supersetCeleryFlower deployment deploymentAnnotations: {} # -- Affinity to be added to supersetCeleryFlower deployment @@ -588,6 +594,8 @@ supersetWebsockets: http: nil command: [] resources: {} + # -- Launch additional containers into supersetWebsockets pods + 
extraContainers: [] deploymentAnnotations: {} # -- Affinity to be added to supersetWebsockets deployment affinity: {} diff --git a/pytest.ini b/pytest.ini index fdb50114d8d18..3fec965e72ae6 100644 --- a/pytest.ini +++ b/pytest.ini @@ -17,4 +17,4 @@ [pytest] testpaths = tests -python_files = *_test.py test_*.py *_tests.py +python_files = *_test.py test_*.py *_tests.py *viz/utils.py diff --git a/requirements/base.txt b/requirements/base.txt index 2bc8e38132310..cebf810b61294 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -18,7 +18,10 @@ apsw==3.42.0.1 async-timeout==4.0.2 # via redis attrs==23.1.0 - # via jsonschema + # via + # cattrs + # jsonschema + # requests-cache babel==2.9.1 # via flask-babel backoff==1.11.1 @@ -31,10 +34,12 @@ bottleneck==1.3.7 # via pandas brotli==1.0.9 # via flask-compress -cachelib==0.6.0 +cachelib==0.9.0 # via # flask-caching # flask-session +cattrs==23.2.1 + # via requests-cache celery==5.2.2 # via apache-superset certifi==2023.7.22 @@ -85,6 +90,8 @@ dnspython==2.1.0 # via email-validator email-validator==1.1.3 # via flask-appbuilder +exceptiongroup==1.1.1 + # via cattrs flask==2.2.5 # via # apache-superset @@ -99,11 +106,11 @@ flask==2.2.5 # flask-session # flask-sqlalchemy # flask-wtf -flask-appbuilder==4.3.9 +flask-appbuilder==4.3.10 # via apache-superset flask-babel==1.0.0 # via flask-appbuilder -flask-caching==1.11.1 +flask-caching==2.1.0 # via apache-superset flask-compress==1.13 # via apache-superset @@ -136,7 +143,9 @@ geographiclib==1.52 geopy==2.2.0 # via apache-superset greenlet==2.0.2 - # via shillelagh + # via + # shillelagh + # sqlalchemy gunicorn==21.2.0 # via apache-superset hashids==1.3.1 @@ -152,7 +161,10 @@ idna==3.2 # email-validator # requests importlib-metadata==6.6.0 - # via apache-superset + # via + # apache-superset + # flask + # shillelagh importlib-resources==5.12.0 # via limits isodate==0.6.0 @@ -233,13 +245,15 @@ parsedatetime==2.6 # via apache-superset pgsanity==0.2.9 # via apache-superset 
+platformdirs==3.8.1 + # via requests-cache polyline==2.0.0 # via apache-superset prison==0.2.1 # via flask-appbuilder prompt-toolkit==3.0.38 # via click-repl -pyarrow==12.0.0 +pyarrow==14.0.1 # via apache-superset pycparser==2.20 # via cffi @@ -286,12 +300,16 @@ pyyaml==6.0.1 redis==4.5.4 # via apache-superset requests==2.31.0 + # via + # requests-cache + # shillelagh +requests-cache==1.1.1 # via shillelagh rich==13.3.4 # via flask-limiter selenium==3.141.0 # via apache-superset -shillelagh==1.2.6 +shillelagh==1.2.10 # via apache-superset shortid==0.1.2 # via apache-superset @@ -304,6 +322,7 @@ six==1.16.0 # paramiko # prison # python-dateutil + # url-normalize # wtforms-json slack-sdk==3.21.3 # via apache-superset @@ -329,14 +348,18 @@ tabulate==0.8.9 typing-extensions==4.4.0 # via # apache-superset + # cattrs # flask-limiter # limits # shillelagh tzdata==2023.3 # via pandas +url-normalize==1.4.3 + # via requests-cache urllib3==1.26.6 # via # requests + # requests-cache # selenium vine==5.0.0 # via @@ -349,6 +372,7 @@ werkzeug==2.3.3 # via # apache-superset # flask + # flask-appbuilder # flask-jwt-extended # flask-login wrapt==1.15.0 @@ -364,7 +388,9 @@ wtforms-json==0.3.5 xlsxwriter==3.0.7 # via apache-superset zipp==3.15.0 - # via importlib-metadata + # via + # importlib-metadata + # importlib-resources # The following packages are considered to be unsafe in a requirements file: # setuptools diff --git a/requirements/development.txt b/requirements/development.txt index 04962ae5378eb..a73e3a70c5c27 100644 --- a/requirements/development.txt +++ b/requirements/development.txt @@ -74,8 +74,6 @@ pickleshare==0.7.5 # via ipython pillow==9.5.0 # via apache-superset -platformdirs==3.8.1 - # via pylint progress==1.6 # via -r requirements/development.in psycopg2-binary==2.9.6 diff --git a/requirements/testing.txt b/requirements/testing.txt index 00fe7345404cf..c1f6e55d1293a 100644 --- a/requirements/testing.txt +++ b/requirements/testing.txt @@ -26,8 +26,6 @@ 
docker==6.1.1 # via -r requirements/testing.in ephem==4.1.4 # via lunarcalendar -exceptiongroup==1.1.1 - # via pytest flask-testing==0.8.1 # via -r requirements/testing.in fonttools==4.39.4 @@ -123,8 +121,6 @@ pyee==9.0.4 # via playwright pyfakefs==5.2.2 # via -r requirements/testing.in -pyhive[presto]==0.7.0 - # via apache-superset pytest==7.3.1 # via # -r requirements/testing.in @@ -144,8 +140,6 @@ rsa==4.9 # via google-auth setuptools-git==1.2 # via prophet -shillelagh[gsheetsapi]==1.2.6 - # via apache-superset sqlalchemy-bigquery==1.6.1 # via apache-superset statsd==4.0.1 diff --git a/setup.py b/setup.py index 7200c0c5ca8de..eafd66cf1a5c6 100644 --- a/setup.py +++ b/setup.py @@ -83,8 +83,8 @@ def get_git_sha() -> str: "cryptography>=41.0.2, <41.1.0", "deprecation>=2.1.0, <2.2.0", "flask>=2.2.5, <3.0.0", - "flask-appbuilder>=4.3.9, <5.0.0", - "flask-caching>=1.11.1, <2.0", + "flask-appbuilder>=4.3.10, <5.0.0", + "flask-caching>=2.1.0, <3", "flask-compress>=1.13, <2.0", "flask-talisman>=1.0.0, <2.0", "flask-login>=0.6.0, < 1.0", @@ -114,12 +114,12 @@ def get_git_sha() -> str: "python-dateutil", "python-dotenv", "python-geohash", - "pyarrow>=12.0.0, <13", + "pyarrow>=14.0.1, <15", "pyyaml>=6.0.0, <7.0.0", "PyJWT>=2.4.0, <3.0", "redis>=4.5.4, <5.0", "selenium>=3.141.0, <4.10.0", - "shillelagh>=1.2.6,<2.0", + "shillelagh>=1.2.10, <2.0", "shortid", "sshtunnel>=0.4.0, <0.5", "simplejson>=3.15.0", @@ -147,6 +147,7 @@ def get_git_sha() -> str: "cockroachdb": ["cockroachdb>=0.3.5, <0.4"], "cors": ["flask-cors>=2.0.0"], "crate": ["crate[sqlalchemy]>=0.26.0, <0.27"], + "databend": ["databend-sqlalchemy>=0.3.2, <1.0"], "databricks": [ "databricks-sql-connector>=2.0.2, <3", "sqlalchemy-databricks>=0.2.0", @@ -155,7 +156,7 @@ def get_git_sha() -> str: "dremio": ["sqlalchemy-dremio>=1.1.5, <1.3"], "drill": ["sqlalchemy-drill==0.1.dev"], "druid": ["pydruid>=0.6.5,<0.7"], - "duckdb": ["duckdb-engine==0.9.2"], + "duckdb": ["duckdb-engine>=0.9.5, <0.10"], "dynamodb": 
["pydynamodb>=0.4.2"], "solr": ["sqlalchemy-solr >= 0.2.0"], "elasticsearch": ["elasticsearch-dbapi>=0.2.9, <0.3.0"], @@ -163,7 +164,7 @@ def get_git_sha() -> str: "excel": ["xlrd>=1.2.0, <1.3"], "firebird": ["sqlalchemy-firebird>=0.7.0, <0.8"], "firebolt": ["firebolt-sqlalchemy>=0.0.1"], - "gsheets": ["shillelagh[gsheetsapi]>=1.2.6, <2"], + "gsheets": ["shillelagh[gsheetsapi]>=1.2.10, <2"], "hana": ["hdbcli==2.4.162", "sqlalchemy_hana==0.4.0"], "hive": [ "pyhive[hive]>=0.6.5;python_version<'3.11'", @@ -192,7 +193,7 @@ def get_git_sha() -> str: "redshift": ["sqlalchemy-redshift>=0.8.1, < 0.9"], "rockset": ["rockset-sqlalchemy>=0.0.1, <1.0.0"], "shillelagh": [ - "shillelagh[datasetteapi,gsheetsapi,socrata,weatherapi]>=1.2.6,<2" + "shillelagh[datasetteapi,gsheetsapi,socrata,weatherapi]>=1.2.10, <2" ], "snowflake": ["snowflake-sqlalchemy>=1.2.4, <2"], "spark": [ @@ -202,10 +203,11 @@ def get_git_sha() -> str: "thrift>=0.14.1, <1.0.0", ], "teradata": ["teradatasql>=16.20.0.23"], - "thumbnails": ["Pillow>=9.5.0, <10.0.0"], + "thumbnails": ["Pillow>=10.0.1, <11"], "vertica": ["sqlalchemy-vertica-python>=0.5.9, < 0.6"], "netezza": ["nzalchemy>=11.0.2"], "starrocks": ["starrocks>=1.0.0"], + "doris": ["pydoris>=1.0.0, <2.0.0"], }, python_requires="~=3.9", author="Apache Software Foundation", diff --git a/superset-embedded-sdk/package-lock.json b/superset-embedded-sdk/package-lock.json index febe3b9ea5706..0112ed3a388e7 100644 --- a/superset-embedded-sdk/package-lock.json +++ b/superset-embedded-sdk/package-lock.json @@ -18,7 +18,7 @@ "@babel/preset-env": "^7.16.11", "@babel/preset-typescript": "^7.16.7", "@types/jest": "^27.4.1", - "axios": "^0.25.0", + "axios": "^1.6.0", "babel-loader": "^8.2.3", "jest": "^27.5.1", "typescript": "^4.5.5", @@ -2977,12 +2977,28 @@ "dev": true }, "node_modules/axios": { - "version": "0.25.0", - "resolved": "https://registry.npmjs.org/axios/-/axios-0.25.0.tgz", - "integrity": 
"sha512-cD8FOb0tRH3uuEe6+evtAbgJtfxr7ly3fQjYcMcuPlgkwVS9xboaVIpcDV+cYQe+yGykgwZCs1pzjntcGa6l5g==", + "version": "1.6.0", + "resolved": "https://registry.npmjs.org/axios/-/axios-1.6.0.tgz", + "integrity": "sha512-EZ1DYihju9pwVB+jg67ogm+Tmqc6JmhamRN6I4Zt8DfZu5lbcQGw3ozH9lFejSJgs/ibaef3A9PMXPLeefFGJg==", "dev": true, "dependencies": { - "follow-redirects": "^1.14.7" + "follow-redirects": "^1.15.0", + "form-data": "^4.0.0", + "proxy-from-env": "^1.1.0" + } + }, + "node_modules/axios/node_modules/form-data": { + "version": "4.0.0", + "resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.0.tgz", + "integrity": "sha512-ETEklSGi5t0QMZuiXoA/Q6vcnxcLQP5vdugSpuAyi6SVGi2clPPp+xgEhuMaHC+zGgn31Kd235W35f7Hykkaww==", + "dev": true, + "dependencies": { + "asynckit": "^0.4.0", + "combined-stream": "^1.0.8", + "mime-types": "^2.1.12" + }, + "engines": { + "node": ">= 6" } }, "node_modules/babel-jest": { @@ -4087,9 +4103,9 @@ } }, "node_modules/follow-redirects": { - "version": "1.14.8", - "resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.14.8.tgz", - "integrity": "sha512-1x0S9UVJHsQprFcEC/qnNzBLcIxsjAV905f/UkQxbclCsoTWlacCNOpQa/anodLl2uaEKFhfWOvM2Qg77+15zA==", + "version": "1.15.3", + "resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.15.3.tgz", + "integrity": "sha512-1VzOtuEM8pC9SFU1E+8KfTjZyMztRsgEfwQl44z8A25uy13jSzTj6dyK2Df52iV0vgHCfBwLhDWevLn95w5v6Q==", "dev": true, "funding": [ { @@ -7021,6 +7037,12 @@ "node": ">= 6" } }, + "node_modules/proxy-from-env": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/proxy-from-env/-/proxy-from-env-1.1.0.tgz", + "integrity": "sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==", + "dev": true + }, "node_modules/psl": { "version": "1.8.0", "resolved": "https://registry.npmjs.org/psl/-/psl-1.8.0.tgz", @@ -10431,12 +10453,27 @@ "dev": true }, "axios": { - "version": "0.25.0", - "resolved": 
"https://registry.npmjs.org/axios/-/axios-0.25.0.tgz", - "integrity": "sha512-cD8FOb0tRH3uuEe6+evtAbgJtfxr7ly3fQjYcMcuPlgkwVS9xboaVIpcDV+cYQe+yGykgwZCs1pzjntcGa6l5g==", + "version": "1.6.0", + "resolved": "https://registry.npmjs.org/axios/-/axios-1.6.0.tgz", + "integrity": "sha512-EZ1DYihju9pwVB+jg67ogm+Tmqc6JmhamRN6I4Zt8DfZu5lbcQGw3ozH9lFejSJgs/ibaef3A9PMXPLeefFGJg==", "dev": true, "requires": { - "follow-redirects": "^1.14.7" + "follow-redirects": "^1.15.0", + "form-data": "^4.0.0", + "proxy-from-env": "^1.1.0" + }, + "dependencies": { + "form-data": { + "version": "4.0.0", + "resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.0.tgz", + "integrity": "sha512-ETEklSGi5t0QMZuiXoA/Q6vcnxcLQP5vdugSpuAyi6SVGi2clPPp+xgEhuMaHC+zGgn31Kd235W35f7Hykkaww==", + "dev": true, + "requires": { + "asynckit": "^0.4.0", + "combined-stream": "^1.0.8", + "mime-types": "^2.1.12" + } + } } }, "babel-jest": { @@ -11279,9 +11316,9 @@ } }, "follow-redirects": { - "version": "1.14.8", - "resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.14.8.tgz", - "integrity": "sha512-1x0S9UVJHsQprFcEC/qnNzBLcIxsjAV905f/UkQxbclCsoTWlacCNOpQa/anodLl2uaEKFhfWOvM2Qg77+15zA==", + "version": "1.15.3", + "resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.15.3.tgz", + "integrity": "sha512-1VzOtuEM8pC9SFU1E+8KfTjZyMztRsgEfwQl44z8A25uy13jSzTj6dyK2Df52iV0vgHCfBwLhDWevLn95w5v6Q==", "dev": true }, "form-data": { @@ -13464,6 +13501,12 @@ "sisteransi": "^1.0.5" } }, + "proxy-from-env": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/proxy-from-env/-/proxy-from-env-1.1.0.tgz", + "integrity": "sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==", + "dev": true + }, "psl": { "version": "1.8.0", "resolved": "https://registry.npmjs.org/psl/-/psl-1.8.0.tgz", diff --git a/superset-embedded-sdk/package.json b/superset-embedded-sdk/package.json index dfe1801ac933e..55ed198598459 100644 --- 
a/superset-embedded-sdk/package.json +++ b/superset-embedded-sdk/package.json @@ -42,7 +42,7 @@ "@babel/preset-env": "^7.16.11", "@babel/preset-typescript": "^7.16.7", "@types/jest": "^27.4.1", - "axios": "^0.25.0", + "axios": "^1.6.0", "babel-loader": "^8.2.3", "jest": "^27.5.1", "typescript": "^4.5.5", diff --git a/superset-frontend/cypress-base/cypress/e2e/alerts_and_reports/alerts.test.ts b/superset-frontend/cypress-base/cypress/e2e/alerts_and_reports/alerts.test.ts index a695541ceecd9..b677507a4602f 100644 --- a/superset-frontend/cypress-base/cypress/e2e/alerts_and_reports/alerts.test.ts +++ b/superset-frontend/cypress-base/cypress/e2e/alerts_and_reports/alerts.test.ts @@ -29,10 +29,9 @@ describe('Alert list view', () => { cy.getBySel('sort-header').eq(2).contains('Name'); cy.getBySel('sort-header').eq(3).contains('Schedule'); cy.getBySel('sort-header').eq(4).contains('Notification method'); - cy.getBySel('sort-header').eq(5).contains('Created by'); - cy.getBySel('sort-header').eq(6).contains('Owners'); - cy.getBySel('sort-header').eq(7).contains('Modified'); - cy.getBySel('sort-header').eq(8).contains('Active'); + cy.getBySel('sort-header').eq(5).contains('Owners'); + cy.getBySel('sort-header').eq(6).contains('Last modified'); + cy.getBySel('sort-header').eq(7).contains('Active'); // TODO Cypress won't recognize the Actions column // cy.getBySel('sort-header').eq(9).contains('Actions'); }); diff --git a/superset-frontend/cypress-base/cypress/e2e/alerts_and_reports/reports.test.ts b/superset-frontend/cypress-base/cypress/e2e/alerts_and_reports/reports.test.ts index e267d76f6f7ed..a227fa03d7da7 100644 --- a/superset-frontend/cypress-base/cypress/e2e/alerts_and_reports/reports.test.ts +++ b/superset-frontend/cypress-base/cypress/e2e/alerts_and_reports/reports.test.ts @@ -29,10 +29,9 @@ describe('Report list view', () => { cy.getBySel('sort-header').eq(2).contains('Name'); cy.getBySel('sort-header').eq(3).contains('Schedule'); 
cy.getBySel('sort-header').eq(4).contains('Notification method'); - cy.getBySel('sort-header').eq(5).contains('Created by'); - cy.getBySel('sort-header').eq(6).contains('Owners'); - cy.getBySel('sort-header').eq(7).contains('Modified'); - cy.getBySel('sort-header').eq(8).contains('Active'); + cy.getBySel('sort-header').eq(5).contains('Owners'); + cy.getBySel('sort-header').eq(6).contains('Last modified'); + cy.getBySel('sort-header').eq(7).contains('Active'); // TODO Cypress won't recognize the Actions column // cy.getBySel('sort-header').eq(9).contains('Actions'); }); diff --git a/superset-frontend/cypress-base/cypress/e2e/chart_list/filter.test.ts b/superset-frontend/cypress-base/cypress/e2e/chart_list/filter.test.ts index acd11669bea18..00b09e2fb8d0f 100644 --- a/superset-frontend/cypress-base/cypress/e2e/chart_list/filter.test.ts +++ b/superset-frontend/cypress-base/cypress/e2e/chart_list/filter.test.ts @@ -35,14 +35,14 @@ describe('Charts filters', () => { setFilter('Owner', 'admin user'); }); - it('should allow filtering by "Created by" correctly', () => { - setFilter('Created by', 'alpha user'); - setFilter('Created by', 'admin user'); + it('should allow filtering by "Modified by" correctly', () => { + setFilter('Modified by', 'alpha user'); + setFilter('Modified by', 'admin user'); }); - it('should allow filtering by "Chart type" correctly', () => { - setFilter('Chart type', 'Area Chart (legacy)'); - setFilter('Chart type', 'Bubble Chart'); + it('should allow filtering by "Type" correctly', () => { + setFilter('Type', 'Area Chart (legacy)'); + setFilter('Type', 'Bubble Chart'); }); it('should allow filtering by "Dataset" correctly', () => { @@ -51,7 +51,7 @@ describe('Charts filters', () => { }); it('should allow filtering by "Dashboards" correctly', () => { - setFilter('Dashboards', 'Unicode Test'); - setFilter('Dashboards', 'Tabbed Dashboard'); + setFilter('Dashboard', 'Unicode Test'); + setFilter('Dashboard', 'Tabbed Dashboard'); }); }); diff --git 
a/superset-frontend/cypress-base/cypress/e2e/chart_list/list.test.ts b/superset-frontend/cypress-base/cypress/e2e/chart_list/list.test.ts index 6664281abe9b9..44f348edc50f5 100644 --- a/superset-frontend/cypress-base/cypress/e2e/chart_list/list.test.ts +++ b/superset-frontend/cypress-base/cypress/e2e/chart_list/list.test.ts @@ -109,14 +109,12 @@ describe('Charts list', () => { it('should load rows in list mode', () => { cy.getBySel('listview-table').should('be.visible'); - cy.getBySel('sort-header').eq(1).contains('Chart'); - cy.getBySel('sort-header').eq(2).contains('Visualization type'); + cy.getBySel('sort-header').eq(1).contains('Name'); + cy.getBySel('sort-header').eq(2).contains('Type'); cy.getBySel('sort-header').eq(3).contains('Dataset'); - // cy.getBySel('sort-header').eq(4).contains('Dashboards added to'); - cy.getBySel('sort-header').eq(4).contains('Modified by'); + cy.getBySel('sort-header').eq(4).contains('Owners'); cy.getBySel('sort-header').eq(5).contains('Last modified'); - cy.getBySel('sort-header').eq(6).contains('Created by'); - cy.getBySel('sort-header').eq(7).contains('Actions'); + cy.getBySel('sort-header').eq(6).contains('Actions'); }); it('should sort correctly in list mode', () => { diff --git a/superset-frontend/cypress-base/cypress/e2e/dashboard/editmode.test.ts b/superset-frontend/cypress-base/cypress/e2e/dashboard/editmode.test.ts index 812ad945dad45..62bab84d1b85c 100644 --- a/superset-frontend/cypress-base/cypress/e2e/dashboard/editmode.test.ts +++ b/superset-frontend/cypress-base/cypress/e2e/dashboard/editmode.test.ts @@ -515,7 +515,7 @@ describe('Dashboard edit', () => { // label Anthony cy.get('[data-test-chart-name="Trends"] .line .nv-legend-symbol') .eq(2) - .should('have.css', 'fill', 'rgb(0, 122, 135)'); + .should('have.css', 'fill', 'rgb(244, 176, 42)'); // open main tab and nested tab openTab(0, 0); @@ -526,7 +526,7 @@ describe('Dashboard edit', () => { '[data-test-chart-name="Top 10 California Names Timeseries"] .line 
.nv-legend-symbol', ) .first() - .should('have.css', 'fill', 'rgb(0, 122, 135)'); + .should('have.css', 'fill', 'rgb(244, 176, 42)'); }); it('should apply the color scheme across main tabs', () => { @@ -557,7 +557,7 @@ describe('Dashboard edit', () => { cy.get('[data-test-chart-name="Trends"] .line .nv-legend-symbol') .first() - .should('have.css', 'fill', 'rgb(204, 0, 134)'); + .should('have.css', 'fill', 'rgb(156, 52, 152)'); // change scheme now that charts are rendered across the main tabs editDashboard(); diff --git a/superset-frontend/cypress-base/cypress/e2e/dashboard/nativeFilters.test.ts b/superset-frontend/cypress-base/cypress/e2e/dashboard/nativeFilters.test.ts index e8457ba94b61a..4e0309a2daac1 100644 --- a/superset-frontend/cypress-base/cypress/e2e/dashboard/nativeFilters.test.ts +++ b/superset-frontend/cypress-base/cypress/e2e/dashboard/nativeFilters.test.ts @@ -113,7 +113,7 @@ function prepareDashboardFilters( }, type: 'NATIVE_FILTER', description: '', - chartsInScope: [6], + chartsInScope: [5], tabsInScope: [], }); }); @@ -150,7 +150,7 @@ function prepareDashboardFilters( meta: { width: 4, height: 50, - chartId: 6, + chartId: 5, sliceName: 'Most Populated Countries', }, }, diff --git a/superset-frontend/cypress-base/cypress/e2e/dashboard/tabs.test.ts b/superset-frontend/cypress-base/cypress/e2e/dashboard/tabs.test.ts index 6fc89c1446fb8..ba442e600ae60 100644 --- a/superset-frontend/cypress-base/cypress/e2e/dashboard/tabs.test.ts +++ b/superset-frontend/cypress-base/cypress/e2e/dashboard/tabs.test.ts @@ -25,7 +25,6 @@ import { TABBED_DASHBOARD } from 'cypress/utils/urls'; import { expandFilterOnLeftPanel } from './utils'; const TREEMAP = { name: 'Treemap', viz: 'treemap_v2' }; -const FILTER_BOX = { name: 'Region Filter', viz: 'filter_box' }; const LINE_CHART = { name: 'Growth Rate', viz: 'line' }; const BOX_PLOT = { name: 'Box plot', viz: 'box_plot' }; const BIG_NUMBER = { name: 'Number of Girls', viz: 'big_number_total' }; @@ -41,7 +40,6 @@ function 
topLevelTabs() { function resetTabs() { topLevelTabs(); cy.get('@top-level-tabs').first().click(); - waitForChartLoad(FILTER_BOX); waitForChartLoad(TREEMAP); waitForChartLoad(BIG_NUMBER); waitForChartLoad(TABLE); @@ -96,7 +94,6 @@ describe('Dashboard tabs', () => { it.skip('should send new queries when tab becomes visible', () => { // landing in first tab - waitForChartLoad(FILTER_BOX); waitForChartLoad(TREEMAP); getChartAliasBySpec(TREEMAP).then(treemapAlias => { diff --git a/superset-frontend/cypress-base/cypress/e2e/dashboard/utils.ts b/superset-frontend/cypress-base/cypress/e2e/dashboard/utils.ts index ca539039cf6e6..c63df51d10a6d 100644 --- a/superset-frontend/cypress-base/cypress/e2e/dashboard/utils.ts +++ b/superset-frontend/cypress-base/cypress/e2e/dashboard/utils.ts @@ -23,7 +23,6 @@ import { ChartSpec, waitForChartLoad } from 'cypress/utils'; export const WORLD_HEALTH_CHARTS = [ { name: '% Rural', viz: 'world_map' }, { name: 'Most Populated Countries', viz: 'table' }, - { name: 'Region Filter', viz: 'filter_box' }, { name: "World's Population", viz: 'big_number' }, { name: 'Growth Rate', viz: 'line' }, { name: 'Rural Breakdown', viz: 'sunburst' }, diff --git a/superset-frontend/cypress-base/cypress/e2e/dashboard_list/filter.test.ts b/superset-frontend/cypress-base/cypress/e2e/dashboard_list/filter.test.ts index 4654b3b5c2634..854ea541c74e3 100644 --- a/superset-frontend/cypress-base/cypress/e2e/dashboard_list/filter.test.ts +++ b/superset-frontend/cypress-base/cypress/e2e/dashboard_list/filter.test.ts @@ -35,9 +35,9 @@ describe('Dashboards filters', () => { setFilter('Owner', 'admin user'); }); - it('should allow filtering by "Created by" correctly', () => { - setFilter('Created by', 'alpha user'); - setFilter('Created by', 'admin user'); + it('should allow filtering by "Modified by" correctly', () => { + setFilter('Modified by', 'alpha user'); + setFilter('Modified by', 'admin user'); }); it('should allow filtering by "Status" correctly', () => { diff 
--git a/superset-frontend/cypress-base/cypress/e2e/dashboard_list/list.test.ts b/superset-frontend/cypress-base/cypress/e2e/dashboard_list/list.test.ts index 9bc6eed224578..7dfb7cd673d7f 100644 --- a/superset-frontend/cypress-base/cypress/e2e/dashboard_list/list.test.ts +++ b/superset-frontend/cypress-base/cypress/e2e/dashboard_list/list.test.ts @@ -54,13 +54,11 @@ describe('Dashboards list', () => { it('should load rows in list mode', () => { cy.getBySel('listview-table').should('be.visible'); - cy.getBySel('sort-header').eq(1).contains('Title'); - cy.getBySel('sort-header').eq(2).contains('Modified by'); - cy.getBySel('sort-header').eq(3).contains('Status'); - cy.getBySel('sort-header').eq(4).contains('Modified'); - cy.getBySel('sort-header').eq(5).contains('Created by'); - cy.getBySel('sort-header').eq(6).contains('Owners'); - cy.getBySel('sort-header').eq(7).contains('Actions'); + cy.getBySel('sort-header').eq(1).contains('Name'); + cy.getBySel('sort-header').eq(2).contains('Status'); + cy.getBySel('sort-header').eq(3).contains('Owners'); + cy.getBySel('sort-header').eq(4).contains('Last modified'); + cy.getBySel('sort-header').eq(5).contains('Actions'); }); it('should sort correctly in list mode', () => { diff --git a/superset-frontend/cypress-base/cypress/e2e/explore/visualizations/dist_bar.test.js b/superset-frontend/cypress-base/cypress/e2e/explore/visualizations/dist_bar.test.js index 770e1e1c04d38..591ba31776935 100644 --- a/superset-frontend/cypress-base/cypress/e2e/explore/visualizations/dist_bar.test.js +++ b/superset-frontend/cypress-base/cypress/e2e/explore/visualizations/dist_bar.test.js @@ -89,6 +89,6 @@ describe('Visualization > Distribution bar chart', () => { ).should('exist'); cy.get('.dist_bar .nv-legend .nv-legend-symbol') .first() - .should('have.css', 'fill', 'rgb(255, 90, 95)'); + .should('have.css', 'fill', 'rgb(41, 105, 107)'); }); }); diff --git a/superset-frontend/cypress-base/cypress/e2e/explore/visualizations/line.test.ts 
b/superset-frontend/cypress-base/cypress/e2e/explore/visualizations/line.test.ts index 5cc398c7f3ef7..8499db5946818 100644 --- a/superset-frontend/cypress-base/cypress/e2e/explore/visualizations/line.test.ts +++ b/superset-frontend/cypress-base/cypress/e2e/explore/visualizations/line.test.ts @@ -85,7 +85,7 @@ describe('Visualization > Line', () => { ).should('exist'); cy.get('.line .nv-legend .nv-legend-symbol') .first() - .should('have.css', 'fill', 'rgb(255, 90, 95)'); + .should('have.css', 'fill', 'rgb(41, 105, 107)'); }); it('should work with adhoc metric', () => { diff --git a/superset-frontend/cypress-base/cypress/support/e2e.ts b/superset-frontend/cypress-base/cypress/support/e2e.ts index 6642e0120cca4..cccc7b2005737 100644 --- a/superset-frontend/cypress-base/cypress/support/e2e.ts +++ b/superset-frontend/cypress-base/cypress/support/e2e.ts @@ -18,7 +18,7 @@ */ import '@cypress/code-coverage/support'; import '@applitools/eyes-cypress/commands'; -import failOnConsoleError, { Config } from 'cypress-fail-on-console-error'; +import failOnConsoleError from 'cypress-fail-on-console-error'; require('cy-verify-downloads').addCustomCommand(); diff --git a/superset-frontend/lerna.json b/superset-frontend/lerna.json index 3a16712db29da..07bef7fcfdd67 100644 --- a/superset-frontend/lerna.json +++ b/superset-frontend/lerna.json @@ -1,7 +1,7 @@ { "lerna": "3.2.1", "npmClient": "npm", - "packages": ["packages/*", "plugins/*"], + "packages": ["packages/*", "plugins/*", "src/setup/*"], "useWorkspaces": true, "version": "0.18.25", "ignoreChanges": [ diff --git a/superset-frontend/package-lock.json b/superset-frontend/package-lock.json index ac49a706fef5e..444dc253ff313 100644 --- a/superset-frontend/package-lock.json +++ b/superset-frontend/package-lock.json @@ -10,7 +10,8 @@ "license": "Apache-2.0", "workspaces": [ "packages/*", - "plugins/*" + "plugins/*", + "src/setup/*" ], "dependencies": { "@ant-design/icons": "^5.0.1", @@ -186,6 +187,7 @@ 
"@testing-library/react-hooks": "^5.0.3", "@testing-library/user-event": "^12.7.0", "@types/classnames": "^2.2.10", + "@types/dom-to-image": "^2.6.7", "@types/enzyme": "^3.10.5", "@types/enzyme-adapter-react-16": "^1.0.6", "@types/fetch-mock": "^7.3.2", @@ -260,7 +262,7 @@ "less-loader": "^10.2.0", "mini-css-extract-plugin": "^2.7.6", "mock-socket": "^9.0.3", - "node-fetch": "^2.6.1", + "node-fetch": "^2.6.7", "prettier": "^2.4.1", "prettier-plugin-packagejson": "^2.2.15", "process": "^0.11.10", @@ -19248,6 +19250,12 @@ "@types/ms": "*" } }, + "node_modules/@types/dom-to-image": { + "version": "2.6.7", + "resolved": "https://registry.npmjs.org/@types/dom-to-image/-/dom-to-image-2.6.7.tgz", + "integrity": "sha512-me5VbCv+fcXozblWwG13krNBvuEOm6kA5xoa4RrjDJCNFOZSWR3/QLtOXimBHk1Fisq69Gx3JtOoXtg1N1tijg==", + "dev": true + }, "node_modules/@types/enzyme": { "version": "3.10.10", "resolved": "https://registry.npmjs.org/@types/enzyme/-/enzyme-3.10.10.tgz", @@ -29516,9 +29524,9 @@ "license": "MIT" }, "node_modules/dom-to-image-more": { - "version": "2.10.1", - "resolved": "https://registry.npmjs.org/dom-to-image-more/-/dom-to-image-more-2.10.1.tgz", - "integrity": "sha512-gMG28V47WGj5/xvrsbSPJAWSaV7CBh4teLErn1iGD1sa29HsFsHxvnoLj8VxVvfqnjPgsiUGs2IV2VAxLJGb+A==" + "version": "2.16.0", + "resolved": "https://registry.npmjs.org/dom-to-image-more/-/dom-to-image-more-2.16.0.tgz", + "integrity": "sha512-RyjtkaM/zVy90uJ20lT+/G7MwBZx6l/ePliq5CQOeAnPeew7aUGS6IqRWBkHpstU+POmhaKA8A9H9qf476gisQ==" }, "node_modules/dom-to-pdf": { "version": "0.3.2", @@ -47777,9 +47785,9 @@ "dev": true }, "node_modules/nx/node_modules/axios": { - "version": "1.4.0", - "resolved": "https://registry.npmjs.org/axios/-/axios-1.4.0.tgz", - "integrity": "sha512-S4XCWMEmzvo64T9GfvQDOXgYRDJ/wsSZc7Jvdgx5u1sd0JwsuPLqb3SYmusag+edF6ziyMensPVqLTSc1PiSEA==", + "version": "1.6.1", + "resolved": "https://registry.npmjs.org/axios/-/axios-1.6.1.tgz", + "integrity": 
"sha512-vfBmhDpKafglh0EldBEbVuoe7DyAavGSLWhuSm5ZSEKQnHhBf0xAAwybbNH1IkrJNGnS/VG4I5yxig1pCEXE4g==", "dev": true, "dependencies": { "follow-redirects": "^1.15.0", @@ -79213,6 +79221,12 @@ "@types/ms": "*" } }, + "@types/dom-to-image": { + "version": "2.6.7", + "resolved": "https://registry.npmjs.org/@types/dom-to-image/-/dom-to-image-2.6.7.tgz", + "integrity": "sha512-me5VbCv+fcXozblWwG13krNBvuEOm6kA5xoa4RrjDJCNFOZSWR3/QLtOXimBHk1Fisq69Gx3JtOoXtg1N1tijg==", + "dev": true + }, "@types/enzyme": { "version": "3.10.10", "resolved": "https://registry.npmjs.org/@types/enzyme/-/enzyme-3.10.10.tgz", @@ -87334,9 +87348,9 @@ "from": "dom-to-image@git+https://github.com/dmapper/dom-to-image.git" }, "dom-to-image-more": { - "version": "2.10.1", - "resolved": "https://registry.npmjs.org/dom-to-image-more/-/dom-to-image-more-2.10.1.tgz", - "integrity": "sha512-gMG28V47WGj5/xvrsbSPJAWSaV7CBh4teLErn1iGD1sa29HsFsHxvnoLj8VxVvfqnjPgsiUGs2IV2VAxLJGb+A==" + "version": "2.16.0", + "resolved": "https://registry.npmjs.org/dom-to-image-more/-/dom-to-image-more-2.16.0.tgz", + "integrity": "sha512-RyjtkaM/zVy90uJ20lT+/G7MwBZx6l/ePliq5CQOeAnPeew7aUGS6IqRWBkHpstU+POmhaKA8A9H9qf476gisQ==" }, "dom-to-pdf": { "version": "0.3.2", @@ -101318,9 +101332,9 @@ "dev": true }, "axios": { - "version": "1.4.0", - "resolved": "https://registry.npmjs.org/axios/-/axios-1.4.0.tgz", - "integrity": "sha512-S4XCWMEmzvo64T9GfvQDOXgYRDJ/wsSZc7Jvdgx5u1sd0JwsuPLqb3SYmusag+edF6ziyMensPVqLTSc1PiSEA==", + "version": "1.6.1", + "resolved": "https://registry.npmjs.org/axios/-/axios-1.6.1.tgz", + "integrity": "sha512-vfBmhDpKafglh0EldBEbVuoe7DyAavGSLWhuSm5ZSEKQnHhBf0xAAwybbNH1IkrJNGnS/VG4I5yxig1pCEXE4g==", "dev": true, "requires": { "follow-redirects": "^1.15.0", diff --git a/superset-frontend/package.json b/superset-frontend/package.json index 229fe0797e07a..18b8b0971be9e 100644 --- a/superset-frontend/package.json +++ b/superset-frontend/package.json @@ -1,6 +1,6 @@ { "name": "superset", - "version": "0.0.0-dev", + 
"version": "3.1.0", "description": "Superset is a data exploration platform designed to be visual, intuitive, and interactive.", "keywords": [ "big", @@ -33,7 +33,8 @@ }, "workspaces": [ "packages/*", - "plugins/*" + "plugins/*", + "src/setup/*" ], "scripts": { "_lint": "eslint --ignore-path=.eslintignore --ext .js,.jsx,.ts,tsx .", @@ -251,6 +252,7 @@ "@testing-library/react-hooks": "^5.0.3", "@testing-library/user-event": "^12.7.0", "@types/classnames": "^2.2.10", + "@types/dom-to-image": "^2.6.7", "@types/enzyme": "^3.10.5", "@types/enzyme-adapter-react-16": "^1.0.6", "@types/fetch-mock": "^7.3.2", @@ -325,7 +327,7 @@ "less-loader": "^10.2.0", "mini-css-extract-plugin": "^2.7.6", "mock-socket": "^9.0.3", - "node-fetch": "^2.6.1", + "node-fetch": "^2.6.7", "prettier": "^2.4.1", "prettier-plugin-packagejson": "^2.2.15", "process": "^0.11.10", diff --git a/superset-frontend/packages/superset-ui-chart-controls/src/constants.ts b/superset-frontend/packages/superset-ui-chart-controls/src/constants.ts index cbde46b0ef4df..1aefc25464bb5 100644 --- a/superset-frontend/packages/superset-ui-chart-controls/src/constants.ts +++ b/superset-frontend/packages/superset-ui-chart-controls/src/constants.ts @@ -17,15 +17,16 @@ * under the License. 
*/ import { - t, - QueryMode, DTTM_ALIAS, GenericDataType, QueryColumn, - DatasourceType, + QueryMode, + t, } from '@superset-ui/core'; import { ColumnMeta, SortSeriesData, SortSeriesType } from './types'; +export const DEFAULT_MAX_ROW = 100000; + // eslint-disable-next-line import/prefer-default-export export const TIME_FILTER_LABELS = { time_range: t('Time Range'), @@ -41,6 +42,7 @@ export const COLUMN_NAME_ALIASES: Record<string, string> = { export const DATASET_TIME_COLUMN_OPTION: ColumnMeta = { verbose_name: COLUMN_NAME_ALIASES[DTTM_ALIAS], column_name: DTTM_ALIAS, + type: 'TIMESTAMP', type_generic: GenericDataType.TEMPORAL, description: t( 'A reference to the [Time] configuration, taking granularity into account', @@ -49,8 +51,9 @@ export const DATASET_TIME_COLUMN_OPTION: ColumnMeta = { export const QUERY_TIME_COLUMN_OPTION: QueryColumn = { column_name: DTTM_ALIAS, - type: DatasourceType.Query, - is_dttm: false, + is_dttm: true, + type: 'TIMESTAMP', + type_generic: GenericDataType.TEMPORAL, }; export const QueryModeLabel = { diff --git a/superset-frontend/packages/superset-ui-chart-controls/src/fixtures.ts b/superset-frontend/packages/superset-ui-chart-controls/src/fixtures.ts index cc0b678f6db6c..b12d5be048e8a 100644 --- a/superset-frontend/packages/superset-ui-chart-controls/src/fixtures.ts +++ b/superset-frontend/packages/superset-ui-chart-controls/src/fixtures.ts @@ -16,7 +16,7 @@ * specific language governing permissions and limitations * under the License. 
*/ -import { DatasourceType } from '@superset-ui/core'; +import { DatasourceType, GenericDataType } from '@superset-ui/core'; import { Dataset } from './types'; export const TestDataset: Dataset = { @@ -37,7 +37,7 @@ export const TestDataset: Dataset = { is_dttm: false, python_date_format: null, type: 'BIGINT', - type_generic: 0, + type_generic: GenericDataType.NUMERIC, verbose_name: null, warning_markdown: null, }, @@ -55,7 +55,7 @@ export const TestDataset: Dataset = { is_dttm: false, python_date_format: null, type: 'VARCHAR(16)', - type_generic: 1, + type_generic: GenericDataType.STRING, verbose_name: '', warning_markdown: null, }, @@ -73,7 +73,7 @@ export const TestDataset: Dataset = { is_dttm: false, python_date_format: null, type: 'VARCHAR(10)', - type_generic: 1, + type_generic: GenericDataType.STRING, verbose_name: null, warning_markdown: null, }, @@ -91,7 +91,7 @@ export const TestDataset: Dataset = { is_dttm: true, python_date_format: null, type: 'TIMESTAMP WITHOUT TIME ZONE', - type_generic: 2, + type_generic: GenericDataType.TEMPORAL, verbose_name: null, warning_markdown: null, }, @@ -109,7 +109,7 @@ export const TestDataset: Dataset = { is_dttm: false, python_date_format: null, type: 'VARCHAR(255)', - type_generic: 1, + type_generic: GenericDataType.STRING, verbose_name: null, warning_markdown: null, }, diff --git a/superset-frontend/packages/superset-ui-chart-controls/src/sections/echartsTimeSeriesQuery.tsx b/superset-frontend/packages/superset-ui-chart-controls/src/sections/echartsTimeSeriesQuery.tsx index 53c9aa7447f31..596c6f0e55cab 100644 --- a/superset-frontend/packages/superset-ui-chart-controls/src/sections/echartsTimeSeriesQuery.tsx +++ b/superset-frontend/packages/superset-ui-chart-controls/src/sections/echartsTimeSeriesQuery.tsx @@ -20,6 +20,7 @@ import { hasGenericChartAxes, t } from '@superset-ui/core'; import { ControlPanelSectionConfig, ControlSetRow } from '../types'; import { contributionModeControl, + xAxisForceCategoricalControl, 
xAxisSortAscControl, xAxisSortControl, xAxisSortSeriesAscendingControl, @@ -55,6 +56,7 @@ export const echartsTimeSeriesQueryWithXAxisSort: ControlPanelSectionConfig = { controlSetRows: [ [hasGenericChartAxes ? 'x_axis' : null], [hasGenericChartAxes ? 'time_grain_sqla' : null], + [hasGenericChartAxes ? xAxisForceCategoricalControl : null], [hasGenericChartAxes ? xAxisSortControl : null], [hasGenericChartAxes ? xAxisSortAscControl : null], [hasGenericChartAxes ? xAxisSortSeriesControl : null], diff --git a/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/customControls.tsx b/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/customControls.tsx index 82ba6dfeebe45..b0f3006353d64 100644 --- a/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/customControls.tsx +++ b/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/customControls.tsx @@ -20,9 +20,9 @@ import { ContributionType, ensureIsArray, + GenericDataType, getColumnLabel, getMetricLabel, - isDefined, QueryFormColumn, QueryFormMetric, t, @@ -38,6 +38,7 @@ import { DEFAULT_XAXIS_SORT_SERIES_DATA, SORT_SERIES_CHOICES, } from '../constants'; +import { checkColumnType } from '../utils/checkColumnType'; export const contributionModeControl = { name: 'contributionMode', @@ -54,18 +55,29 @@ export const contributionModeControl = { }, }; -function isTemporal(controls: ControlStateMapping): boolean { - return !( - isDefined(controls?.x_axis?.value) && - !isTemporalColumn( +function isForcedCategorical(controls: ControlStateMapping): boolean { + return ( + checkColumnType( getColumnLabel(controls?.x_axis?.value as QueryFormColumn), controls?.datasource?.datasource, + [GenericDataType.NUMERIC], + ) && !!controls?.xAxisForceCategorical?.value + ); +} + +function isSortable(controls: ControlStateMapping): boolean { + return ( + isForcedCategorical(controls) || + checkColumnType( + getColumnLabel(controls?.x_axis?.value as 
QueryFormColumn), + controls?.datasource?.datasource, + [GenericDataType.STRING, GenericDataType.BOOLEAN], ) ); } const xAxisSortVisibility = ({ controls }: { controls: ControlStateMapping }) => - !isTemporal(controls) && + isSortable(controls) && ensureIsArray(controls?.groupby?.value).length === 0 && ensureIsArray(controls?.metrics?.value).length === 1; @@ -74,7 +86,7 @@ const xAxisMultiSortVisibility = ({ }: { controls: ControlStateMapping; }) => - !isTemporal(controls) && + isSortable(controls) && (!!ensureIsArray(controls?.groupby?.value).length || ensureIsArray(controls?.metrics?.value).length > 1); @@ -141,7 +153,29 @@ export const xAxisSortAscControl = { : t('X-Axis Sort Ascending'), default: true, description: t('Whether to sort ascending or descending on the base Axis.'), - visibility: xAxisSortVisibility, + visibility: ({ controls }: { controls: ControlStateMapping }) => + controls?.x_axis_sort?.value !== undefined && + xAxisSortVisibility({ controls }), + }, +}; + +export const xAxisForceCategoricalControl = { + name: 'xAxisForceCategorical', + config: { + type: 'CheckboxControl', + label: () => t('Force categorical'), + default: false, + description: t('Treat values as categorical.'), + initialValue: (control: ControlState, state: ControlPanelState | null) => + state?.form_data?.x_axis_sort !== undefined || control.value, + renderTrigger: true, + visibility: ({ controls }: { controls: ControlStateMapping }) => + checkColumnType( + getColumnLabel(controls?.x_axis?.value as QueryFormColumn), + controls?.datasource?.datasource, + [GenericDataType.NUMERIC], + ), + shouldMapStateToProps: () => true, }, }; @@ -173,6 +207,8 @@ export const xAxisSortSeriesAscendingControl = { default: DEFAULT_XAXIS_SORT_SERIES_DATA.sort_series_ascending, description: t('Whether to sort ascending or descending on the base Axis.'), renderTrigger: true, - visibility: xAxisMultiSortVisibility, + visibility: ({ controls }: { controls: ControlStateMapping }) => + 
controls?.x_axis_sort_series?.value !== undefined && + xAxisMultiSortVisibility({ controls }), }, }; diff --git a/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/sharedControls.tsx b/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/sharedControls.tsx index 57419b491899f..341aaa0729f08 100644 --- a/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/sharedControls.tsx +++ b/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/sharedControls.tsx @@ -47,6 +47,7 @@ import { isDefined, hasGenericChartAxes, NO_TIME_RANGE, + validateMaxValue, } from '@superset-ui/core'; import { @@ -58,7 +59,7 @@ import { DEFAULT_TIME_FORMAT, DEFAULT_NUMBER_FORMAT, } from '../utils'; -import { TIME_FILTER_LABELS } from '../constants'; +import { DEFAULT_MAX_ROW, TIME_FILTER_LABELS } from '../constants'; import { SharedControlConfig, Dataset, @@ -243,7 +244,12 @@ const row_limit: SharedControlConfig<'SelectControl'> = { type: 'SelectControl', freeForm: true, label: t('Row limit'), - validators: [legacyValidateInteger], + clearable: false, + validators: [ + legacyValidateInteger, + (v, state) => + validateMaxValue(v, state?.common?.conf?.SQL_MAX_ROW || DEFAULT_MAX_ROW), + ], default: 10000, choices: formatSelectOptions(ROW_LIMIT_OPTIONS), description: t( diff --git a/superset-frontend/packages/superset-ui-chart-controls/src/types.ts b/superset-frontend/packages/superset-ui-chart-controls/src/types.ts index 09e4f63ee3d37..9314f8d33f3b9 100644 --- a/superset-frontend/packages/superset-ui-chart-controls/src/types.ts +++ b/superset-frontend/packages/superset-ui-chart-controls/src/types.ts @@ -479,13 +479,15 @@ export function isControlPanelSectionConfig( export function isDataset( datasource: Dataset | QueryResponse | null | undefined, ): datasource is Dataset { - return !!datasource && 'columns' in datasource; + return ( + !!datasource && 'columns' in datasource && !('sqlEditorId' in datasource) + ); } 
export function isQueryResponse( datasource: Dataset | QueryResponse | null | undefined, ): datasource is QueryResponse { - return !!datasource && 'results' in datasource && 'sql' in datasource; + return !!datasource && 'results' in datasource && 'sqlEditorId' in datasource; } export enum SortSeriesType { diff --git a/superset-frontend/packages/superset-ui-chart-controls/src/utils/checkColumnType.ts b/superset-frontend/packages/superset-ui-chart-controls/src/utils/checkColumnType.ts new file mode 100644 index 0000000000000..202b9605d545d --- /dev/null +++ b/superset-frontend/packages/superset-ui-chart-controls/src/utils/checkColumnType.ts @@ -0,0 +1,49 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +import { ensureIsArray, GenericDataType, ValueOf } from '@superset-ui/core'; +import { + ControlPanelState, + isDataset, + isQueryResponse, +} from '@superset-ui/chart-controls'; + +export function checkColumnType( + columnName: string, + datasource: ValueOf<Pick<ControlPanelState, 'datasource'>>, + columnTypes: GenericDataType[], +): boolean { + if (isDataset(datasource)) { + return ensureIsArray(datasource.columns).some( + c => + c.type_generic !== undefined && + columnTypes.includes(c.type_generic) && + columnName === c.column_name, + ); + } + if (isQueryResponse(datasource)) { + return ensureIsArray(datasource.columns) + .filter( + c => + c.type_generic !== undefined && columnTypes.includes(c.type_generic), + ) + .map(c => c.column_name) + .some(c => columnName === c); + } + return false; +} diff --git a/superset-frontend/packages/superset-ui-chart-controls/src/utils/columnChoices.ts b/superset-frontend/packages/superset-ui-chart-controls/src/utils/columnChoices.ts index c76cd79031c23..f0561517502c5 100644 --- a/superset-frontend/packages/superset-ui-chart-controls/src/utils/columnChoices.ts +++ b/superset-frontend/packages/superset-ui-chart-controls/src/utils/columnChoices.ts @@ -17,7 +17,7 @@ * under the License. */ import { QueryResponse } from '@superset-ui/core'; -import { Dataset, isColumnMeta, isDataset } from '../types'; +import { Dataset, isDataset, isQueryResponse } from '../types'; /** * Convert Datasource columns to column choices @@ -25,11 +25,13 @@ import { Dataset, isColumnMeta, isDataset } from '../types'; export default function columnChoices( datasource?: Dataset | QueryResponse | null, ): [string, string][] { - if (isDataset(datasource) && isColumnMeta(datasource.columns[0])) { + if (isDataset(datasource) || isQueryResponse(datasource)) { return datasource.columns .map((col): [string, string] => [ col.column_name, - col.verbose_name || col.column_name, + 'verbose_name' in col + ? 
col.verbose_name || col.column_name + : col.column_name, ]) .sort((opt1, opt2) => opt1[1].toLowerCase() > opt2[1].toLowerCase() ? 1 : -1, diff --git a/superset-frontend/packages/superset-ui-chart-controls/src/utils/index.ts b/superset-frontend/packages/superset-ui-chart-controls/src/utils/index.ts index 4fa4243c1e850..208d708a96854 100644 --- a/superset-frontend/packages/superset-ui-chart-controls/src/utils/index.ts +++ b/superset-frontend/packages/superset-ui-chart-controls/src/utils/index.ts @@ -16,6 +16,7 @@ * specific language governing permissions and limitations * under the License. */ +export * from './checkColumnType'; export * from './selectOptions'; export * from './D3Formatting'; export * from './expandControlConfig'; diff --git a/superset-frontend/packages/superset-ui-chart-controls/test/utils/checkColumnType.test.ts b/superset-frontend/packages/superset-ui-chart-controls/test/utils/checkColumnType.test.ts new file mode 100644 index 0000000000000..44ebbf605cfaa --- /dev/null +++ b/superset-frontend/packages/superset-ui-chart-controls/test/utils/checkColumnType.test.ts @@ -0,0 +1,48 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +import { GenericDataType, testQueryResponse } from '@superset-ui/core'; +import { checkColumnType, TestDataset } from '../../src'; + +test('checkColumnType columns from a Dataset', () => { + expect( + checkColumnType('num', TestDataset, [GenericDataType.NUMERIC]), + ).toEqual(true); + expect(checkColumnType('num', TestDataset, [GenericDataType.STRING])).toEqual( + false, + ); + expect( + checkColumnType('gender', TestDataset, [GenericDataType.STRING]), + ).toEqual(true); + expect( + checkColumnType('gender', TestDataset, [GenericDataType.NUMERIC]), + ).toEqual(false); +}); + +test('checkColumnType from a QueryResponse', () => { + expect( + checkColumnType('Column 1', testQueryResponse, [GenericDataType.STRING]), + ).toEqual(true); + expect( + checkColumnType('Column 1', testQueryResponse, [GenericDataType.NUMERIC]), + ).toEqual(false); +}); + +test('checkColumnType from null', () => { + expect(checkColumnType('col', null, [])).toEqual(false); +}); diff --git a/superset-frontend/packages/superset-ui-chart-controls/test/utils/columnChoices.test.tsx b/superset-frontend/packages/superset-ui-chart-controls/test/utils/columnChoices.test.tsx index 70018ddc67fa1..de5bb1ab6980d 100644 --- a/superset-frontend/packages/superset-ui-chart-controls/test/utils/columnChoices.test.tsx +++ b/superset-frontend/packages/superset-ui-chart-controls/test/utils/columnChoices.test.tsx @@ -16,7 +16,11 @@ * specific language governing permissions and limitations * under the License. 
*/ -import { DatasourceType, testQueryResponse } from '@superset-ui/core'; +import { + DatasourceType, + GenericDataType, + testQueryResponse, +} from '@superset-ui/core'; import { columnChoices } from '../../src'; describe('columnChoices()', () => { @@ -31,14 +35,20 @@ describe('columnChoices()', () => { columns: [ { column_name: 'fiz', + type: 'INT', + type_generic: GenericDataType.NUMERIC, }, { column_name: 'about', verbose_name: 'right', + type: 'VARCHAR', + type_generic: GenericDataType.STRING, }, { column_name: 'foo', - verbose_name: 'bar', + verbose_name: undefined, + type: 'TIMESTAMP', + type_generic: GenericDataType.TEMPORAL, }, ], verbose_map: {}, @@ -48,8 +58,8 @@ describe('columnChoices()', () => { description: 'this is my datasource', }), ).toEqual([ - ['foo', 'bar'], ['fiz', 'fiz'], + ['foo', 'foo'], ['about', 'right'], ]); }); diff --git a/superset-frontend/packages/superset-ui-chart-controls/test/utils/getTemporalColumns.test.ts b/superset-frontend/packages/superset-ui-chart-controls/test/utils/getTemporalColumns.test.ts index 7227173045b52..f0a6529de7122 100644 --- a/superset-frontend/packages/superset-ui-chart-controls/test/utils/getTemporalColumns.test.ts +++ b/superset-frontend/packages/superset-ui-chart-controls/test/utils/getTemporalColumns.test.ts @@ -16,7 +16,11 @@ * specific language governing permissions and limitations * under the License. 
*/ -import { testQueryResponse, testQueryResults } from '@superset-ui/core'; +import { + GenericDataType, + testQueryResponse, + testQueryResults, +} from '@superset-ui/core'; import { Dataset, getTemporalColumns, @@ -55,8 +59,9 @@ test('get temporal columns from a QueryResponse', () => { temporalColumns: [ { column_name: 'Column 2', - type: 'TIMESTAMP', is_dttm: true, + type: 'TIMESTAMP', + type_generic: GenericDataType.TEMPORAL, }, ], defaultTemporalColumn: 'Column 2', diff --git a/superset-frontend/packages/superset-ui-core/src/chart/index.ts b/superset-frontend/packages/superset-ui-core/src/chart/index.ts index c1588023a324a..bc4b5a20bffee 100644 --- a/superset-frontend/packages/superset-ui-core/src/chart/index.ts +++ b/superset-frontend/packages/superset-ui-core/src/chart/index.ts @@ -20,7 +20,7 @@ export { default as ChartClient } from './clients/ChartClient'; export { default as ChartMetadata } from './models/ChartMetadata'; export { default as ChartPlugin } from './models/ChartPlugin'; -export { default as ChartProps } from './models/ChartProps'; +export { default as ChartProps, ChartPropsConfig } from './models/ChartProps'; export { default as createLoadableRenderer } from './components/createLoadableRenderer'; export { default as reactify } from './components/reactify'; diff --git a/superset-frontend/packages/superset-ui-core/src/chart/models/ChartMetadata.ts b/superset-frontend/packages/superset-ui-core/src/chart/models/ChartMetadata.ts index 34f373f0f4891..dcb1de62a5c62 100644 --- a/superset-frontend/packages/superset-ui-core/src/chart/models/ChartMetadata.ts +++ b/superset-frontend/packages/superset-ui-core/src/chart/models/ChartMetadata.ts @@ -36,7 +36,6 @@ export interface ChartMetadataConfig { description?: string; datasourceCount?: number; enableNoResults?: boolean; - show?: boolean; supportedAnnotationTypes?: string[]; thumbnail: string; useLegacyApi?: boolean; @@ -64,8 +63,6 @@ export default class ChartMetadata { description: string; - show: 
boolean; - supportedAnnotationTypes: string[]; thumbnail: string; @@ -100,7 +97,6 @@ export default class ChartMetadata { canBeAnnotationTypes = [], credits = [], description = '', - show = true, supportedAnnotationTypes = [], thumbnail, useLegacyApi = false, @@ -120,7 +116,6 @@ export default class ChartMetadata { this.name = name; this.credits = credits; this.description = description; - this.show = show; this.canBeAnnotationTypes = canBeAnnotationTypes; this.canBeAnnotationTypesLookup = canBeAnnotationTypes.reduce( (prev: LookupTable, type: string) => { diff --git a/superset-frontend/packages/superset-ui-core/src/chart/types/Base.ts b/superset-frontend/packages/superset-ui-core/src/chart/types/Base.ts index 1c4d278f6cc46..b3884a8488013 100644 --- a/superset-frontend/packages/superset-ui-core/src/chart/types/Base.ts +++ b/superset-frontend/packages/superset-ui-core/src/chart/types/Base.ts @@ -58,7 +58,6 @@ export enum AppSection { export type FilterState = { value?: any; [key: string]: any }; export type DataMask = { - __cache?: FilterState; extraFormData?: ExtraFormData; filterState?: FilterState; ownState?: JsonObject; diff --git a/superset-frontend/packages/superset-ui-core/src/color/colorSchemes/categorical/airbnb.ts b/superset-frontend/packages/superset-ui-core/src/color/colorSchemes/categorical/airbnb.ts index 462065b84f2b9..a126f502a9c3d 100644 --- a/superset-frontend/packages/superset-ui-core/src/color/colorSchemes/categorical/airbnb.ts +++ b/superset-frontend/packages/superset-ui-core/src/color/colorSchemes/categorical/airbnb.ts @@ -24,27 +24,19 @@ const schemes = [ id: 'bnbColors', label: 'Airbnb Colors', colors: [ - '#ff5a5f', // rausch - '#7b0051', // hackb - '#007A87', // kazan - '#00d1c1', // babu - '#8ce071', // lima - '#ffb400', // beach - '#b4a76c', // barol - '#ff8083', - '#cc0086', - '#00a1b3', - '#00ffeb', - '#bbedab', - '#ffd266', - '#cbc29a', - '#ff3339', - '#ff1ab1', - '#005c66', - '#00b3a5', - '#55d12e', - '#b37e00', - '#988b4e', + 
'#29696B', + '#5BCACE', + '#F4B02A', + '#F1826A', + '#792EB2', + '#C96EC6', + '#921E50', + '#B27700', + '#9C3498', + '#9C3498', + '#E4679D', + '#C32F0E', + '#9D63CA', ], }, ].map(s => new CategoricalScheme(s)); diff --git a/superset-frontend/packages/superset-ui-core/src/components/SafeMarkdown.tsx b/superset-frontend/packages/superset-ui-core/src/components/SafeMarkdown.tsx index b0826ce2eda54..2b36802d4b497 100644 --- a/superset-frontend/packages/superset-ui-core/src/components/SafeMarkdown.tsx +++ b/superset-frontend/packages/superset-ui-core/src/components/SafeMarkdown.tsx @@ -67,6 +67,7 @@ function SafeMarkdown({ rehypePlugins={rehypePlugins} remarkPlugins={[remarkGfm]} skipHtml={false} + transformLinkUri={null} > {source} </ReactMarkdown> diff --git a/superset-frontend/packages/superset-ui-core/src/query/types/Query.ts b/superset-frontend/packages/superset-ui-core/src/query/types/Query.ts index f42b01abdcb7f..1271be2fff89f 100644 --- a/superset-frontend/packages/superset-ui-core/src/query/types/Query.ts +++ b/superset-frontend/packages/superset-ui-core/src/query/types/Query.ts @@ -31,6 +31,7 @@ import { Maybe } from '../../types'; import { PostProcessingRule } from './PostProcessing'; import { JsonObject } from '../../connection'; import { TimeGranularity } from '../../time-format'; +import { GenericDataType } from './QueryResponse'; export type BaseQueryObjectFilterClause = { col: QueryFormColumn; @@ -250,6 +251,7 @@ export type QueryColumn = { name?: string; column_name: string; type: string | null; + type_generic: GenericDataType; is_dttm: boolean; }; @@ -383,16 +385,19 @@ export const testQuery: Query = { column_name: 'Column 1', type: 'STRING', is_dttm: false, + type_generic: GenericDataType.STRING, }, { column_name: 'Column 3', type: 'STRING', is_dttm: false, + type_generic: GenericDataType.STRING, }, { column_name: 'Column 2', type: 'TIMESTAMP', is_dttm: true, + type_generic: GenericDataType.TEMPORAL, }, ], }; @@ -404,16 +409,19 @@ export const 
testQueryResults = { { column_name: 'Column 1', type: 'STRING', + type_generic: GenericDataType.STRING, is_dttm: false, }, { column_name: 'Column 3', type: 'STRING', + type_generic: GenericDataType.STRING, is_dttm: false, }, { column_name: 'Column 2', type: 'TIMESTAMP', + type_generic: GenericDataType.TEMPORAL, is_dttm: true, }, ], @@ -425,16 +433,19 @@ export const testQueryResults = { { column_name: 'Column 1', type: 'STRING', + type_generic: GenericDataType.STRING, is_dttm: false, }, { column_name: 'Column 3', type: 'STRING', + type_generic: GenericDataType.STRING, is_dttm: false, }, { column_name: 'Column 2', type: 'TIMESTAMP', + type_generic: GenericDataType.TEMPORAL, is_dttm: true, }, ], diff --git a/superset-frontend/packages/superset-ui-core/src/ui-overrides/types.ts b/superset-frontend/packages/superset-ui-core/src/ui-overrides/types.ts index 0e7e0c9783944..27646442de3d9 100644 --- a/superset-frontend/packages/superset-ui-core/src/ui-overrides/types.ts +++ b/superset-frontend/packages/superset-ui-core/src/ui-overrides/types.ts @@ -127,6 +127,14 @@ export interface SQLResultTableExtentionProps { expandedColumns?: string[]; } +/** + * Interface for extensions to Slice Header + */ +export interface SliceHeaderExtension { + sliceId: number; + dashboardId: number; +} + export type Extensions = Partial<{ 'alertsreports.header.icon': React.ComponentType; 'embedded.documentation.configuration_details': React.ComponentType<ConfigDetailsProps>; @@ -147,4 +155,5 @@ export type Extensions = Partial<{ 'dataset.delete.related': React.ComponentType<DatasetDeleteRelatedExtensionProps>; 'sqleditor.extension.form': React.ComponentType<SQLFormExtensionProps>; 'sqleditor.extension.resultTable': React.ComponentType<SQLResultTableExtentionProps>; + 'dashboard.slice.header': React.ComponentType<SliceHeaderExtension>; }>; diff --git a/superset-frontend/packages/superset-ui-core/src/utils/html.test.tsx b/superset-frontend/packages/superset-ui-core/src/utils/html.test.tsx index 
8fd06cb6f8e7a..9b950e4246e92 100644 --- a/superset-frontend/packages/superset-ui-core/src/utils/html.test.tsx +++ b/superset-frontend/packages/superset-ui-core/src/utils/html.test.tsx @@ -44,6 +44,9 @@ describe('isProbablyHTML', () => { const plainText = 'Just a plain text'; const isHTML = isProbablyHTML(plainText); expect(isHTML).toBe(false); + + const trickyText = 'a <= 10 and b > 10'; + expect(isProbablyHTML(trickyText)).toBe(false); }); }); diff --git a/superset-frontend/packages/superset-ui-core/src/utils/html.tsx b/superset-frontend/packages/superset-ui-core/src/utils/html.tsx index 3215eb9b9de5b..fffd43bda8f6e 100644 --- a/superset-frontend/packages/superset-ui-core/src/utils/html.tsx +++ b/superset-frontend/packages/superset-ui-core/src/utils/html.tsx @@ -28,7 +28,9 @@ export function sanitizeHtml(htmlString: string) { } export function isProbablyHTML(text: string) { - return /<[^>]+>/.test(text); + return Array.from( + new DOMParser().parseFromString(text, 'text/html').body.childNodes, + ).some(({ nodeType }) => nodeType === 1); } export function sanitizeHtmlIfNeeded(htmlString: string) { diff --git a/superset-frontend/packages/superset-ui-core/src/validator/index.ts b/superset-frontend/packages/superset-ui-core/src/validator/index.ts index 532efcc959116..6294bddec7ca9 100644 --- a/superset-frontend/packages/superset-ui-core/src/validator/index.ts +++ b/superset-frontend/packages/superset-ui-core/src/validator/index.ts @@ -22,3 +22,5 @@ export { default as legacyValidateNumber } from './legacyValidateNumber'; export { default as validateInteger } from './validateInteger'; export { default as validateNumber } from './validateNumber'; export { default as validateNonEmpty } from './validateNonEmpty'; +export { default as validateMaxValue } from './validateMaxValue'; +export { default as validateMapboxStylesUrl } from './validateMapboxStylesUrl'; diff --git a/superset-frontend/packages/superset-ui-core/src/validator/validateMapboxStylesUrl.ts 
b/superset-frontend/packages/superset-ui-core/src/validator/validateMapboxStylesUrl.ts new file mode 100644 index 0000000000000..bfbbaa7168d38 --- /dev/null +++ b/superset-frontend/packages/superset-ui-core/src/validator/validateMapboxStylesUrl.ts @@ -0,0 +1,36 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ + +import { t } from '../translation'; + +/** + * Validate a [Mapbox styles URL](https://docs.mapbox.com/help/glossary/style-url/) + * @param v + */ +export default function validateMapboxStylesUrl(v: unknown) { + if ( + typeof v === 'string' && + v.trim().length > 0 && + v.trim().startsWith('mapbox://styles/') + ) { + return false; + } + + return t('is expected to be a Mapbox URL'); +} diff --git a/superset-frontend/packages/superset-ui-core/src/validator/validateMaxValue.ts b/superset-frontend/packages/superset-ui-core/src/validator/validateMaxValue.ts new file mode 100644 index 0000000000000..24c1da1c79dde --- /dev/null +++ b/superset-frontend/packages/superset-ui-core/src/validator/validateMaxValue.ts @@ -0,0 +1,8 @@ +import { t } from '../translation'; + +export default function validateMaxValue(v: unknown, max: Number) { + if (Number(v) > +max) { + return t('Value cannot exceed %s', max); + } + return false; +} diff --git a/superset-frontend/packages/superset-ui-core/test/validator/validateMapboxStylesUrl.test.ts b/superset-frontend/packages/superset-ui-core/test/validator/validateMapboxStylesUrl.test.ts new file mode 100644 index 0000000000000..dbd5822666eb0 --- /dev/null +++ b/superset-frontend/packages/superset-ui-core/test/validator/validateMapboxStylesUrl.test.ts @@ -0,0 +1,47 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. 
You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +import { validateMapboxStylesUrl } from '@superset-ui/core'; +import './setup'; + +describe('validateMapboxStylesUrl', () => { + it('should validate mapbox style URLs', () => { + expect( + validateMapboxStylesUrl('mapbox://styles/mapbox/streets-v9'), + ).toEqual(false); + expect( + validateMapboxStylesUrl( + 'mapbox://styles/foobar/clp2dr5r4008a01pcg4ad45m8', + ), + ).toEqual(false); + }); + + [ + 123, + ['mapbox://styles/mapbox/streets-v9'], + { url: 'mapbox://styles/mapbox/streets-v9' }, + 'https://superset.apache.org/', + 'mapbox://tileset/mapbox/streets-v9', + ].forEach(value => { + it(`should not validate ${value}`, () => { + expect(validateMapboxStylesUrl(value)).toEqual( + 'is expected to be a Mapbox URL', + ); + }); + }); +}); diff --git a/superset-frontend/packages/superset-ui-core/test/validator/validateMaxValue.test.ts b/superset-frontend/packages/superset-ui-core/test/validator/validateMaxValue.test.ts new file mode 100644 index 0000000000000..6a8ed1642e7b9 --- /dev/null +++ b/superset-frontend/packages/superset-ui-core/test/validator/validateMaxValue.test.ts @@ -0,0 +1,37 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +import { validateMaxValue } from '@superset-ui/core'; +import './setup'; + +test('validateMaxValue returns the warning message if invalid', () => { + expect(validateMaxValue(10.1, 10)).toBeTruthy(); + expect(validateMaxValue(1, 0)).toBeTruthy(); + expect(validateMaxValue('2', 1)).toBeTruthy(); +}); + +test('validateMaxValue returns false if the input is valid', () => { + expect(validateMaxValue(0, 1)).toBeFalsy(); + expect(validateMaxValue(10, 10)).toBeFalsy(); + expect(validateMaxValue(undefined, 1)).toBeFalsy(); + expect(validateMaxValue(NaN, NaN)).toBeFalsy(); + expect(validateMaxValue(null, 1)).toBeFalsy(); + expect(validateMaxValue('1', 1)).toBeFalsy(); + expect(validateMaxValue('a', 1)).toBeFalsy(); +}); diff --git a/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/legacy-plugin-chart-map-box/Stories.tsx b/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/legacy-plugin-chart-map-box/Stories.tsx index 6cdca623a1c82..dd95ffada5b04 100644 --- a/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/legacy-plugin-chart-map-box/Stories.tsx +++ b/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/legacy-plugin-chart-map-box/Stories.tsx @@ -42,7 +42,7 @@ export const Basic = () => { allColumnsY: 'LAT', clusteringRadius: '60', globalOpacity: 1, - mapboxColor: 'rgb(0, 122, 135)', + mapboxColor: 'rgb(244, 176, 42)', mapboxLabel: [], mapboxStyle: 'mapbox://styles/mapbox/light-v9', pandasAggfunc: 'sum', diff --git 
a/superset-frontend/plugins/legacy-plugin-chart-map-box/src/controlPanel.ts b/superset-frontend/plugins/legacy-plugin-chart-map-box/src/controlPanel.ts index 1dc75d96ef444..e0b65246097b5 100644 --- a/superset-frontend/plugins/legacy-plugin-chart-map-box/src/controlPanel.ts +++ b/superset-frontend/plugins/legacy-plugin-chart-map-box/src/controlPanel.ts @@ -16,7 +16,12 @@ * specific language governing permissions and limitations * under the License. */ -import { FeatureFlag, isFeatureEnabled, t } from '@superset-ui/core'; +import { + FeatureFlag, + isFeatureEnabled, + t, + validateMapboxStylesUrl, +} from '@superset-ui/core'; import { columnChoices, ControlPanelConfig, @@ -224,6 +229,8 @@ const config: ControlPanelConfig = { label: t('Map Style'), clearable: false, renderTrigger: true, + freeForm: true, + validators: [validateMapboxStylesUrl], choices: [ ['mapbox://styles/mapbox/streets-v9', t('Streets')], ['mapbox://styles/mapbox/dark-v9', t('Dark')], @@ -236,7 +243,10 @@ const config: ControlPanelConfig = { ['mapbox://styles/mapbox/outdoors-v9', t('Outdoors')], ], default: 'mapbox://styles/mapbox/light-v9', - description: t('Base layer map style'), + description: t( + 'Base layer map style. 
See Mapbox documentation: %s', + 'https://docs.mapbox.com/help/glossary/style-url/', + ), }, }, ], diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/Multi/controlPanel.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/Multi/controlPanel.ts index 8571fe23d0e2a..8f4df671c35ff 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/Multi/controlPanel.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/Multi/controlPanel.ts @@ -27,7 +27,8 @@ export default { label: t('Map'), expanded: true, controlSetRows: [ - [mapboxStyle, viewport], + [mapboxStyle], + [viewport], [ { name: 'deck_slices', diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/Multi/images/example.png b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/Multi/images/example.png new file mode 100644 index 0000000000000..df5f1de7a786f Binary files /dev/null and b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/Multi/images/example.png differ diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/Multi/index.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/Multi/index.ts index 0535e96010a39..42ce06b1c75e1 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/Multi/index.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/Multi/index.ts @@ -18,6 +18,7 @@ */ import { t, ChartMetadata, ChartPlugin } from '@superset-ui/core'; import thumbnail from './images/thumbnail.png'; +import example from './images/example.png'; import transformProps from '../transformProps'; import controlPanel from './controlPanel'; @@ -25,6 +26,7 @@ const metadata = new ChartMetadata({ category: t('Map'), credits: ['https://uber.github.io/deck.gl'], description: t('Compose multiple layers together to form complex visuals.'), + exampleGallery: [{ url: example }], name: t('deck.gl Multiple Layers'), thumbnail, useLegacyApi: true, diff --git 
a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/controlPanel.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/controlPanel.ts index 3794ef38daef4..664f389a0bd2a 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/controlPanel.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/controlPanel.ts @@ -76,10 +76,7 @@ const config: ControlPanelConfig = { }, { label: t('Map'), - controlSetRows: [ - [mapboxStyle, viewport], - [autozoom, null], - ], + controlSetRows: [[mapboxStyle], [autozoom, viewport]], }, { label: t('Arc'), diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/images/example.png b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/images/example.png new file mode 100644 index 0000000000000..b031f430dbba8 Binary files /dev/null and b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/images/example.png differ diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/index.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/index.ts index fa7e4155a2f6e..350877432e1e9 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/index.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/index.ts @@ -18,6 +18,7 @@ */ import { t, ChartMetadata, ChartPlugin } from '@superset-ui/core'; import thumbnail from './images/thumbnail.png'; +import example from './images/example.png'; import transformProps from '../../transformProps'; import controlPanel from './controlPanel'; @@ -29,6 +30,7 @@ const metadata = new ChartMetadata({ ), name: t('deck.gl Arc'), thumbnail, + exampleGallery: [{ url: example }], useLegacyApi: true, tags: [t('deckGL'), t('Geo'), t('3D'), t('Relational'), t('Web')], }); diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Contour/controlPanel.ts 
b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Contour/controlPanel.ts index 238029aada4b6..407cab3162125 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Contour/controlPanel.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Contour/controlPanel.ts @@ -52,8 +52,8 @@ const config: ControlPanelConfig = { label: t('Map'), expanded: true, controlSetRows: [ - [mapboxStyle, viewport], - [autozoom], + [mapboxStyle], + [autozoom, viewport], [ { name: 'cellSize', diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Contour/images/example.png b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Contour/images/example.png new file mode 100644 index 0000000000000..20572b77e437f Binary files /dev/null and b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Contour/images/example.png differ diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Contour/index.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Contour/index.ts index 01f14467c72ea..bc0bd8c1f6ccc 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Contour/index.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Contour/index.ts @@ -20,6 +20,7 @@ import { t, ChartMetadata, ChartPlugin } from '@superset-ui/core'; import transformProps from '../../transformProps'; import controlPanel from './controlPanel'; import thumbnail from './images/thumbnail.png'; +import example from './images/example.png'; const metadata = new ChartMetadata({ category: t('Map'), @@ -27,7 +28,8 @@ const metadata = new ChartMetadata({ description: t( 'Uses Gaussian Kernel Density Estimation to visualize spatial distribution of data', ), - name: t('deck.gl Countour'), + exampleGallery: [{ url: example }], + name: t('deck.gl Contour'), thumbnail, useLegacyApi: true, tags: [t('deckGL'), t('Spatial'), t('Comparison'), t('Experimental')], diff 
--git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Geojson/images/example.png b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Geojson/images/example.png new file mode 100644 index 0000000000000..f35389d36da18 Binary files /dev/null and b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Geojson/images/example.png differ diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Geojson/index.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Geojson/index.ts index c3afedbc858a4..23708a2d24a61 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Geojson/index.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Geojson/index.ts @@ -18,6 +18,7 @@ */ import { t, ChartMetadata, ChartPlugin } from '@superset-ui/core'; import thumbnail from './images/thumbnail.png'; +import example from './images/example.png'; import transformProps from '../../transformProps'; import controlPanel from './controlPanel'; @@ -27,6 +28,7 @@ const metadata = new ChartMetadata({ description: t( 'The GeoJsonLayer takes in GeoJSON formatted data and renders it as interactive polygons, lines and points (circles, icons and/or texts).', ), + exampleGallery: [{ url: example }], name: t('deck.gl Geojson'), thumbnail, useLegacyApi: true, diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Grid/controlPanel.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Grid/controlPanel.ts index 9b8e33d739816..fa9a03a8f38da 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Grid/controlPanel.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Grid/controlPanel.ts @@ -53,7 +53,8 @@ const config: ControlPanelConfig = { { label: t('Map'), controlSetRows: [ - [mapboxStyle, viewport], + [mapboxStyle], + [viewport], ['color_scheme'], [autozoom], [gridSize], diff --git 
a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Grid/images/example.png b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Grid/images/example.png new file mode 100644 index 0000000000000..1d1407809a0af Binary files /dev/null and b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Grid/images/example.png differ diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Grid/index.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Grid/index.ts index 7bc1bec26577b..b9d45ddaff91b 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Grid/index.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Grid/index.ts @@ -18,6 +18,7 @@ */ import { t, ChartMetadata, ChartPlugin } from '@superset-ui/core'; import thumbnail from './images/thumbnail.png'; +import example from './images/example.png'; import transformProps from '../../transformProps'; import controlPanel from './controlPanel'; @@ -29,6 +30,7 @@ const metadata = new ChartMetadata({ ), name: t('deck.gl Grid'), thumbnail, + exampleGallery: [{ url: example }], useLegacyApi: true, tags: [t('deckGL'), t('3D'), t('Comparison'), t('Experimental')], }); diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/controlPanel.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/controlPanel.ts index 6fa41c2e21ab1..fd343eed16acb 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/controlPanel.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/controlPanel.ts @@ -99,7 +99,8 @@ const config: ControlPanelConfig = { { label: t('Map'), controlSetRows: [ - [mapboxStyle, viewport], + [mapboxStyle], + [viewport], ['linear_color_scheme'], [autozoom], [ diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/images/example.png 
b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/images/example.png new file mode 100644 index 0000000000000..ebd1c8ae489da Binary files /dev/null and b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/images/example.png differ diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/index.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/index.ts index 00d1c99af06b0..adf4ed393e4d1 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/index.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/index.ts @@ -20,6 +20,7 @@ import { t, ChartMetadata, ChartPlugin } from '@superset-ui/core'; import transformProps from '../../transformProps'; import controlPanel from './controlPanel'; import thumbnail from './images/thumbnail.png'; +import example from './images/example.png'; const metadata = new ChartMetadata({ category: t('Map'), @@ -27,6 +28,7 @@ const metadata = new ChartMetadata({ description: t( 'Uses Gaussian Kernel Density Estimation to visualize spatial distribution of data', ), + exampleGallery: [{ url: example }], name: t('deck.gl Heatmap'), thumbnail, useLegacyApi: true, diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Hex/controlPanel.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Hex/controlPanel.ts index 2f9293c521826..8865ed0052ab8 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Hex/controlPanel.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Hex/controlPanel.ts @@ -53,8 +53,8 @@ const config: ControlPanelConfig = { { label: t('Map'), controlSetRows: [ - [mapboxStyle, viewport], - ['color_scheme'], + [mapboxStyle], + ['color_scheme', viewport], [autozoom], [gridSize], [extruded], diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Hex/images/example.png 
b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Hex/images/example.png new file mode 100644 index 0000000000000..378a442665e5a Binary files /dev/null and b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Hex/images/example.png differ diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Hex/index.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Hex/index.ts index 7d714f82c4f38..1b67fbb657a3d 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Hex/index.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Hex/index.ts @@ -18,6 +18,7 @@ */ import { t, ChartMetadata, ChartPlugin } from '@superset-ui/core'; import thumbnail from './images/thumbnail.png'; +import example from './images/example.png'; import transformProps from '../../transformProps'; import controlPanel from './controlPanel'; @@ -27,6 +28,7 @@ const metadata = new ChartMetadata({ description: t( 'Overlays a hexagonal grid on a map, and aggregates data within the boundary of each cell.', ), + exampleGallery: [{ url: example }], name: t('deck.gl 3D Hexagon'), thumbnail, useLegacyApi: true, diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/controlPanel.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/controlPanel.ts index 80691efa6b737..b0403b3596fb3 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/controlPanel.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/controlPanel.ts @@ -67,7 +67,8 @@ const config: ControlPanelConfig = { label: t('Map'), expanded: true, controlSetRows: [ - [mapboxStyle, viewport], + [mapboxStyle], + [viewport], ['color_picker'], [lineWidth], [ diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/images/example.png b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/images/example.png new file mode 
100644 index 0000000000000..a259597742e58 Binary files /dev/null and b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/images/example.png differ diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/index.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/index.ts index a62be08da7dd8..23919c8446484 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/index.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/index.ts @@ -18,6 +18,7 @@ */ import { t, ChartMetadata, ChartPlugin } from '@superset-ui/core'; import thumbnail from './images/thumbnail.png'; +import example from './images/example.png'; import transformProps from '../../transformProps'; import controlPanel from './controlPanel'; @@ -27,6 +28,7 @@ const metadata = new ChartMetadata({ description: t('Visualizes connected points, which form a path, on a map.'), name: t('deck.gl Path'), thumbnail, + exampleGallery: [{ url: example }], useLegacyApi: true, tags: [t('deckGL'), t('Web')], }); diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Polygon/images/example.png b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Polygon/images/example.png new file mode 100644 index 0000000000000..18feb67819118 Binary files /dev/null and b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Polygon/images/example.png differ diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Polygon/index.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Polygon/index.ts index 325eeac401415..d6852144938d1 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Polygon/index.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Polygon/index.ts @@ -18,6 +18,7 @@ */ import { t, ChartMetadata, ChartPlugin } from '@superset-ui/core'; import thumbnail from './images/thumbnail.png'; +import 
example from './images/example.png'; import transformProps from '../../transformProps'; import controlPanel from './controlPanel'; @@ -29,6 +30,7 @@ const metadata = new ChartMetadata({ ), name: t('deck.gl Polygon'), thumbnail, + exampleGallery: [{ url: example }], useLegacyApi: true, tags: [ t('deckGL'), diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/controlPanel.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/controlPanel.ts index ef3d45a95685f..9afeb1b415dd6 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/controlPanel.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/controlPanel.ts @@ -62,10 +62,7 @@ const config: ControlPanelConfig = { { label: t('Map'), expanded: true, - controlSetRows: [ - [mapboxStyle, viewport], - [autozoom, null], - ], + controlSetRows: [[mapboxStyle], [autozoom, viewport]], }, { label: t('Point Size'), diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/images/example.png b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/images/example.png new file mode 100644 index 0000000000000..fc0ec766319a7 Binary files /dev/null and b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/images/example.png differ diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/index.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/index.ts index 4e93e81d69949..7e4d28e0a9eb2 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/index.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/index.ts @@ -18,6 +18,7 @@ */ import { t, ChartMetadata, ChartPlugin } from '@superset-ui/core'; import thumbnail from './images/thumbnail.png'; +import example from './images/example.png'; import transformProps from '../../transformProps'; import 
controlPanel from './controlPanel'; @@ -29,6 +30,7 @@ const metadata = new ChartMetadata({ ), name: t('deck.gl Scatterplot'), thumbnail, + exampleGallery: [{ url: example }], useLegacyApi: true, tags: [ t('deckGL'), diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/controlPanel.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/controlPanel.ts index caf052581cbb5..82aeda1745f93 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/controlPanel.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/controlPanel.ts @@ -52,10 +52,7 @@ const config: ControlPanelConfig = { }, { label: t('Map'), - controlSetRows: [ - [mapboxStyle, viewport], - [autozoom, null], - ], + controlSetRows: [[mapboxStyle], [autozoom, viewport]], }, { label: t('Grid'), diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/images/example.png b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/images/example.png new file mode 100644 index 0000000000000..92679725e32f6 Binary files /dev/null and b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/images/example.png differ diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/index.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/index.ts index fc285ba104e93..a106d3d8a955a 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/index.ts +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/index.ts @@ -18,6 +18,7 @@ */ import { t, ChartMetadata, ChartPlugin } from '@superset-ui/core'; import thumbnail from './images/thumbnail.png'; +import example from './images/example.png'; import transformProps from '../../transformProps'; import controlPanel from './controlPanel'; @@ -29,6 +30,7 @@ const metadata = new 
ChartMetadata({ ), name: t('deck.gl Screen Grid'), thumbnail, + exampleGallery: [{ url: example }], useLegacyApi: true, tags: [ t('deckGL'), diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/utilities/Shared_DeckGL.jsx b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/utilities/Shared_DeckGL.jsx index 9a123e91c371b..5b307efd906c8 100644 --- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/utilities/Shared_DeckGL.jsx +++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/utilities/Shared_DeckGL.jsx @@ -25,6 +25,7 @@ import { isFeatureEnabled, t, validateNonEmpty, + validateMapboxStylesUrl, } from '@superset-ui/core'; import { D3_FORMAT_OPTIONS, sharedControls } from '@superset-ui/chart-controls'; import { columnChoices, PRIMARY_COLOR } from './controls'; @@ -370,6 +371,8 @@ export const mapboxStyle = { label: t('Map Style'), clearable: false, renderTrigger: true, + freeForm: true, + validators: [validateMapboxStylesUrl], choices: [ ['mapbox://styles/mapbox/streets-v9', t('Streets')], ['mapbox://styles/mapbox/dark-v9', t('Dark')], @@ -379,7 +382,10 @@ export const mapboxStyle = { ['mapbox://styles/mapbox/outdoors-v9', t('Outdoors')], ], default: 'mapbox://styles/mapbox/light-v9', - description: t('Base layer map style'), + description: t( + 'Base layer map style. See Mapbox documentation: %s', + 'https://docs.mapbox.com/help/glossary/style-url/', + ), }, }; diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Bubble/constants.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Bubble/constants.ts index 0f9bc0f3054b4..1c70e872e6fae 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Bubble/constants.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Bubble/constants.ts @@ -17,6 +17,7 @@ * under the License. 
*/ import { DEFAULT_LEGEND_FORM_DATA } from '../constants'; +import { defaultXAxis } from '../defaults'; import { EchartsBubbleFormData } from './types'; export const DEFAULT_FORM_DATA: Partial<EchartsBubbleFormData> = { @@ -26,9 +27,11 @@ export const DEFAULT_FORM_DATA: Partial<EchartsBubbleFormData> = { logYAxis: false, xAxisTitleMargin: 30, yAxisTitleMargin: 30, + truncateXAxis: false, truncateYAxis: false, + xAxisBounds: [null, null], yAxisBounds: [null, null], - xAxisLabelRotation: 0, + xAxisLabelRotation: defaultXAxis.xAxisLabelRotation, opacity: 0.6, }; diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Bubble/controlPanel.tsx b/superset-frontend/plugins/plugin-chart-echarts/src/Bubble/controlPanel.tsx index 53fba5de2b32e..521ae98130dc6 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Bubble/controlPanel.tsx +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Bubble/controlPanel.tsx @@ -26,10 +26,15 @@ import { } from '@superset-ui/chart-controls'; import { DEFAULT_FORM_DATA } from './constants'; -import { legendSection } from '../controls'; +import { + legendSection, + truncateXAxis, + xAxisBounds, + xAxisLabelRotation, +} from '../controls'; +import { defaultYAxis } from '../defaults'; -const { logAxis, truncateYAxis, yAxisBounds, xAxisLabelRotation, opacity } = - DEFAULT_FORM_DATA; +const { logAxis, truncateYAxis, yAxisBounds, opacity } = DEFAULT_FORM_DATA; const config: ControlPanelConfig = { controlPanelSections: [ @@ -127,26 +132,7 @@ const config: ControlPanelConfig = { }, }, ], - [ - { - name: 'xAxisLabelRotation', - config: { - type: 'SelectControl', - freeForm: true, - clearable: false, - label: t('Rotate x axis label'), - choices: [ - [0, '0°'], - [45, '45°'], - ], - default: xAxisLabelRotation, - renderTrigger: true, - description: t( - 'Input field supports custom rotation. e.g. 
30 for 30°', - ), - }, - }, - ], + [xAxisLabelRotation], [ { name: 'x_axis_title_margin', @@ -211,7 +197,7 @@ const config: ControlPanelConfig = { [0, '0°'], [45, '45°'], ], - default: xAxisLabelRotation, + default: defaultYAxis.yAxisLabelRotation, renderTrigger: true, description: t( 'Input field supports custom rotation. e.g. 30 for 30°', @@ -246,6 +232,8 @@ const config: ControlPanelConfig = { }, }, ], + [truncateXAxis], + [xAxisBounds], [ { name: 'truncateYAxis', diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Bubble/transformProps.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Bubble/transformProps.ts index 7962bc2c3677a..754b26003bb67 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Bubble/transformProps.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Bubble/transformProps.ts @@ -28,9 +28,9 @@ import { import { EchartsBubbleChartProps, EchartsBubbleFormData } from './types'; import { DEFAULT_FORM_DATA, MINIMUM_BUBBLE_SIZE } from './constants'; import { defaultGrid } from '../defaults'; -import { getLegendProps } from '../utils/series'; +import { getLegendProps, getMinAndMaxFromBounds } from '../utils/series'; import { Refs } from '../types'; -import { parseYAxisBound } from '../utils/controls'; +import { parseAxisBound } from '../utils/controls'; import { getDefaultTooltip } from '../utils/tooltip'; import { getPadding } from '../Timeseries/transformers'; import { convertInteger } from '../utils/convertInteger'; @@ -84,6 +84,7 @@ export default function transformProps(chartProps: EchartsBubbleChartProps) { series: bubbleSeries, xAxisLabel: bubbleXAxisTitle, yAxisLabel: bubbleYAxisTitle, + xAxisBounds, xAxisFormat, yAxisFormat, yAxisBounds, @@ -91,6 +92,7 @@ export default function transformProps(chartProps: EchartsBubbleChartProps) { logYAxis, xAxisTitleMargin, yAxisTitleMargin, + truncateXAxis, truncateYAxis, xAxisLabelRotation, yAxisLabelRotation, @@ -104,7 +106,7 @@ export default function 
transformProps(chartProps: EchartsBubbleChartProps) { const colorFn = CategoricalColorNamespace.getScale(colorScheme as string); - const legends: string[] = []; + const legends = new Set<string>(); const series: ScatterSeriesOption[] = []; const xAxisLabel: string = getMetricLabel(x); @@ -114,9 +116,8 @@ export default function transformProps(chartProps: EchartsBubbleChartProps) { const refs: Refs = {}; data.forEach(datum => { - const name = - ((bubbleSeries ? datum[bubbleSeries] : datum[entity]) as string) || - NULL_STRING; + const dataName = bubbleSeries ? datum[bubbleSeries] : datum[entity]; + const name = dataName ? String(dataName) : NULL_STRING; const bubbleSeriesValue = bubbleSeries ? datum[bubbleSeries] : null; series.push({ @@ -133,7 +134,7 @@ export default function transformProps(chartProps: EchartsBubbleChartProps) { type: 'scatter', itemStyle: { color: colorFn(name), opacity }, }); - legends.push(name); + legends.add(name); }); normalizeSymbolSize(series, maxBubbleSize); @@ -142,7 +143,8 @@ export default function transformProps(chartProps: EchartsBubbleChartProps) { const yAxisFormatter = getNumberFormatter(yAxisFormat); const tooltipSizeFormatter = getNumberFormatter(tooltipSizeFormat); - const [min, max] = yAxisBounds.map(parseYAxisBound); + const [xAxisMin, xAxisMax] = (xAxisBounds || []).map(parseAxisBound); + const [yAxisMin, yAxisMax] = (yAxisBounds || []).map(parseAxisBound); const padding = getPadding( showLegend, @@ -156,6 +158,7 @@ export default function transformProps(chartProps: EchartsBubbleChartProps) { convertInteger(xAxisTitleMargin), ); + const xAxisType = logXAxis ? AxisType.log : AxisType.value; const echartOptions: EChartsCoreOption = { series, xAxis: { @@ -173,7 +176,8 @@ export default function transformProps(chartProps: EchartsBubbleChartProps) { fontWight: 'bolder', }, nameGap: convertInteger(xAxisTitleMargin), - type: logXAxis ? 
AxisType.log : AxisType.value, + type: xAxisType, + ...getMinAndMaxFromBounds(xAxisType, truncateXAxis, xAxisMin, xAxisMax), }, yAxis: { axisLabel: { formatter: yAxisFormatter }, @@ -190,13 +194,13 @@ export default function transformProps(chartProps: EchartsBubbleChartProps) { fontWight: 'bolder', }, nameGap: convertInteger(yAxisTitleMargin), - min, - max, + min: yAxisMin, + max: yAxisMax, type: logYAxis ? AxisType.log : AxisType.value, }, legend: { ...getLegendProps(legendType, legendOrientation, showLegend, theme), - data: legends, + data: Array.from(legends), }, tooltip: { show: !inContextMenu, diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/controlPanel.tsx b/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/controlPanel.tsx index c9f9027a3efb0..d3fb1fc53e10a 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/controlPanel.tsx +++ b/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/controlPanel.tsx @@ -32,7 +32,14 @@ import { import { DEFAULT_FORM_DATA } from './types'; import { EchartsTimeseriesSeriesType } from '../Timeseries/types'; -import { legendSection, richTooltipSection } from '../controls'; +import { + legendSection, + minorTicks, + richTooltipSection, + truncateXAxis, + xAxisBounds, + xAxisLabelRotation, +} from '../controls'; const { area, @@ -49,7 +56,6 @@ const { truncateYAxis, yAxisBounds, zoomable, - xAxisLabelRotation, yAxisIndex, } = DEFAULT_FORM_DATA; @@ -311,29 +317,11 @@ const config: ControlPanelConfig = { }, }, ], + [minorTicks], ...legendSection, [<ControlSubSectionHeader>{t('X Axis')}</ControlSubSectionHeader>], ['x_axis_time_format'], - [ - { - name: 'xAxisLabelRotation', - config: { - type: 'SelectControl', - freeForm: true, - clearable: false, - label: t('Rotate x axis label'), - choices: [ - [0, '0°'], - [45, '45°'], - ], - default: xAxisLabelRotation, - renderTrigger: true, - description: t( - 'Input field supports custom rotation. e.g. 
30 for 30°', - ), - }, - }, - ], + [xAxisLabelRotation], ...richTooltipSection, // eslint-disable-next-line react/jsx-key [<ControlSubSectionHeader>{t('Y Axis')}</ControlSubSectionHeader>], @@ -349,6 +337,8 @@ const config: ControlPanelConfig = { }, }, ], + [truncateXAxis], + [xAxisBounds], [ { name: 'truncateYAxis', diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/transformProps.ts b/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/transformProps.ts index 47411e2477e89..1d4eceb33f3b6 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/transformProps.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/transformProps.ts @@ -20,32 +20,32 @@ import { invert } from 'lodash'; import { AnnotationLayer, + buildCustomFormatters, CategoricalColorNamespace, + CurrencyFormatter, + ensureIsArray, GenericDataType, + getCustomFormatter, getNumberFormatter, + getXAxisLabel, + isDefined, isEventAnnotationLayer, isFormulaAnnotationLayer, isIntervalAnnotationLayer, + isPhysicalColumn, isTimeseriesAnnotationLayer, QueryFormData, + QueryFormMetric, TimeseriesChartDataResponseResult, TimeseriesDataRecord, - getXAxisLabel, - isPhysicalColumn, - isDefined, - ensureIsArray, - buildCustomFormatters, ValueFormatter, - QueryFormMetric, - getCustomFormatter, - CurrencyFormatter, } from '@superset-ui/core'; import { getOriginalSeries } from '@superset-ui/chart-controls'; import { EChartsCoreOption, SeriesOption } from 'echarts'; import { DEFAULT_FORM_DATA, - EchartsMixedTimeseriesFormData, EchartsMixedTimeseriesChartTransformedProps, + EchartsMixedTimeseriesFormData, EchartsMixedTimeseriesProps, } from './types'; import { @@ -53,16 +53,17 @@ import { ForecastSeriesEnum, Refs, } from '../types'; -import { parseYAxisBound } from '../utils/controls'; +import { parseAxisBound } from '../utils/controls'; import { - getOverMaxHiddenFormatter, dedupSeries, + extractDataTotalValues, extractSeries, + 
extractShowValueIndexes, getAxisType, getColtypesMapping, getLegendProps, - extractDataTotalValues, - extractShowValueIndexes, + getMinAndMaxFromBounds, + getOverMaxHiddenFormatter, } from '../utils/series'; import { extractAnnotationLabels, @@ -84,7 +85,7 @@ import { transformSeries, transformTimeseriesAnnotation, } from '../Timeseries/transformers'; -import { TIMESERIES_CONSTANTS, TIMEGRAIN_TO_TIMESTAMP } from '../constants'; +import { TIMEGRAIN_TO_TIMESTAMP, TIMESERIES_CONSTANTS } from '../constants'; import { getDefaultTooltip } from '../utils/tooltip'; import { getTooltipTimeFormatter, @@ -159,6 +160,7 @@ export default function transformProps( opacity, opacityB, minorSplitLine, + minorTicks, seriesType, seriesTypeB, showLegend, @@ -166,6 +168,7 @@ export default function transformProps( showValueB, stack, stackB, + truncateXAxis, truncateYAxis, tooltipTimeFormat, yAxisFormat, @@ -181,10 +184,12 @@ export default function transformProps( zoomable, richTooltip, tooltipSortByMetric, + xAxisBounds, xAxisLabelRotation, groupby, groupbyB, xAxis: xAxisOrig, + xAxisForceCategorical, xAxisTitle, yAxisTitle, xAxisTitleMargin, @@ -223,7 +228,7 @@ export default function transformProps( const dataTypes = getColtypesMapping(queriesData[0]); const xAxisDataType = dataTypes?.[xAxisLabel] ?? dataTypes?.[xAxisOrig]; - const xAxisType = getAxisType(xAxisDataType); + const xAxisType = getAxisType(stack, xAxisForceCategorical, xAxisDataType); const series: SeriesOption[] = []; const formatter = contributionMode ? 
getNumberFormatter(',.0%') @@ -345,9 +350,10 @@ export default function transformProps( }); // yAxisBounds need to be parsed to replace incompatible values with undefined - let [min, max] = (yAxisBounds || []).map(parseYAxisBound); + const [xAxisMin, xAxisMax] = (xAxisBounds || []).map(parseAxisBound); + let [yAxisMin, yAxisMax] = (yAxisBounds || []).map(parseAxisBound); let [minSecondary, maxSecondary] = (yAxisBoundsSecondary || []).map( - parseYAxisBound, + parseAxisBound, ); const array = ensureIsArray(chartProps.rawFormData?.time_compare); @@ -386,7 +392,7 @@ export default function transformProps( formatter: seriesType === EchartsTimeseriesSeriesType.Bar ? getOverMaxHiddenFormatter({ - max, + max: yAxisMax, formatter: seriesFormatter, }) : seriesFormatter, @@ -447,8 +453,8 @@ export default function transformProps( // default to 0-100% range when doing row-level contribution chart if (contributionMode === 'row' && stack) { - if (min === undefined) min = 0; - if (max === undefined) max = 1; + if (yAxisMin === undefined) yAxisMin = 0; + if (yAxisMax === undefined) yAxisMax = 1; if (minSecondary === undefined) minSecondary = 0; if (maxSecondary === undefined) maxSecondary = 1; } @@ -495,18 +501,29 @@ export default function transformProps( formatter: xAxisFormatter, rotate: xAxisLabelRotation, }, + minorTick: { show: minorTicks }, minInterval: xAxisType === 'time' && timeGrainSqla ? TIMEGRAIN_TO_TIMESTAMP[timeGrainSqla] : 0, + ...getMinAndMaxFromBounds( + xAxisType, + truncateXAxis, + xAxisMin, + xAxisMax, + seriesType === EchartsTimeseriesSeriesType.Bar || + seriesTypeB === EchartsTimeseriesSeriesType.Bar + ? EchartsTimeseriesSeriesType.Bar + : undefined, + ), }, yAxis: [ { ...defaultYAxis, type: logAxis ? 
'log' : 'value', - min, - max, - minorTick: { show: true }, + min: yAxisMin, + max: yAxisMax, + minorTick: { show: minorTicks }, minorSplitLine: { show: minorSplitLine }, axisLabel: { formatter: getYAxisFormatter( @@ -527,7 +544,7 @@ export default function transformProps( type: logAxisSecondary ? 'log' : 'value', min: minSecondary, max: maxSecondary, - minorTick: { show: true }, + minorTick: { show: minorTicks }, splitLine: { show: false }, minorSplitLine: { show: minorSplitLine }, axisLabel: { diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/types.ts b/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/types.ts index 30969ae367ce9..e79523d176d02 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/types.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/types.ts @@ -45,6 +45,7 @@ export type EchartsMixedTimeseriesFormData = QueryFormData & { annotationLayers: AnnotationLayer[]; // shared properties minorSplitLine: boolean; + minorTicks: boolean; logAxis: boolean; logAxisSecondary: boolean; yAxisFormat?: string; @@ -104,6 +105,8 @@ export const DEFAULT_FORM_DATA: EchartsMixedTimeseriesFormData = { yAxisFormatSecondary: TIMESERIES_DEFAULTS.yAxisFormat, yAxisTitleSecondary: DEFAULT_TITLE_FORM_DATA.yAxisTitle, tooltipTimeFormat: TIMESERIES_DEFAULTS.tooltipTimeFormat, + xAxisBounds: TIMESERIES_DEFAULTS.xAxisBounds, + xAxisForceCategorical: TIMESERIES_DEFAULTS.xAxisForceCategorical, xAxisTimeFormat: TIMESERIES_DEFAULTS.xAxisTimeFormat, area: TIMESERIES_DEFAULTS.area, areaB: TIMESERIES_DEFAULTS.area, diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Area/controlPanel.tsx b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Area/controlPanel.tsx index 85151395480d2..2d8bf549aa3a4 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Area/controlPanel.tsx +++ 
b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Area/controlPanel.tsx @@ -37,6 +37,10 @@ import { richTooltipSection, seriesOrderSection, percentageThresholdControl, + xAxisLabelRotation, + truncateXAxis, + xAxisBounds, + minorTicks, } from '../../controls'; import { AreaChartStackControlOptions } from '../../constants'; @@ -51,7 +55,6 @@ const { truncateYAxis, yAxisBounds, zoomable, - xAxisLabelRotation, } = DEFAULT_FORM_DATA; const config: ControlPanelConfig = { controlPanelSections: [ @@ -167,6 +170,7 @@ const config: ControlPanelConfig = { }, }, ], + [minorTicks], [ { name: 'zoomable', @@ -191,26 +195,7 @@ const config: ControlPanelConfig = { }, }, ], - [ - { - name: 'xAxisLabelRotation', - config: { - type: 'SelectControl', - freeForm: true, - clearable: false, - label: t('Rotate x axis label'), - choices: [ - [0, '0°'], - [45, '45°'], - ], - default: xAxisLabelRotation, - renderTrigger: true, - description: t( - 'Input field supports custom rotation. e.g. 30 for 30°', - ), - }, - }, - ], + [xAxisLabelRotation], ...richTooltipSection, // eslint-disable-next-line react/jsx-key [<ControlSubSectionHeader>{t('Y Axis')}</ControlSubSectionHeader>], @@ -240,6 +225,8 @@ const config: ControlPanelConfig = { }, }, ], + [truncateXAxis], + [xAxisBounds], [ { name: 'truncateYAxis', diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/Bar/controlPanel.tsx b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/Bar/controlPanel.tsx index 47fe550ad7311..23cdae6a390d1 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/Bar/controlPanel.tsx +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/Bar/controlPanel.tsx @@ -32,9 +32,13 @@ import { } from '@superset-ui/chart-controls'; import { legendSection, + minorTicks, richTooltipSection, seriesOrderSection, showValueSection, + truncateXAxis, + xAxisBounds, + xAxisLabelRotation, } from '../../../controls'; import { 
OrientationType } from '../../types'; @@ -49,7 +53,6 @@ const { truncateYAxis, yAxisBounds, zoomable, - xAxisLabelRotation, orientation, } = DEFAULT_FORM_DATA; @@ -163,21 +166,9 @@ function createAxisControl(axis: 'x' | 'y'): ControlSetRow[] { ], [ { - name: 'xAxisLabelRotation', + name: xAxisLabelRotation.name, config: { - type: 'SelectControl', - freeForm: true, - clearable: false, - label: t('Rotate axis label'), - choices: [ - [0, '0°'], - [45, '45°'], - ], - default: xAxisLabelRotation, - renderTrigger: true, - description: t( - 'Input field supports custom rotation. e.g. 30 for 30°', - ), + ...xAxisLabelRotation.config, visibility: ({ controls }: ControlPanelsContainerProps) => isXAxis ? isVertical(controls) : isHorizontal(controls), }, @@ -223,6 +214,8 @@ function createAxisControl(axis: 'x' | 'y'): ControlSetRow[] { }, }, ], + [truncateXAxis], + [xAxisBounds], [ { name: 'truncateYAxis', @@ -307,6 +300,7 @@ const config: ControlPanelConfig = { ...seriesOrderSection, ['color_scheme'], ...showValueSection, + [minorTicks], [ { name: 'zoomable', diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/Line/controlPanel.tsx b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/Line/controlPanel.tsx index 637a5fbc57c79..1b2e7688ea4b9 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/Line/controlPanel.tsx +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/Line/controlPanel.tsx @@ -35,9 +35,13 @@ import { } from '../../constants'; import { legendSection, + minorTicks, richTooltipSection, seriesOrderSection, showValueSection, + truncateXAxis, + xAxisBounds, + xAxisLabelRotation, } from '../../../controls'; const { @@ -52,7 +56,6 @@ const { truncateYAxis, yAxisBounds, zoomable, - xAxisLabelRotation, } = DEFAULT_FORM_DATA; const config: ControlPanelConfig = { controlPanelSections: [ @@ -167,6 +170,7 @@ const config: ControlPanelConfig = { }, }, ], + [minorTicks], 
...legendSection, [<ControlSubSectionHeader>{t('X Axis')}</ControlSubSectionHeader>], [ @@ -179,26 +183,7 @@ const config: ControlPanelConfig = { }, }, ], - [ - { - name: 'xAxisLabelRotation', - config: { - type: 'SelectControl', - freeForm: true, - clearable: false, - label: t('Rotate x axis label'), - choices: [ - [0, '0°'], - [45, '45°'], - ], - default: xAxisLabelRotation, - renderTrigger: true, - description: t( - 'Input field supports custom rotation. e.g. 30 for 30°', - ), - }, - }, - ], + [xAxisLabelRotation], ...richTooltipSection, // eslint-disable-next-line react/jsx-key [<ControlSubSectionHeader>{t('Y Axis')}</ControlSubSectionHeader>], @@ -228,6 +213,8 @@ const config: ControlPanelConfig = { }, }, ], + [truncateXAxis], + [xAxisBounds], [ { name: 'truncateYAxis', diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/Scatter/controlPanel.tsx b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/Scatter/controlPanel.tsx index ffcee717928e9..334f4438c3141 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/Scatter/controlPanel.tsx +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/Scatter/controlPanel.tsx @@ -34,9 +34,13 @@ import { } from '../../constants'; import { legendSection, + minorTicks, richTooltipSection, seriesOrderSection, showValueSection, + truncateXAxis, + xAxisBounds, + xAxisLabelRotation, } from '../../../controls'; const { @@ -48,7 +52,6 @@ const { truncateYAxis, yAxisBounds, zoomable, - xAxisLabelRotation, } = DEFAULT_FORM_DATA; const config: ControlPanelConfig = { controlPanelSections: [ @@ -109,6 +112,7 @@ const config: ControlPanelConfig = { }, }, ], + [minorTicks], ...legendSection, [<ControlSubSectionHeader>{t('X Axis')}</ControlSubSectionHeader>], @@ -122,26 +126,7 @@ const config: ControlPanelConfig = { }, }, ], - [ - { - name: 'xAxisLabelRotation', - config: { - type: 'SelectControl', - freeForm: true, - clearable: false, - 
label: t('Rotate x axis label'), - choices: [ - [0, '0°'], - [45, '45°'], - ], - default: xAxisLabelRotation, - renderTrigger: true, - description: t( - 'Input field supports custom rotation. e.g. 30 for 30°', - ), - }, - }, - ], + [xAxisLabelRotation], // eslint-disable-next-line react/jsx-key ...richTooltipSection, // eslint-disable-next-line react/jsx-key @@ -172,6 +157,8 @@ const config: ControlPanelConfig = { }, }, ], + [truncateXAxis], + [xAxisBounds], [ { name: 'truncateYAxis', diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/SmoothLine/controlPanel.tsx b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/SmoothLine/controlPanel.tsx index cb7164e0ab5df..24ff0bfa8dc5f 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/SmoothLine/controlPanel.tsx +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Regular/SmoothLine/controlPanel.tsx @@ -34,9 +34,13 @@ import { } from '../../constants'; import { legendSection, + minorTicks, richTooltipSection, seriesOrderSection, showValueSectionWithoutStack, + truncateXAxis, + xAxisBounds, + xAxisLabelRotation, } from '../../../controls'; const { @@ -48,7 +52,6 @@ const { truncateYAxis, yAxisBounds, zoomable, - xAxisLabelRotation, } = DEFAULT_FORM_DATA; const config: ControlPanelConfig = { controlPanelSections: [ @@ -109,6 +112,7 @@ const config: ControlPanelConfig = { }, }, ], + [minorTicks], ...legendSection, [<ControlSubSectionHeader>{t('X Axis')}</ControlSubSectionHeader>], [ @@ -121,26 +125,7 @@ const config: ControlPanelConfig = { }, }, ], - [ - { - name: 'xAxisLabelRotation', - config: { - type: 'SelectControl', - freeForm: true, - clearable: false, - label: t('Rotate x axis label'), - choices: [ - [0, '0°'], - [45, '45°'], - ], - default: xAxisLabelRotation, - renderTrigger: true, - description: t( - 'Input field supports custom rotation. e.g. 
30 for 30°', - ), - }, - }, - ], + [xAxisLabelRotation], // eslint-disable-next-line react/jsx-key ...richTooltipSection, // eslint-disable-next-line react/jsx-key @@ -172,6 +157,8 @@ const config: ControlPanelConfig = { }, }, ], + [truncateXAxis], + [xAxisBounds], [ { name: 'truncateYAxis', diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Step/controlPanel.tsx b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Step/controlPanel.tsx index 1921e698c2d2e..36da829576864 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Step/controlPanel.tsx +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Step/controlPanel.tsx @@ -32,9 +32,13 @@ import { EchartsTimeseriesSeriesType } from '../../types'; import { DEFAULT_FORM_DATA, TIME_SERIES_DESCRIPTION_TEXT } from '../constants'; import { legendSection, + minorTicks, richTooltipSection, seriesOrderSection, showValueSection, + truncateXAxis, + xAxisBounds, + xAxisLabelRotation, } from '../../controls'; const { @@ -48,7 +52,6 @@ const { truncateYAxis, yAxisBounds, zoomable, - xAxisLabelRotation, } = DEFAULT_FORM_DATA; const config: ControlPanelConfig = { controlPanelSections: [ @@ -161,6 +164,7 @@ const config: ControlPanelConfig = { }, }, ], + [minorTicks], ...legendSection, [<ControlSubSectionHeader>{t('X Axis')}</ControlSubSectionHeader>], [ @@ -173,26 +177,7 @@ const config: ControlPanelConfig = { }, }, ], - [ - { - name: 'xAxisLabelRotation', - config: { - type: 'SelectControl', - freeForm: true, - clearable: false, - label: t('Rotate x axis label'), - choices: [ - [0, '0°'], - [45, '45°'], - ], - default: xAxisLabelRotation, - renderTrigger: true, - description: t( - 'Input field supports custom rotation. e.g. 
30 for 30°', - ), - }, - }, - ], + [xAxisLabelRotation], ...richTooltipSection, // eslint-disable-next-line react/jsx-key [<ControlSubSectionHeader>{t('Y Axis')}</ControlSubSectionHeader>], @@ -222,6 +207,8 @@ const config: ControlPanelConfig = { }, }, ], + [truncateXAxis], + [xAxisBounds], [ { name: 'truncateYAxis', diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/constants.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/constants.ts index 17629c0996b77..839bc607f77a5 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/constants.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/constants.ts @@ -30,6 +30,7 @@ import { DEFAULT_LEGEND_FORM_DATA, DEFAULT_TITLE_FORM_DATA, } from '../constants'; +import { defaultXAxis } from '../defaults'; // @ts-ignore export const DEFAULT_FORM_DATA: EchartsTimeseriesFormData = { @@ -57,11 +58,13 @@ export const DEFAULT_FORM_DATA: EchartsTimeseriesFormData = { seriesType: EchartsTimeseriesSeriesType.Line, stack: false, tooltipTimeFormat: 'smart_date', + truncateXAxis: true, truncateYAxis: false, yAxisBounds: [null, null], zoomable: false, richTooltip: true, - xAxisLabelRotation: 0, + xAxisForceCategorical: false, + xAxisLabelRotation: defaultXAxis.xAxisLabelRotation, groupby: [], showValue: false, onlyTotal: false, diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts index d44ae93580489..3bbe285aeca54 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts @@ -20,9 +20,13 @@ import { invert } from 'lodash'; import { AnnotationLayer, + AxisType, + buildCustomFormatters, CategoricalColorNamespace, + CurrencyFormatter, ensureIsArray, GenericDataType, + getCustomFormatter, getMetricLabel, getNumberFormatter, 
getXAxisLabel, @@ -34,9 +38,6 @@ import { isTimeseriesAnnotationLayer, t, TimeseriesChartDataResponseResult, - buildCustomFormatters, - getCustomFormatter, - CurrencyFormatter, } from '@superset-ui/core'; import { extractExtraMetrics, @@ -48,12 +49,12 @@ import { ZRLineType } from 'echarts/types/src/util/types'; import { EchartsTimeseriesChartProps, EchartsTimeseriesFormData, - TimeseriesChartTransformedProps, OrientationType, + TimeseriesChartTransformedProps, } from './types'; import { DEFAULT_FORM_DATA } from './constants'; import { ForecastSeriesEnum, ForecastValue, Refs } from '../types'; -import { parseYAxisBound } from '../utils/controls'; +import { parseAxisBound } from '../utils/controls'; import { calculateLowerLogTick, dedupSeries, @@ -63,6 +64,7 @@ import { getAxisType, getColtypesMapping, getLegendProps, + getMinAndMaxFromBounds, } from '../utils/series'; import { extractAnnotationLabels, @@ -88,8 +90,8 @@ import { } from './transformers'; import { StackControlsValue, - TIMESERIES_CONSTANTS, TIMEGRAIN_TO_TIMESTAMP, + TIMESERIES_CONSTANTS, } from '../constants'; import { getDefaultTooltip } from '../utils/tooltip'; import { @@ -144,6 +146,7 @@ export default function transformProps( markerSize, metrics, minorSplitLine, + minorTicks, onlyTotal, opacity, orientation, @@ -160,8 +163,11 @@ export default function transformProps( stack, tooltipTimeFormat, tooltipSortByMetric, + truncateXAxis, truncateYAxis, xAxis: xAxisOrig, + xAxisBounds, + xAxisForceCategorical, xAxisLabelRotation, xAxisSortSeries, xAxisSortSeriesAscending, @@ -243,7 +249,7 @@ export default function transformProps( const isAreaExpand = stack === StackControlsValue.Expand; const xAxisDataType = dataTypes?.[xAxisLabel] ?? 
dataTypes?.[xAxisOrig]; - const xAxisType = getAxisType(xAxisDataType); + const xAxisType = getAxisType(stack, xAxisForceCategorical, xAxisDataType); const series: SeriesOption[] = []; const forcePercentFormatter = Boolean(contributionMode || isAreaExpand); @@ -387,15 +393,20 @@ export default function transformProps( } }); - // yAxisBounds need to be parsed to replace incompatible values with undefined - let [min, max] = (yAxisBounds || []).map(parseYAxisBound); + // axis bounds need to be parsed to replace incompatible values with undefined + const [xAxisMin, xAxisMax] = (xAxisBounds || []).map(parseAxisBound); + let [yAxisMin, yAxisMax] = (yAxisBounds || []).map(parseAxisBound); // default to 0-100% range when doing row-level contribution chart if ((contributionMode === 'row' || isAreaExpand) && stack) { - if (min === undefined) min = 0; - if (max === undefined) max = 1; - } else if (logAxis && min === undefined && minPositiveValue !== undefined) { - min = calculateLowerLogTick(minPositiveValue); + if (yAxisMin === undefined) yAxisMin = 0; + if (yAxisMax === undefined) yAxisMax = 1; + } else if ( + logAxis && + yAxisMin === undefined && + minPositiveValue !== undefined + ) { + yAxisMin = calculateLowerLogTick(minPositiveValue); } const tooltipFormatter = @@ -447,17 +458,26 @@ export default function transformProps( formatter: xAxisFormatter, rotate: xAxisLabelRotation, }, + minorTick: { show: minorTicks }, minInterval: - xAxisType === 'time' && timeGrainSqla + xAxisType === AxisType.time && timeGrainSqla ? TIMEGRAIN_TO_TIMESTAMP[timeGrainSqla] : 0, + ...getMinAndMaxFromBounds( + xAxisType, + truncateXAxis, + xAxisMin, + xAxisMax, + seriesType, + ), }; + let yAxis: any = { ...defaultYAxis, - type: logAxis ? 'log' : 'value', - min, - max, - minorTick: { show: true }, + type: logAxis ? 
AxisType.log : AxisType.value, + min: yAxisMin, + max: yAxisMax, + minorTick: { show: minorTicks }, minorSplitLine: { show: minorSplitLine }, axisLabel: { formatter: getYAxisFormatter( diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/types.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/types.ts index 1873086d99122..692abb2c79577 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/types.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/types.ts @@ -68,6 +68,7 @@ export type EchartsTimeseriesFormData = QueryFormData & { markerSize: number; metrics: QueryFormMetric[]; minorSplitLine: boolean; + minorTicks: boolean; opacity: number; orderDesc: boolean; rowLimit: number; @@ -75,10 +76,13 @@ export type EchartsTimeseriesFormData = QueryFormData & { stack: StackType; timeCompare?: string[]; tooltipTimeFormat?: string; + truncateXAxis: boolean; truncateYAxis: boolean; yAxisFormat?: string; + xAxisForceCategorical?: boolean; xAxisTimeFormat?: string; timeGrainSqla?: TimeGranularity; + xAxisBounds: [number | undefined | null, number | undefined | null]; yAxisBounds: [number | undefined | null, number | undefined | null]; zoomable: boolean; richTooltip: boolean; diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/buildQuery.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/buildQuery.ts index e47effb3c2723..deb3571938bf2 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/buildQuery.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/buildQuery.ts @@ -19,15 +19,14 @@ import { buildQueryContext, ensureIsArray, - getXAxisColumn, - isXAxisSet, QueryFormData, } from '@superset-ui/core'; export default function buildQuery(formData: QueryFormData) { + const { x_axis, granularity_sqla, groupby } = formData; const columns = [ - ...(isXAxisSet(formData) ? 
ensureIsArray(getXAxisColumn(formData)) : []), - ...ensureIsArray(formData.groupby), + ...ensureIsArray(x_axis || granularity_sqla), + ...ensureIsArray(groupby), ]; return buildQueryContext(formData, baseQueryObject => [ { diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/controlPanel.tsx b/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/controlPanel.tsx index 7a71dd4fcba55..d07e5175e6579 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/controlPanel.tsx +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/controlPanel.tsx @@ -17,25 +17,27 @@ * under the License. */ import React from 'react'; -import { t } from '@superset-ui/core'; +import { hasGenericChartAxes, t } from '@superset-ui/core'; import { ControlPanelConfig, ControlSubSectionHeader, D3_TIME_FORMAT_DOCS, DEFAULT_TIME_FORMAT, formatSelectOptions, + sections, sharedControls, } from '@superset-ui/chart-controls'; import { showValueControl } from '../controls'; const config: ControlPanelConfig = { controlPanelSections: [ + sections.genericTime, { label: t('Query'), expanded: true, controlSetRows: [ - ['x_axis'], - ['time_grain_sqla'], + [hasGenericChartAxes ? 'x_axis' : null], + [hasGenericChartAxes ? 
'time_grain_sqla' : null], ['groupby'], ['metric'], ['adhoc_filters'], diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/index.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/index.ts index c0d7a11067420..b8c66fabb1aab 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/index.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/index.ts @@ -61,7 +61,7 @@ export default class EchartsWaterfallChartPlugin extends ChartPlugin< { url: example3 }, ], name: t('Waterfall Chart'), - tags: [t('Categorical'), t('Comparison'), t('ECharts')], + tags: [t('Categorical'), t('Comparison'), t('ECharts'), t('Popular')], thumbnail, }), transformProps, diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/transformProps.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/transformProps.ts index 7b5faed1b20a5..84fbbf6cb9f91 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/transformProps.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/src/Waterfall/transformProps.ts @@ -185,6 +185,7 @@ export default function transformProps( const { setDataMask = () => {}, onContextMenu, onLegendStateChanged } = hooks; const { currencyFormat, + granularitySqla = '', groupby, increaseColor, decreaseColor, @@ -213,7 +214,10 @@ export default function transformProps( const breakdownName = isAdhocColumn(breakdownColumn) ? breakdownColumn.label! : breakdownColumn; - const xAxisName = isAdhocColumn(xAxis) ? xAxis.label! : xAxis; + const xAxisColumn = xAxis || granularitySqla; + const xAxisName = isAdhocColumn(xAxisColumn) + ? xAxisColumn.label! 
+ : xAxisColumn; const metricLabel = getMetricLabel(metric); const transformedData = transformer({ diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/controls.tsx b/superset-frontend/plugins/plugin-chart-echarts/src/controls.tsx index 8f311e47e5676..c91d27acc6ab1 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/controls.tsx +++ b/superset-frontend/plugins/plugin-chart-echarts/src/controls.tsx @@ -29,6 +29,7 @@ import { } from '@superset-ui/chart-controls'; import { DEFAULT_LEGEND_FORM_DATA, StackControlOptions } from './constants'; import { DEFAULT_FORM_DATA } from './Timeseries/constants'; +import { defaultXAxis } from './defaults'; const { legendMargin, legendOrientation, legendType, showLegend } = DEFAULT_LEGEND_FORM_DATA; @@ -243,8 +244,79 @@ const sortSeriesAscending: ControlSetItem = { }, }; +export const xAxisLabelRotation = { + name: 'xAxisLabelRotation', + config: { + type: 'SelectControl', + freeForm: true, + clearable: false, + label: t('Rotate x axis label'), + choices: [ + [0, '0°'], + [45, '45°'], + [90, '90°'], + ], + default: defaultXAxis.xAxisLabelRotation, + renderTrigger: true, + description: t('Input field supports custom rotation. e.g. 30 for 30°'), + }, +}; + export const seriesOrderSection: ControlSetRow[] = [ [<ControlSubSectionHeader>{t('Series Order')}</ControlSubSectionHeader>], [sortSeriesType], [sortSeriesAscending], ]; + +export const truncateXAxis: ControlSetItem = { + name: 'truncateXAxis', + config: { + type: 'CheckboxControl', + label: t('Truncate X Axis'), + default: DEFAULT_FORM_DATA.truncateXAxis, + renderTrigger: true, + description: t( + 'Truncate X Axis. Can be overridden by specifying a min or max bound. 
Only applicable for numerical X axis.', + ), + }, +}; + +export const xAxisBounds: ControlSetItem = { + name: 'xAxisBounds', + config: { + type: 'BoundsControl', + label: t('X Axis Bounds'), + renderTrigger: true, + default: DEFAULT_FORM_DATA.xAxisBounds, + description: t( + 'Bounds for numerical X axis. Not applicable for temporal or categorical axes. ' + + 'When left empty, the bounds are dynamically defined based on the min/max of the data. ' + + "Note that this feature will only expand the axis range. It won't " + + "narrow the data's extent.", + ), + visibility: ({ controls }: ControlPanelsContainerProps) => + Boolean(controls?.truncateXAxis?.value), + }, +}; + +export const minorTicks: ControlSetItem = { + name: 'minorTicks', + config: { + type: 'CheckboxControl', + label: t('Minor ticks'), + default: false, + renderTrigger: true, + description: t('Show minor ticks on axes.'), + }, +}; + +export const forceCategorical: ControlSetItem = { + name: 'forceCategorical', + config: { + type: 'CheckboxControl', + label: t('Force categorical'), + default: false, + renderTrigger: true, + description: t('Make the x-axis categorical'), + }, +}; diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/defaults.ts b/superset-frontend/plugins/plugin-chart-echarts/src/defaults.ts index c5ada14932ebf..be37d6fcbf748 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/defaults.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/src/defaults.ts @@ -24,6 +24,11 @@ export const defaultGrid = { export const defaultYAxis = { scale: true, + yAxisLabelRotation: 0, +}; + +export const defaultXAxis = { + xAxisLabelRotation: 0, }; export const defaultLegendPadding = { diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/utils/controls.ts b/superset-frontend/plugins/plugin-chart-echarts/src/utils/controls.ts index 27f8fb144729a..689a6e99e5d6a 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/utils/controls.ts +++
b/superset-frontend/plugins/plugin-chart-echarts/src/utils/controls.ts @@ -19,8 +19,7 @@ import { validateNumber } from '@superset-ui/core'; -// eslint-disable-next-line import/prefer-default-export -export function parseYAxisBound( +export function parseAxisBound( bound?: string | number | null, ): number | undefined { if (bound === undefined || bound === null || Number.isNaN(Number(bound))) { diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/utils/series.ts b/superset-frontend/plugins/plugin-chart-echarts/src/utils/series.ts index 663548f25d710..6a51e7cbf7c5c 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/src/utils/series.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/src/utils/series.ts @@ -25,12 +25,12 @@ import { DTTM_ALIAS, ensureIsArray, GenericDataType, + LegendState, + normalizeTimestamp, NumberFormats, NumberFormatter, - TimeFormatter, SupersetTheme, - normalizeTimestamp, - LegendState, + TimeFormatter, ValueFormatter, } from '@superset-ui/core'; import { SortSeriesType } from '@superset-ui/chart-controls'; @@ -41,7 +41,12 @@ import { StackControlsValue, TIMESERIES_CONSTANTS, } from '../constants'; -import { LegendOrientation, LegendType, StackType } from '../types'; +import { + EchartsTimeseriesSeriesType, + LegendOrientation, + LegendType, + StackType, +} from '../types'; import { defaultLegendPadding } from '../defaults'; function isDefined<T>(value: T | undefined | null): boolean { @@ -224,8 +229,10 @@ export function sortRows( } const value = - xAxisSortSeries === SortSeriesType.Name && typeof sortKey === 'string' - ? sortKey.toLowerCase() + xAxisSortSeries === SortSeriesType.Name + ? typeof sortKey === 'string' + ? 
sortKey.toLowerCase() + : sortKey : aggregate; return { @@ -508,10 +515,20 @@ export function sanitizeHtml(text: string): string { return format.encodeHTML(text); } -export function getAxisType(dataType?: GenericDataType): AxisType { +export function getAxisType( + stack: StackType, + forceCategorical?: boolean, + dataType?: GenericDataType, +): AxisType { + if (forceCategorical) { + return AxisType.category; + } if (dataType === GenericDataType.TEMPORAL) { return AxisType.time; } + if (dataType === GenericDataType.NUMERIC && !stack) { + return AxisType.value; + } return AxisType.category; } @@ -540,3 +557,36 @@ export function calculateLowerLogTick(minPositiveValue: number) { const logBase10 = Math.floor(Math.log10(minPositiveValue)); return Math.pow(10, logBase10); } + +type BoundsType = { + min?: number | 'dataMin'; + max?: number | 'dataMax'; + scale?: true; +}; + +export function getMinAndMaxFromBounds( + axisType: AxisType, + truncateAxis: boolean, + min?: number, + max?: number, + seriesType?: EchartsTimeseriesSeriesType, +): BoundsType | {} { + if (axisType === AxisType.value && truncateAxis) { + const ret: BoundsType = {}; + if (seriesType === EchartsTimeseriesSeriesType.Bar) { + ret.scale = true; + } + if (min !== undefined) { + ret.min = min; + } else if (seriesType !== EchartsTimeseriesSeriesType.Bar) { + ret.min = 'dataMin'; + } + if (max !== undefined) { + ret.max = max; + } else if (seriesType !== EchartsTimeseriesSeriesType.Bar) { + ret.max = 'dataMax'; + } + return ret; + } + return {}; +} diff --git a/superset-frontend/plugins/plugin-chart-echarts/test/Bubble/transformProps.test.ts b/superset-frontend/plugins/plugin-chart-echarts/test/Bubble/transformProps.test.ts index 2bb4ae0fc604a..d93f394681205 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/test/Bubble/transformProps.test.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/test/Bubble/transformProps.test.ts @@ -18,6 +18,7 @@ */ import { ChartProps, + ChartPropsConfig, 
getNumberFormatter, SqlaFormData, supersetTheme, @@ -27,7 +28,7 @@ import { EchartsBubbleChartProps } from 'plugins/plugin-chart-echarts/src/Bubble import transformProps, { formatTooltip } from '../../src/Bubble/transformProps'; describe('Bubble transformProps', () => { - const formData: SqlaFormData = { + const defaultFormData: SqlaFormData = { datasource: '1__table', viz_type: 'echarts_bubble', entity: 'customer_name', @@ -48,10 +49,11 @@ describe('Bubble transformProps', () => { expressionType: 'simple', label: 'SUM(sales)', }, + xAxisBounds: [null, null], yAxisBounds: [null, null], }; - const chartProps = new ChartProps({ - formData, + const chartConfig: ChartPropsConfig = { + formData: defaultFormData, height: 800, width: 800, queriesData: [ @@ -79,9 +81,48 @@ describe('Bubble transformProps', () => { }, ], theme: supersetTheme, - }); + }; it('Should transform props for viz', () => { + const chartProps = new ChartProps(chartConfig); + expect(transformProps(chartProps as EchartsBubbleChartProps)).toEqual( + expect.objectContaining({ + width: 800, + height: 800, + echartOptions: expect.objectContaining({ + series: expect.arrayContaining([ + expect.objectContaining({ + data: expect.arrayContaining([ + [10, 20, 30, 'AV Stores, Co.', null], + ]), + }), + expect.objectContaining({ + data: expect.arrayContaining([ + [40, 50, 60, 'Alpha Cognac', null], + ]), + }), + expect.objectContaining({ + data: expect.arrayContaining([ + [70, 80, 90, 'Amica Models & Co.', null], + ]), + }), + ]), + }), + }), + ); + }); + + it('Should transform props with undefined control values', () => { + const formData: SqlaFormData = { + ...defaultFormData, + xAxisBounds: undefined, + yAxisBounds: undefined, + }; + const chartProps = new ChartProps({ + ...chartConfig, + formData, + }); + expect(transformProps(chartProps as EchartsBubbleChartProps)).toEqual( expect.objectContaining({ width: 800, diff --git a/superset-frontend/plugins/plugin-chart-echarts/test/utils/controls.test.ts 
b/superset-frontend/plugins/plugin-chart-echarts/test/utils/controls.test.ts index 60ced57739342..cb0faac5959c1 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/test/utils/controls.test.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/test/utils/controls.test.ts @@ -16,22 +16,22 @@ * specific language governing permissions and limitations * under the License. */ -import { parseYAxisBound } from '../../src/utils/controls'; +import { parseAxisBound } from '../../src/utils/controls'; describe('parseYAxisBound', () => { it('should return undefined for invalid values', () => { - expect(parseYAxisBound(null)).toBeUndefined(); - expect(parseYAxisBound(undefined)).toBeUndefined(); - expect(parseYAxisBound(NaN)).toBeUndefined(); - expect(parseYAxisBound('abc')).toBeUndefined(); + expect(parseAxisBound(null)).toBeUndefined(); + expect(parseAxisBound(undefined)).toBeUndefined(); + expect(parseAxisBound(NaN)).toBeUndefined(); + expect(parseAxisBound('abc')).toBeUndefined(); }); it('should return numeric value for valid values', () => { - expect(parseYAxisBound(0)).toEqual(0); - expect(parseYAxisBound('0')).toEqual(0); - expect(parseYAxisBound(1)).toEqual(1); - expect(parseYAxisBound('1')).toEqual(1); - expect(parseYAxisBound(10.1)).toEqual(10.1); - expect(parseYAxisBound('10.1')).toEqual(10.1); + expect(parseAxisBound(0)).toEqual(0); + expect(parseAxisBound('0')).toEqual(0); + expect(parseAxisBound(1)).toEqual(1); + expect(parseAxisBound('1')).toEqual(1); + expect(parseAxisBound(10.1)).toEqual(10.1); + expect(parseAxisBound('10.1')).toEqual(10.1); }); }); diff --git a/superset-frontend/plugins/plugin-chart-echarts/test/utils/series.test.ts b/superset-frontend/plugins/plugin-chart-echarts/test/utils/series.test.ts index 75faee93e59cd..3d7d21c8d0b02 100644 --- a/superset-frontend/plugins/plugin-chart-echarts/test/utils/series.test.ts +++ b/superset-frontend/plugins/plugin-chart-echarts/test/utils/series.test.ts @@ -18,6 +18,7 @@ */ import { SortSeriesType } 
from '@superset-ui/chart-controls'; import { + AxisType, DataRecord, GenericDataType, getNumberFormatter, @@ -31,14 +32,20 @@ import { extractSeries, extractShowValueIndexes, formatSeriesName, + getAxisType, getChartPadding, getLegendProps, getOverMaxHiddenFormatter, + getMinAndMaxFromBounds, sanitizeHtml, sortAndFilterSeries, sortRows, } from '../../src/utils/series'; -import { LegendOrientation, LegendType } from '../../src/types'; +import { + EchartsTimeseriesSeriesType, + LegendOrientation, + LegendType, +} from '../../src/types'; import { defaultLegendPadding } from '../../src/defaults'; import { NULL_STRING } from '../../src/constants'; @@ -870,3 +877,135 @@ test('calculateLowerLogTick', () => { expect(calculateLowerLogTick(2)).toEqual(1); expect(calculateLowerLogTick(0.005)).toEqual(0.001); }); + +test('getAxisType without forced categorical', () => { + expect(getAxisType(false, false, GenericDataType.TEMPORAL)).toEqual( + AxisType.time, + ); + expect(getAxisType(false, false, GenericDataType.NUMERIC)).toEqual( + AxisType.value, + ); + expect(getAxisType(true, false, GenericDataType.NUMERIC)).toEqual( + AxisType.category, + ); + expect(getAxisType(false, false, GenericDataType.BOOLEAN)).toEqual( + AxisType.category, + ); + expect(getAxisType(false, false, GenericDataType.STRING)).toEqual( + AxisType.category, + ); +}); + +test('getAxisType with forced categorical', () => { + expect(getAxisType(false, true, GenericDataType.NUMERIC)).toEqual( + AxisType.category, + ); +}); + +test('getMinAndMaxFromBounds returns empty object when not truncating', () => { + expect( + getMinAndMaxFromBounds( + AxisType.value, + false, + 10, + 100, + EchartsTimeseriesSeriesType.Bar, + ), + ).toEqual({}); +}); + +test('getMinAndMaxFromBounds returns empty object for categorical axis', () => { + expect( + getMinAndMaxFromBounds( + AxisType.category, + false, + 10, + 100, + EchartsTimeseriesSeriesType.Bar, + ), + ).toEqual({}); +}); + +test('getMinAndMaxFromBounds returns empty 
object for time axis', () => { + expect( + getMinAndMaxFromBounds( + AxisType.time, + false, + 10, + 100, + EchartsTimeseriesSeriesType.Bar, + ), + ).toEqual({}); +}); + +test('getMinAndMaxFromBounds returns dataMin/dataMax for non-bar charts', () => { + expect( + getMinAndMaxFromBounds( + AxisType.value, + true, + undefined, + undefined, + EchartsTimeseriesSeriesType.Line, + ), + ).toEqual({ + min: 'dataMin', + max: 'dataMax', + }); +}); + +test('getMinAndMaxFromBounds returns bound without scale for non-bar charts', () => { + expect( + getMinAndMaxFromBounds( + AxisType.value, + true, + 10, + undefined, + EchartsTimeseriesSeriesType.Line, + ), + ).toEqual({ + min: 10, + max: 'dataMax', + }); +}); + +test('getMinAndMaxFromBounds returns scale when truncating without bounds', () => { + expect( + getMinAndMaxFromBounds( + AxisType.value, + true, + undefined, + undefined, + EchartsTimeseriesSeriesType.Bar, + ), + ).toEqual({ scale: true }); +}); + +test('getMinAndMaxFromBounds returns automatic upper bound when truncating', () => { + expect( + getMinAndMaxFromBounds( + AxisType.value, + true, + 10, + undefined, + EchartsTimeseriesSeriesType.Bar, + ), + ).toEqual({ + min: 10, + scale: true, + }); +}); + +test('getMinAndMaxFromBounds returns automatic lower bound when truncating', () => { + expect( + getMinAndMaxFromBounds( + AxisType.value, + true, + undefined, + 100, + EchartsTimeseriesSeriesType.Bar, + ), + ).toEqual({ + max: 100, + scale: true, + }); +}); diff --git a/superset-frontend/plugins/plugin-chart-table/src/TableChart.tsx b/superset-frontend/plugins/plugin-chart-table/src/TableChart.tsx index 917abb929a8ec..d106e42a84fc4 100644 --- a/superset-frontend/plugins/plugin-chart-table/src/TableChart.tsx +++ b/superset-frontend/plugins/plugin-chart-table/src/TableChart.tsx @@ -71,6 +71,12 @@ interface TableSize { height: number; } +const ACTION_KEYS = { + enter: 'Enter', + spacebar: 'Spacebar', + space: ' ', +}; + /** * Return sortType based on data type */ @@ 
-591,6 +597,13 @@ export default function TableChart<D extends DataRecord = DataRecord>( ...sharedStyle, ...style, }} + tabIndex={0} + onKeyDown={(e: React.KeyboardEvent<HTMLElement>) => { + // programmatically sort column on keypress + if (Object.values(ACTION_KEYS).includes(e.key)) { + col.toggleSortBy(); + } + }} onClick={onClick} data-column-name={col.id} {...(allowRearrangeColumns && { diff --git a/superset-frontend/plugins/plugin-chart-table/src/transformProps.ts b/superset-frontend/plugins/plugin-chart-table/src/transformProps.ts index a4eb2e5e18a94..e76201cac30cb 100644 --- a/superset-frontend/plugins/plugin-chart-table/src/transformProps.ts +++ b/superset-frontend/plugins/plugin-chart-table/src/transformProps.ts @@ -118,9 +118,10 @@ const processColumns = memoizeOne(function processColumns( // because users can also add things like `MAX(str_col)` as a metric. const isMetric = metricsSet.has(key) && isNumeric(key, records); const isPercentMetric = percentMetricsSet.has(key); - const label = isPercentMetric - ? `%${verboseMap?.[key.replace('%', '')] || key}` - : verboseMap?.[key] || key; + const label = + isPercentMetric && verboseMap?.hasOwnProperty(key.replace('%', '')) + ?
`%${verboseMap[key.replace('%', '')]}` + : verboseMap?.[key] || key; const isTime = dataType === GenericDataType.TEMPORAL; const isNumber = dataType === GenericDataType.NUMERIC; const savedFormat = columnFormats?.[key]; diff --git a/superset-frontend/src/SqlLab/actions/sqlLab.js b/superset-frontend/src/SqlLab/actions/sqlLab.js index 44b4307a1906e..567d3383d752d 100644 --- a/superset-frontend/src/SqlLab/actions/sqlLab.js +++ b/superset-frontend/src/SqlLab/actions/sqlLab.js @@ -99,6 +99,8 @@ export const CREATE_DATASOURCE_STARTED = 'CREATE_DATASOURCE_STARTED'; export const CREATE_DATASOURCE_SUCCESS = 'CREATE_DATASOURCE_SUCCESS'; export const CREATE_DATASOURCE_FAILED = 'CREATE_DATASOURCE_FAILED'; +export const SET_EDITOR_TAB_LAST_UPDATE = 'SET_EDITOR_TAB_LAST_UPDATE'; + export const addInfoToast = addInfoToastAction; export const addSuccessToast = addSuccessToastAction; export const addDangerToast = addDangerToastAction; @@ -160,6 +162,10 @@ export function updateQueryEditor(alterations) { return { type: UPDATE_QUERY_EDITOR, alterations }; } +export function setEditorTabLastUpdate(timestamp) { + return { type: SET_EDITOR_TAB_LAST_UPDATE, timestamp }; +} + export function scheduleQuery(query) { return dispatch => SupersetClient.post({ @@ -237,44 +243,11 @@ export function startQuery(query) { } export function querySuccess(query, results) { - return function (dispatch) { - const sqlEditorId = results?.query?.sqlEditorId; - const sync = - sqlEditorId && - !query.isDataPreview && - isFeatureEnabled(FeatureFlag.SQLLAB_BACKEND_PERSISTENCE) - ? SupersetClient.put({ - endpoint: encodeURI(`/tabstateview/${sqlEditorId}`), - postPayload: { latest_query_id: query.id }, - }) - : Promise.resolve(); - - return sync - .then(() => dispatch({ type: QUERY_SUCCESS, query, results })) - .catch(() => - dispatch( - addDangerToast( - t( - 'An error occurred while storing the latest query id in the backend. 
' + - 'Please contact your administrator if this problem persists.', - ), - ), - ), - ); - }; + return { type: QUERY_SUCCESS, query, results }; } export function queryFailed(query, msg, link, errors) { return function (dispatch) { - const sync = - !query.isDataPreview && - isFeatureEnabled(FeatureFlag.SQLLAB_BACKEND_PERSISTENCE) - ? SupersetClient.put({ - endpoint: encodeURI(`/tabstateview/${query.sqlEditorId}`), - postPayload: { latest_query_id: query.id }, - }) - : Promise.resolve(); - const eventData = { has_err: true, start_offset: query.startDttm, @@ -295,22 +268,7 @@ export function queryFailed(query, msg, link, errors) { }); }); - return ( - sync - .catch(() => - dispatch( - addDangerToast( - t( - 'An error occurred while storing the latest query id in the backend. ' + - 'Please contact your administrator if this problem persists.', - ), - ), - ), - ) - // We should always show the error message, even if we couldn't sync the - // state to the backend - .then(() => dispatch({ type: QUERY_FAILED, query, msg, link, errors })) - ); + dispatch({ type: QUERY_FAILED, query, msg, link, errors }); }; } @@ -557,14 +515,15 @@ export function addQueryEditor(queryEditor) { ? 
SupersetClient.post({ endpoint: '/tabstateview/', postPayload: { queryEditor }, - }) - : Promise.resolve({ json: { id: shortid.generate() } }); + }).then(({ json }) => ({ ...json, loaded: true })) + : Promise.resolve({ id: shortid.generate() }); return sync - .then(({ json }) => { + .then(({ id, loaded }) => { const newQueryEditor = { ...queryEditor, - id: json.id.toString(), + id: id.toString(), + loaded, }; return dispatch({ type: ADD_QUERY_EDITOR, @@ -736,11 +695,6 @@ export function switchQueryEditor(queryEditor, displayLimit) { schema: json.schema, queryLimit: json.query_limit, remoteId: json.saved_query?.id, - validationResult: { - id: null, - errors: [], - completed: false, - }, hideLeftBar: json.hide_left_bar, }; dispatch(loadQueryEditor(loadedQueryEditor)); @@ -770,31 +724,10 @@ export function setActiveSouthPaneTab(tabId) { export function toggleLeftBar(queryEditor) { const hideLeftBar = !queryEditor.hideLeftBar; - return function (dispatch) { - const sync = isFeatureEnabled(FeatureFlag.SQLLAB_BACKEND_PERSISTENCE) - ? SupersetClient.put({ - endpoint: encodeURI(`/tabstateview/${queryEditor.id}`), - postPayload: { hide_left_bar: hideLeftBar }, - }) - : Promise.resolve(); - - return sync - .then(() => - dispatch({ - type: QUERY_EDITOR_TOGGLE_LEFT_BAR, - queryEditor, - hideLeftBar, - }), - ) - .catch(() => - dispatch( - addDangerToast( - t( - 'An error occurred while hiding the left bar. Please contact your administrator.', - ), - ), - ), - ); + return { + type: QUERY_EDITOR_TOGGLE_LEFT_BAR, + queryEditor, + hideLeftBar, }; } @@ -856,110 +789,26 @@ export function removeQuery(query) { } export function queryEditorSetDb(queryEditor, dbId) { - return function (dispatch) { - const sync = isFeatureEnabled(FeatureFlag.SQLLAB_BACKEND_PERSISTENCE) - ? 
SupersetClient.put({ - endpoint: encodeURI(`/tabstateview/${queryEditor.id}`), - postPayload: { database_id: dbId }, - }) - : Promise.resolve(); - - return sync - .then(() => dispatch({ type: QUERY_EDITOR_SETDB, queryEditor, dbId })) - .catch(() => - dispatch( - addDangerToast( - t( - 'An error occurred while setting the tab database ID. Please contact your administrator.', - ), - ), - ), - ); - }; + return { type: QUERY_EDITOR_SETDB, queryEditor, dbId }; } export function queryEditorSetSchema(queryEditor, schema) { - return function (dispatch) { - const sync = - isFeatureEnabled(FeatureFlag.SQLLAB_BACKEND_PERSISTENCE) && - typeof queryEditor === 'object' - ? SupersetClient.put({ - endpoint: encodeURI(`/tabstateview/${queryEditor.id}`), - postPayload: { schema }, - }) - : Promise.resolve(); - - return sync - .then(() => - dispatch({ - type: QUERY_EDITOR_SET_SCHEMA, - queryEditor: queryEditor || {}, - schema, - }), - ) - .catch(() => - dispatch( - addDangerToast( - t( - 'An error occurred while setting the tab schema. Please contact your administrator.', - ), - ), - ), - ); + return { + type: QUERY_EDITOR_SET_SCHEMA, + queryEditor: queryEditor || {}, + schema, }; } export function queryEditorSetAutorun(queryEditor, autorun) { - return function (dispatch) { - const sync = isFeatureEnabled(FeatureFlag.SQLLAB_BACKEND_PERSISTENCE) - ? SupersetClient.put({ - endpoint: encodeURI(`/tabstateview/${queryEditor.id}`), - postPayload: { autorun }, - }) - : Promise.resolve(); - - return sync - .then(() => - dispatch({ type: QUERY_EDITOR_SET_AUTORUN, queryEditor, autorun }), - ) - .catch(() => - dispatch( - addDangerToast( - t( - 'An error occurred while setting the tab autorun. Please contact your administrator.', - ), - ), - ), - ); - }; + return { type: QUERY_EDITOR_SET_AUTORUN, queryEditor, autorun }; } export function queryEditorSetTitle(queryEditor, name, id) { - return function (dispatch) { - const sync = isFeatureEnabled(FeatureFlag.SQLLAB_BACKEND_PERSISTENCE) - ? 
SupersetClient.put({ - endpoint: encodeURI(`/tabstateview/${id}`), - postPayload: { label: name }, - }) - : Promise.resolve(); - - return sync - .then(() => - dispatch({ - type: QUERY_EDITOR_SET_TITLE, - queryEditor: { ...queryEditor, id }, - name, - }), - ) - .catch(() => - dispatch( - addDangerToast( - t( - 'An error occurred while setting the tab name. Please contact your administrator.', - ), - ), - ), - ); + return { + type: QUERY_EDITOR_SET_TITLE, + queryEditor: { ...queryEditor, id }, + name, }; } @@ -1029,32 +878,19 @@ export function updateSavedQuery(query, clientId) { .then(() => dispatch(updateQueryEditor(query))); } -export function queryEditorSetSql(queryEditor, sql) { - return { type: QUERY_EDITOR_SET_SQL, queryEditor, sql }; +export function queryEditorSetSql(queryEditor, sql, queryId) { + return { type: QUERY_EDITOR_SET_SQL, queryEditor, sql, queryId }; } -export function formatQuery(queryEditor) { - return function (dispatch, getState) { - const { sql } = getUpToDateQuery(getState(), queryEditor); - return SupersetClient.post({ - endpoint: `/api/v1/sqllab/format_sql/`, - body: JSON.stringify({ sql }), - headers: { 'Content-Type': 'application/json' }, - }).then(({ json }) => { - dispatch(queryEditorSetSql(queryEditor, json.result)); - }); - }; -} - -export function queryEditorSetAndSaveSql(targetQueryEditor, sql) { +export function queryEditorSetAndSaveSql(targetQueryEditor, sql, queryId) { return function (dispatch, getState) { const queryEditor = getUpToDateQuery(getState(), targetQueryEditor); // saved query and set tab state use this action - dispatch(queryEditorSetSql(queryEditor, sql)); + dispatch(queryEditorSetSql(queryEditor, sql, queryId)); if (isFeatureEnabled(FeatureFlag.SQLLAB_BACKEND_PERSISTENCE)) { return SupersetClient.put({ endpoint: encodeURI(`/tabstateview/${queryEditor.id}`), - postPayload: { sql, latest_query_id: queryEditor.latestQueryId }, + postPayload: { sql, latest_query_id: queryId }, }).catch(() => dispatch( 
addDangerToast( @@ -1071,59 +907,32 @@ export function queryEditorSetAndSaveSql(targetQueryEditor, sql) { }; } -export function queryEditorSetQueryLimit(queryEditor, queryLimit) { - return function (dispatch) { - const sync = isFeatureEnabled(FeatureFlag.SQLLAB_BACKEND_PERSISTENCE) - ? SupersetClient.put({ - endpoint: encodeURI(`/tabstateview/${queryEditor.id}`), - postPayload: { query_limit: queryLimit }, - }) - : Promise.resolve(); +export function formatQuery(queryEditor) { + return function (dispatch, getState) { + const { sql } = getUpToDateQuery(getState(), queryEditor); + return SupersetClient.post({ + endpoint: `/api/v1/sqllab/format_sql/`, + body: JSON.stringify({ sql }), + headers: { 'Content-Type': 'application/json' }, + }).then(({ json }) => { + dispatch(queryEditorSetSql(queryEditor, json.result)); + }); + }; +} - return sync - .then(() => - dispatch({ - type: QUERY_EDITOR_SET_QUERY_LIMIT, - queryEditor, - queryLimit, - }), - ) - .catch(() => - dispatch( - addDangerToast( - t( - 'An error occurred while setting the tab name. Please contact your administrator.', - ), - ), - ), - ); +export function queryEditorSetQueryLimit(queryEditor, queryLimit) { + return { + type: QUERY_EDITOR_SET_QUERY_LIMIT, + queryEditor, + queryLimit, }; } export function queryEditorSetTemplateParams(queryEditor, templateParams) { - return function (dispatch) { - dispatch({ - type: QUERY_EDITOR_SET_TEMPLATE_PARAMS, - queryEditor, - templateParams, - }); - const sync = isFeatureEnabled(FeatureFlag.SQLLAB_BACKEND_PERSISTENCE) - ? SupersetClient.put({ - endpoint: encodeURI(`/tabstateview/${queryEditor.id}`), - postPayload: { template_params: templateParams }, - }) - : Promise.resolve(); - - return sync.catch(() => - dispatch( - addDangerToast( - t( - 'An error occurred while setting the tab template parameters. 
' + - 'Please contact your administrator.', - ), - ), - ), - ); + return { + type: QUERY_EDITOR_SET_TEMPLATE_PARAMS, + queryEditor, + templateParams, }; } diff --git a/superset-frontend/src/SqlLab/actions/sqlLab.test.js b/superset-frontend/src/SqlLab/actions/sqlLab.test.js index dbf4e8a5c55f1..175ea06ec3ecf 100644 --- a/superset-frontend/src/SqlLab/actions/sqlLab.test.js +++ b/superset-frontend/src/SqlLab/actions/sqlLab.test.js @@ -32,7 +32,6 @@ import { initialState, queryId, } from 'src/SqlLab/fixtures'; -import { QueryState } from '@superset-ui/core'; const middlewares = [thunk]; const mockStore = configureMockStore(middlewares); @@ -531,88 +530,6 @@ describe('async actions', () => { afterEach(fetchMock.resetHistory); - describe('querySuccess', () => { - it('updates the tab state in the backend', () => { - expect.assertions(2); - - const store = mockStore({}); - const results = { query: { sqlEditorId: 'abcd' } }; - const expectedActions = [ - { - type: actions.QUERY_SUCCESS, - query, - results, - }, - ]; - return store.dispatch(actions.querySuccess(query, results)).then(() => { - expect(store.getActions()).toEqual(expectedActions); - expect(fetchMock.calls(updateTabStateEndpoint)).toHaveLength(1); - }); - }); - }); - - describe('fetchQueryResults', () => { - it('updates the tab state in the backend', () => { - expect.assertions(2); - - const results = { - data: mockBigNumber, - query: { sqlEditorId: 'abcd' }, - status: QueryState.SUCCESS, - query_id: 'efgh', - }; - fetchMock.get(fetchQueryEndpoint, JSON.stringify(results), { - overwriteRoutes: true, - }); - const store = mockStore({}); - const expectedActions = [ - { - type: actions.REQUEST_QUERY_RESULTS, - query, - }, - // missing below - { - type: actions.QUERY_SUCCESS, - query, - results, - }, - ]; - return store.dispatch(actions.fetchQueryResults(query)).then(() => { - expect(store.getActions()).toEqual(expectedActions); - expect(fetchMock.calls(updateTabStateEndpoint)).toHaveLength(1); - }); - }); - - 
it("doesn't update the tab state in the backend on stoppped query", () => { - expect.assertions(2); - - const results = { - status: QueryState.STOPPED, - query_id: 'efgh', - }; - fetchMock.get(fetchQueryEndpoint, JSON.stringify(results), { - overwriteRoutes: true, - }); - const store = mockStore({}); - const expectedActions = [ - { - type: actions.REQUEST_QUERY_RESULTS, - query, - }, - // missing below - { - type: actions.QUERY_SUCCESS, - query, - results, - }, - ]; - return store.dispatch(actions.fetchQueryResults(query)).then(() => { - expect(store.getActions()).toEqual(expectedActions); - expect(fetchMock.calls(updateTabStateEndpoint)).toHaveLength(0); - }); - }); - }); - describe('addQueryEditor', () => { it('updates the tab state in the backend', () => { expect.assertions(2); @@ -621,7 +538,7 @@ describe('async actions', () => { const expectedActions = [ { type: actions.ADD_QUERY_EDITOR, - queryEditor: { ...queryEditor, id: '1' }, + queryEditor: { ...queryEditor, id: '1', loaded: true }, }, ]; return store.dispatch(actions.addQueryEditor(queryEditor)).then(() => { @@ -673,7 +590,7 @@ describe('async actions', () => { describe('queryEditorSetDb', () => { it('updates the tab state in the backend', () => { - expect.assertions(2); + expect.assertions(1); const dbId = 42; const store = mockStore({}); @@ -684,18 +601,14 @@ describe('async actions', () => { dbId, }, ]; - return store - .dispatch(actions.queryEditorSetDb(queryEditor, dbId)) - .then(() => { - expect(store.getActions()).toEqual(expectedActions); - expect(fetchMock.calls(updateTabStateEndpoint)).toHaveLength(1); - }); + store.dispatch(actions.queryEditorSetDb(queryEditor, dbId)); + expect(store.getActions()).toEqual(expectedActions); }); }); describe('queryEditorSetSchema', () => { it('updates the tab state in the backend', () => { - expect.assertions(2); + expect.assertions(1); const schema = 'schema'; const store = mockStore({}); @@ -706,18 +619,14 @@ describe('async actions', () => { schema, }, ]; - 
return store - .dispatch(actions.queryEditorSetSchema(queryEditor, schema)) - .then(() => { - expect(store.getActions()).toEqual(expectedActions); - expect(fetchMock.calls(updateTabStateEndpoint)).toHaveLength(1); - }); + store.dispatch(actions.queryEditorSetSchema(queryEditor, schema)); + expect(store.getActions()).toEqual(expectedActions); }); }); describe('queryEditorSetAutorun', () => { it('updates the tab state in the backend', () => { - expect.assertions(2); + expect.assertions(1); const autorun = true; const store = mockStore({}); @@ -728,18 +637,14 @@ describe('async actions', () => { autorun, }, ]; - return store - .dispatch(actions.queryEditorSetAutorun(queryEditor, autorun)) - .then(() => { - expect(store.getActions()).toEqual(expectedActions); - expect(fetchMock.calls(updateTabStateEndpoint)).toHaveLength(1); - }); + store.dispatch(actions.queryEditorSetAutorun(queryEditor, autorun)); + expect(store.getActions()).toEqual(expectedActions); }); }); describe('queryEditorSetTitle', () => { it('updates the tab state in the backend', () => { - expect.assertions(2); + expect.assertions(1); const name = 'name'; const store = mockStore({}); @@ -750,14 +655,10 @@ describe('async actions', () => { name, }, ]; - return store - .dispatch( - actions.queryEditorSetTitle(queryEditor, name, queryEditor.id), - ) - .then(() => { - expect(store.getActions()).toEqual(expectedActions); - expect(fetchMock.calls(updateTabStateEndpoint)).toHaveLength(1); - }); + store.dispatch( + actions.queryEditorSetTitle(queryEditor, name, queryEditor.id), + ); + expect(store.getActions()).toEqual(expectedActions); }); }); @@ -803,7 +704,7 @@ describe('async actions', () => { describe('queryEditorSetQueryLimit', () => { it('updates the tab state in the backend', () => { - expect.assertions(2); + expect.assertions(1); const queryLimit = 10; const store = mockStore({}); @@ -814,18 +715,16 @@ describe('async actions', () => { queryLimit, }, ]; - return store - 
.dispatch(actions.queryEditorSetQueryLimit(queryEditor, queryLimit)) - .then(() => { - expect(store.getActions()).toEqual(expectedActions); - expect(fetchMock.calls(updateTabStateEndpoint)).toHaveLength(1); - }); + store.dispatch( + actions.queryEditorSetQueryLimit(queryEditor, queryLimit), + ); + expect(store.getActions()).toEqual(expectedActions); }); }); describe('queryEditorSetTemplateParams', () => { it('updates the tab state in the backend', () => { - expect.assertions(2); + expect.assertions(1); const templateParams = '{"foo": "bar"}'; const store = mockStore({}); @@ -836,14 +735,11 @@ describe('async actions', () => { templateParams, }, ]; - return store - .dispatch( - actions.queryEditorSetTemplateParams(queryEditor, templateParams), - ) - .then(() => { - expect(store.getActions()).toEqual(expectedActions); - expect(fetchMock.calls(updateTabStateEndpoint)).toHaveLength(1); - }); + store.dispatch( + actions.queryEditorSetTemplateParams(queryEditor, templateParams), + ); + + expect(store.getActions()).toEqual(expectedActions); }); }); diff --git a/superset-frontend/src/SqlLab/components/EditorAutoSync/EditorAutoSync.test.tsx b/superset-frontend/src/SqlLab/components/EditorAutoSync/EditorAutoSync.test.tsx new file mode 100644 index 0000000000000..52e1d44b247f8 --- /dev/null +++ b/superset-frontend/src/SqlLab/components/EditorAutoSync/EditorAutoSync.test.tsx @@ -0,0 +1,137 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. 
You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */
+import React from 'react'; +import fetchMock from 'fetch-mock'; +import { render, waitFor } from 'spec/helpers/testing-library'; +import ToastContainer from 'src/components/MessageToasts/ToastContainer'; +import { initialState, defaultQueryEditor } from 'src/SqlLab/fixtures'; +import { logging } from '@superset-ui/core'; +import EditorAutoSync from '.'; + +jest.mock('@superset-ui/core', () => ({ + ...jest.requireActual('@superset-ui/core'), + logging: { + warn: jest.fn(), + }, +})); + +const editorTabLastUpdatedAt = Date.now(); +const unsavedSqlLabState = { + ...initialState.sqlLab, + unsavedQueryEditor: { + id: defaultQueryEditor.id, + name: 'updated tab name', + updatedAt: editorTabLastUpdatedAt + 100, + }, + editorTabLastUpdatedAt, +}; +beforeAll(() => { + jest.useFakeTimers(); +}); + +afterAll(() => { + jest.useRealTimers(); +}); + +test('sync the unsaved editor tab state when there are new changes since the last update', async () => { + const updateEditorTabState = `glob:*/tabstateview/${defaultQueryEditor.id}`; + fetchMock.put(updateEditorTabState, 200); + expect(fetchMock.calls(updateEditorTabState)).toHaveLength(0); + render(<EditorAutoSync />, { + useRedux: true, + initialState: { + ...initialState, + sqlLab: unsavedSqlLabState, + }, + }); + await waitFor(() => jest.runAllTimers()); + expect(fetchMock.calls(updateEditorTabState)).toHaveLength(1); + fetchMock.restore(); +}); + +test('skip syncing the unsaved editor tab state when the updates are already synced', async () => { + const updateEditorTabState = `glob:*/tabstateview/${defaultQueryEditor.id}`; + fetchMock.put(updateEditorTabState, 200); + expect(fetchMock.calls(updateEditorTabState)).toHaveLength(0); + render(<EditorAutoSync />, { + useRedux: true, + initialState: { + ...initialState, + sqlLab: { + ...initialState.sqlLab, + unsavedQueryEditor: { + id: defaultQueryEditor.id, + name: 'updated tab name', + updatedAt: editorTabLastUpdatedAt - 100, + }, + editorTabLastUpdatedAt, + }, + }, + }); + 
await waitFor(() => jest.runAllTimers()); + expect(fetchMock.calls(updateEditorTabState)).toHaveLength(0); + fetchMock.restore(); +}); + +test('logs a warning when the sync fails', async () => { + const updateEditorTabState = `glob:*/tabstateview/${defaultQueryEditor.id}`; + fetchMock.put(updateEditorTabState, { + throws: new Error('errorMessage'), + }); + expect(fetchMock.calls(updateEditorTabState)).toHaveLength(0); + render( + <> + <EditorAutoSync /> + <ToastContainer /> + </>, + { + useRedux: true, + initialState: { + ...initialState, + sqlLab: unsavedSqlLabState, + }, + }, + ); + await waitFor(() => jest.runAllTimers()); + + expect(logging.warn).toHaveBeenCalledTimes(1); + expect(logging.warn).toHaveBeenCalledWith( + 'An error occurred while saving your editor state.', + expect.anything(), + ); + fetchMock.restore(); +}); diff --git a/superset-frontend/src/SqlLab/components/EditorAutoSync/index.tsx b/superset-frontend/src/SqlLab/components/EditorAutoSync/index.tsx new file mode 100644 index 0000000000000..51399753e95b7 --- /dev/null +++ b/superset-frontend/src/SqlLab/components/EditorAutoSync/index.tsx @@ -0,0 +1,106 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ + +import React, { useRef, useEffect } from 'react'; +import { useDispatch, useSelector } from 'react-redux'; +import { logging } from '@superset-ui/core'; +import { + SqlLabRootState, + QueryEditor, + UnsavedQueryEditor, +} from 'src/SqlLab/types'; +import { useUpdateSqlEditorTabMutation } from 'src/hooks/apiResources/sqlEditorTabs'; +import { useDebounceValue } from 'src/hooks/useDebounceValue'; +import { setEditorTabLastUpdate } from 'src/SqlLab/actions/sqlLab'; + +const INTERVAL = 5000; + +function hasUnsavedChanges( + queryEditor: QueryEditor, + lastSavedTimestamp: number, +) { + return ( + queryEditor.inLocalStorage || + (queryEditor.updatedAt && queryEditor.updatedAt > lastSavedTimestamp) + ); +} + +export function filterUnsavedQueryEditorList( + queryEditors: QueryEditor[], + unsavedQueryEditor: UnsavedQueryEditor, + lastSavedTimestamp: number, +) { + return queryEditors + .map(queryEditor => ({ + ...queryEditor, + ...(unsavedQueryEditor.id === queryEditor.id && unsavedQueryEditor), + })) + .filter(queryEditor => hasUnsavedChanges(queryEditor, lastSavedTimestamp)); +} + +const EditorAutoSync: React.FC = () => { + const queryEditors = useSelector<SqlLabRootState, QueryEditor[]>( + state => state.sqlLab.queryEditors, + ); + const unsavedQueryEditor = useSelector<SqlLabRootState, UnsavedQueryEditor>( + state => state.sqlLab.unsavedQueryEditor, + ); + const editorTabLastUpdatedAt = useSelector<SqlLabRootState, number>( + state => state.sqlLab.editorTabLastUpdatedAt, + ); + const dispatch = useDispatch(); + const lastSavedTimestampRef = useRef<number>(editorTabLastUpdatedAt); + const [updateSqlEditor, { error }] = useUpdateSqlEditorTabMutation(); + + const debouncedUnsavedQueryEditor = useDebounceValue( + unsavedQueryEditor, + INTERVAL, + ); + + useEffect(() => { + const unsaved = filterUnsavedQueryEditorList( + queryEditors, + debouncedUnsavedQueryEditor, + lastSavedTimestampRef.current, + ); + + Promise.all( + unsaved + // TODO: Migrate 
migrateQueryEditorFromLocalStorage + // in TabbedSqlEditors logic by addSqlEditor mutation later + .filter(({ inLocalStorage }) => !inLocalStorage) + .map(queryEditor => updateSqlEditor({ queryEditor })), + ).then(resolvers => { + if (!resolvers.some(result => 'error' in result)) { + lastSavedTimestampRef.current = Date.now(); + dispatch(setEditorTabLastUpdate(lastSavedTimestampRef.current)); + } + }); + }, [debouncedUnsavedQueryEditor, dispatch, queryEditors, updateSqlEditor]); + + useEffect(() => { + if (error) { + logging.warn('An error occurred while saving your editor state.', error); + } + }, [dispatch, error]); + + return null; +}; + +export default EditorAutoSync; diff --git a/superset-frontend/src/SqlLab/components/QueryTable/index.tsx b/superset-frontend/src/SqlLab/components/QueryTable/index.tsx index 6ddae08e68520..5dc8a43c19310 100644 --- a/superset-frontend/src/SqlLab/components/QueryTable/index.tsx +++ b/superset-frontend/src/SqlLab/components/QueryTable/index.tsx @@ -25,7 +25,7 @@ import { t, useTheme, QueryResponse } from '@superset-ui/core'; import { useDispatch, useSelector } from 'react-redux'; import { - queryEditorSetAndSaveSql, + queryEditorSetSql, cloneQueryToNewTab, fetchQueryResults, clearQueryResults, @@ -109,7 +109,9 @@ const QueryTable = ({ const data = useMemo(() => { const restoreSql = (query: QueryResponse) => { - dispatch(queryEditorSetAndSaveSql({ id: query.sqlEditorId }, query.sql)); + dispatch( + queryEditorSetSql({ id: query.sqlEditorId }, query.sql, query.id), + ); }; const openQueryInNewTab = (query: QueryResponse) => { diff --git a/superset-frontend/src/SqlLab/components/ResultSet/index.tsx b/superset-frontend/src/SqlLab/components/ResultSet/index.tsx index 35eac78044b0f..58e55a1df7911 100644 --- a/superset-frontend/src/SqlLab/components/ResultSet/index.tsx +++ b/superset-frontend/src/SqlLab/components/ResultSet/index.tsx @@ -18,6 +18,7 @@ */ import React, { useCallback, useEffect, useState } from 'react'; import { 
useDispatch } from 'react-redux'; +import { useHistory } from 'react-router-dom'; import ButtonGroup from 'src/components/ButtonGroup'; import Alert from 'src/components/Alert'; import Button from 'src/components/Button'; @@ -161,6 +162,7 @@ const ResultSet = ({ const [showSaveDatasetModal, setShowSaveDatasetModal] = useState(false); const [alertIsOpen, setAlertIsOpen] = useState(false); + const history = useHistory(); const dispatch = useDispatch(); const reRunQueryIfSessionTimeoutErrorOnMount = useCallback(() => { @@ -215,9 +217,11 @@ const ResultSet = ({ setSearchText(event.target.value); }; - const createExploreResultsOnClick = async () => { + const createExploreResultsOnClick = async (clickEvent: React.MouseEvent) => { const { results } = query; + const openInNewWindow = clickEvent.metaKey; + if (results?.query_id) { const key = await postFormData(results.query_id, 'query', { ...EXPLORE_CHART_DEFAULT, @@ -229,7 +233,11 @@ const ResultSet = ({ const url = mountExploreUrl(null, { [URL_PARAMS.formDataKey.name]: key, }); - window.open(url, '_blank', 'noreferrer'); + if (openInNewWindow) { + window.open(url, '_blank', 'noreferrer'); + } else { + history.push(url); + } } else { addDangerToast(t('Unable to create chart without a query id.')); } diff --git a/superset-frontend/src/SqlLab/components/SqlEditor/index.tsx b/superset-frontend/src/SqlLab/components/SqlEditor/index.tsx index 609cb917b6f20..088143448731a 100644 --- a/superset-frontend/src/SqlLab/components/SqlEditor/index.tsx +++ b/superset-frontend/src/SqlLab/components/SqlEditor/index.tsx @@ -93,7 +93,7 @@ import { } from 'src/utils/localStorageHelpers'; import { EmptyStateBig } from 'src/components/EmptyState'; import getBootstrapData from 'src/utils/getBootstrapData'; -import { isEmpty } from 'lodash'; +import { isBoolean, isEmpty } from 'lodash'; import TemplateParamsEditor from '../TemplateParamsEditor'; import SouthPane from '../SouthPane'; import SaveQuery, { QueryPayload } from '../SaveQuery'; @@ 
-255,7 +255,9 @@ const SqlEditor: React.FC<Props> = ({ if (unsavedQueryEditor?.id === queryEditor.id) { dbId = unsavedQueryEditor.dbId || dbId; latestQueryId = unsavedQueryEditor.latestQueryId || latestQueryId; - hideLeftBar = unsavedQueryEditor.hideLeftBar || hideLeftBar; + hideLeftBar = isBoolean(unsavedQueryEditor.hideLeftBar) + ? unsavedQueryEditor.hideLeftBar + : hideLeftBar; } return { database: databases[dbId || ''], @@ -557,10 +559,9 @@ const SqlEditor: React.FC<Props> = ({ [setQueryEditorAndSaveSql], ); - const onSqlChanged = (sql: string) => { + const onSqlChanged = useEffectEvent((sql: string) => { dispatch(queryEditorSetSql(queryEditor, sql)); - setQueryEditorAndSaveSqlWithDebounce(sql); - }; + }); // Return the heights for the ace editor and the south pane as an object // given the height of the sql editor, north pane percent and south pane percent. @@ -785,7 +786,7 @@ const SqlEditor: React.FC<Props> = ({ )} <AceEditorWrapper autocomplete={autocompleteEnabled} - onBlur={setQueryEditorAndSaveSql} + onBlur={onSqlChanged} onChange={onSqlChanged} queryEditorId={queryEditor.id} height={`${aceEditorHeight}px`} diff --git a/superset-frontend/src/SqlLab/fixtures.ts b/superset-frontend/src/SqlLab/fixtures.ts index 4f6ad9ceb5dac..7a08c51876ff2 100644 --- a/superset-frontend/src/SqlLab/fixtures.ts +++ b/superset-frontend/src/SqlLab/fixtures.ts @@ -22,9 +22,11 @@ import { ColumnKeyTypeType } from 'src/SqlLab/components/ColumnElement'; import { DatasourceType, denormalizeTimestamp, + GenericDataType, QueryResponse, QueryState, } from '@superset-ui/core'; +import { LatestQueryEditorVersion } from 'src/SqlLab/types'; import { ISaveableDatasource } from 'src/SqlLab/components/SaveDatasetModal'; export const mockedActions = sinon.stub({ ...actions }); @@ -181,6 +183,7 @@ export const table = { }; export const defaultQueryEditor = { + version: LatestQueryEditorVersion, id: 'dfsadfs', autorun: false, dbId: undefined, @@ -579,11 +582,13 @@ const baseQuery: QueryResponse 
= { is_dttm: true, column_name: 'ds', type: 'STRING', + type_generic: GenericDataType.STRING, }, { is_dttm: false, column_name: 'gender', type: 'STRING', + type_generic: GenericDataType.STRING, }, ], selected_columns: [ @@ -591,11 +596,13 @@ const baseQuery: QueryResponse = { is_dttm: true, column_name: 'ds', type: 'STRING', + type_generic: GenericDataType.TEMPORAL, }, { is_dttm: false, column_name: 'gender', type: 'STRING', + type_generic: GenericDataType.STRING, }, ], expanded_columns: [ @@ -603,6 +610,7 @@ const baseQuery: QueryResponse = { is_dttm: true, column_name: 'ds', type: 'STRING', + type_generic: GenericDataType.STRING, }, ], data: [ diff --git a/superset-frontend/src/SqlLab/middlewares/persistSqlLabStateEnhancer.js b/superset-frontend/src/SqlLab/middlewares/persistSqlLabStateEnhancer.js index 4e32095e2853e..d1bec5e0c16a9 100644 --- a/superset-frontend/src/SqlLab/middlewares/persistSqlLabStateEnhancer.js +++ b/superset-frontend/src/SqlLab/middlewares/persistSqlLabStateEnhancer.js @@ -18,6 +18,9 @@ */ // TODO: requires redux-localstorage > 1.0 for typescript support import persistState from 'redux-localstorage'; +import { pickBy } from 'lodash'; +import { isFeatureEnabled, FeatureFlag } from '@superset-ui/core'; +import { filterUnsavedQueryEditorList } from 'src/SqlLab/components/EditorAutoSync'; import { emptyTablePersistData, emptyQueryResults, @@ -38,6 +41,39 @@ const sqlLabPersistStateConfig = { slicer: paths => state => { const subset = {}; paths.forEach(path => { + if (isFeatureEnabled(FeatureFlag.SQLLAB_BACKEND_PERSISTENCE)) { + const { + queryEditors, + editorTabLastUpdatedAt, + unsavedQueryEditor, + tables, + queries, + tabHistory, + } = state.sqlLab; + const unsavedQueryEditors = filterUnsavedQueryEditorList( + queryEditors, + unsavedQueryEditor, + editorTabLastUpdatedAt, + ); + if (unsavedQueryEditors.length > 0) { + const hasFinishedMigrationFromLocalStorage = + unsavedQueryEditors.every( + ({ inLocalStorage }) => !inLocalStorage, + ); + 
subset.sqlLab = { + queryEditors: unsavedQueryEditors, + ...(!hasFinishedMigrationFromLocalStorage && { + tabHistory, + tables: tables.filter(table => table.inLocalStorage), + queries: pickBy( + queries, + query => query.inLocalStorage && !query.isDataPreview, + ), + }), + }; + } + return; + } // this line is used to remove old data from browser localStorage. // we used to persist all redux state into localStorage, but // it caused configurations passed from server-side got override. diff --git a/superset-frontend/src/SqlLab/reducers/getInitialState.test.ts b/superset-frontend/src/SqlLab/reducers/getInitialState.test.ts index aca11e2cca10f..1dd3220fcc467 100644 --- a/superset-frontend/src/SqlLab/reducers/getInitialState.test.ts +++ b/superset-frontend/src/SqlLab/reducers/getInitialState.test.ts @@ -54,6 +54,10 @@ const apiDataWithTabState = { }, }; describe('getInitialState', () => { + afterEach(() => { + localStorage.clear(); + }); + it('should output the user that is passed in', () => { expect(getInitialState(apiData).user?.userId).toEqual(1); }); @@ -134,10 +138,6 @@ describe('getInitialState', () => { }); describe('dedupe tables schema', () => { - afterEach(() => { - localStorage.clear(); - }); - it('should dedupe the table schema', () => { localStorage.setItem( 'redux', @@ -245,4 +245,109 @@ describe('getInitialState', () => { ); }); }); + + describe('restore unsaved changes for PERSISTENCE mode', () => { + const lastUpdatedTime = Date.now(); + const expectedValue = 'updated editor value'; + beforeEach(() => { + localStorage.setItem( + 'redux', + JSON.stringify({ + sqlLab: { + queryEditors: [ + { + // restore cached value since updates are after server update time + id: '1', + name: expectedValue, + updatedAt: lastUpdatedTime + 100, + }, + { + // no update required given that last updated time comes before server update time + id: '2', + name: expectedValue, + updatedAt: lastUpdatedTime - 100, + }, + { + // no update required given that there's no updatedAt + 
id: '3', + name: expectedValue, + }, + ], + }, + }), + ); + }); + + it('restore unsaved changes for PERSISTENCE mode', () => { + const apiDataWithLocalStorage = { + ...apiData, + active_tab: { + ...apiDataWithTabState.active_tab, + id: 1, + label: 'persisted tab', + table_schemas: [], + extra_json: { + updatedAt: lastUpdatedTime, + }, + }, + tab_state_ids: [{ id: 1, label: '' }], + }; + expect( + getInitialState(apiDataWithLocalStorage).sqlLab.queryEditors[0], + ).toEqual( + expect.objectContaining({ + id: '1', + name: expectedValue, + }), + ); + }); + + it('skip unsaved changes for expired data', () => { + const apiDataWithLocalStorage = { + ...apiData, + active_tab: { + ...apiDataWithTabState.active_tab, + id: 2, + label: 'persisted tab', + table_schemas: [], + extra_json: { + updatedAt: lastUpdatedTime, + }, + }, + tab_state_ids: [{ id: 2, label: '' }], + }; + expect( + getInitialState(apiDataWithLocalStorage).sqlLab.queryEditors[1], + ).toEqual( + expect.objectContaining({ + id: '2', + name: apiDataWithLocalStorage.active_tab.label, + }), + ); + }); + + it('skip unsaved changes for legacy cache data', () => { + const apiDataWithLocalStorage = { + ...apiData, + active_tab: { + ...apiDataWithTabState.active_tab, + id: 3, + label: 'persisted tab', + table_schemas: [], + extra_json: { + updatedAt: lastUpdatedTime, + }, + }, + tab_state_ids: [{ id: 3, label: '' }], + }; + expect( + getInitialState(apiDataWithLocalStorage).sqlLab.queryEditors[2], + ).toEqual( + expect.objectContaining({ + id: '3', + name: apiDataWithLocalStorage.active_tab.label, + }), + ); + }); + }); }); diff --git a/superset-frontend/src/SqlLab/reducers/getInitialState.ts b/superset-frontend/src/SqlLab/reducers/getInitialState.ts index e2aa1d4688738..8d72a313b2716 100644 --- a/superset-frontend/src/SqlLab/reducers/getInitialState.ts +++ b/superset-frontend/src/SqlLab/reducers/getInitialState.ts @@ -20,11 +20,13 @@ import { t } from '@superset-ui/core'; import getToastsFromPyFlashMessages from 
'src/components/MessageToasts/getToastsFromPyFlashMessages'; import type { BootstrapData } from 'src/types/bootstrapTypes'; import type { InitialState } from 'src/hooks/apiResources/sqlLab'; -import type { +import { QueryEditor, UnsavedQueryEditor, SqlLabRootState, Table, + LatestQueryEditorVersion, + QueryEditorVersion, } from 'src/SqlLab/types'; export function dedupeTabHistory(tabHistory: string[]) { @@ -53,6 +55,7 @@ export default function getInitialState({ */ let queryEditors: Record<string, QueryEditor> = {}; const defaultQueryEditor = { + version: LatestQueryEditorVersion, loaded: true, name: t('Untitled query'), sql: 'SELECT *\nFROM\nWHERE', @@ -73,6 +76,7 @@ export default function getInitialState({ let queryEditor: QueryEditor; if (activeTab && activeTab.id === id) { queryEditor = { + version: activeTab.extra_json?.version ?? QueryEditorVersion.v1, id: id.toString(), loaded: true, name: activeTab.label, @@ -88,6 +92,7 @@ export default function getInitialState({ schema: activeTab.schema, queryLimit: activeTab.query_limit, hideLeftBar: activeTab.hide_left_bar, + updatedAt: activeTab.extra_json?.updatedAt, }; } else { // dummy state, actual state will be loaded on tab switch @@ -103,11 +108,12 @@ export default function getInitialState({ [queryEditor.id]: queryEditor, }; }); - const tabHistory = activeTab ? 
[activeTab.id.toString()] : []; let tables = {} as Record<string, Table>; - const editorTabLastUpdatedAt = Date.now(); + let editorTabLastUpdatedAt = Date.now(); if (activeTab) { + editorTabLastUpdatedAt = + activeTab.extra_json?.updatedAt || editorTabLastUpdatedAt; activeTab.table_schemas .filter(tableSchema => tableSchema.description !== null) .forEach(tableSchema => { @@ -153,37 +159,57 @@ export default function getInitialState({ // add query editors and tables to state with a special flag so they can // be migrated if the `SQLLAB_BACKEND_PERSISTENCE` feature flag is on sqlLab.queryEditors.forEach(qe => { + const hasConflictFromBackend = Boolean(queryEditors[qe.id]); + const unsavedUpdatedAt = queryEditors[qe.id]?.updatedAt; + const hasUnsavedUpdateSinceLastSave = + qe.updatedAt && + (!unsavedUpdatedAt || qe.updatedAt > unsavedUpdatedAt); + const cachedQueryEditor: UnsavedQueryEditor = + !hasConflictFromBackend || hasUnsavedUpdateSinceLastSave ? qe : {}; queryEditors = { ...queryEditors, [qe.id]: { ...queryEditors[qe.id], - ...qe, - name: qe.title || qe.name, - ...(unsavedQueryEditor.id === qe.id && unsavedQueryEditor), - inLocalStorage: true, + ...cachedQueryEditor, + name: + cachedQueryEditor.title || + cachedQueryEditor.name || + queryEditors[qe.id]?.name, + ...(cachedQueryEditor.id && + unsavedQueryEditor.id === qe.id && + unsavedQueryEditor), + inLocalStorage: !hasConflictFromBackend, loaded: true, }, }; }); const expandedTables = new Set(); - tables = sqlLab.tables.reduce((merged, table) => { - const expanded = !expandedTables.has(table.queryEditorId); - if (expanded) { - expandedTables.add(table.queryEditorId); - } - return { - ...merged, - [table.id]: { - ...tables[table.id], - ...table, - expanded, - }, - }; - }, tables); - Object.values(sqlLab.queries).forEach(query => { - queries[query.id] = { ...query, inLocalStorage: true }; - }); - tabHistory.push(...sqlLab.tabHistory); + + if (sqlLab.tables) { + tables = sqlLab.tables.reduce((merged, table) => { 
+ const expanded = !expandedTables.has(table.queryEditorId); + if (expanded) { + expandedTables.add(table.queryEditorId); + } + return { + ...merged, + [table.id]: { + ...tables[table.id], + ...table, + expanded, + inLocalStorage: true, + }, + }; + }, tables); + } + if (sqlLab.queries) { + Object.values(sqlLab.queries).forEach(query => { + queries[query.id] = { ...query, inLocalStorage: true }; + }); + } + if (sqlLab.tabHistory) { + tabHistory.push(...sqlLab.tabHistory); + } } } } catch (error) { diff --git a/superset-frontend/src/SqlLab/reducers/sqlLab.js b/superset-frontend/src/SqlLab/reducers/sqlLab.js index 278109564f96a..59bd0558a1f1c 100644 --- a/superset-frontend/src/SqlLab/reducers/sqlLab.js +++ b/superset-frontend/src/SqlLab/reducers/sqlLab.js @@ -29,7 +29,7 @@ import { extendArr, } from '../../reduxUtils'; -function alterUnsavedQueryEditorState(state, updatedState, id) { +function alterUnsavedQueryEditorState(state, updatedState, id, silent = false) { if (state.tabHistory[state.tabHistory.length - 1] !== id) { const { queryEditors } = alterInArr( state, @@ -45,6 +45,7 @@ function alterUnsavedQueryEditorState(state, updatedState, id) { unsavedQueryEditor: { ...(state.unsavedQueryEditor.id === id && state.unsavedQueryEditor), ...(id ? 
{ id, ...updatedState } : state.unsavedQueryEditor), + ...(!silent && { updatedAt: new Date().getTime() }), }, }; } @@ -64,7 +65,10 @@ export default function sqlLabReducer(state = {}, action) { ...mergeUnsavedState, tabHistory: [...state.tabHistory, action.queryEditor.id], }; - return addToArr(newState, 'queryEditors', action.queryEditor); + return addToArr(newState, 'queryEditors', { + ...action.queryEditor, + updatedAt: new Date().getTime(), + }); }, [actions.QUERY_EDITOR_SAVED]() { const { query, result, clientId } = action; @@ -308,6 +312,7 @@ export default function sqlLabReducer(state = {}, action) { latestQueryId: action.query.id, }, action.query.sqlEditorId, + action.query.isDataPreview, ), }; }, @@ -378,14 +383,12 @@ export default function sqlLabReducer(state = {}, action) { qeIds.indexOf(action.queryEditor?.id) > -1 && state.tabHistory[state.tabHistory.length - 1] !== action.queryEditor.id ) { - const mergeUnsavedState = alterInArr( - state, - 'queryEditors', - state.unsavedQueryEditor, - { + const mergeUnsavedState = { + ...alterInArr(state, 'queryEditors', state.unsavedQueryEditor, { ...state.unsavedQueryEditor, - }, - ); + }), + unsavedQueryEditor: {}, + }; return { ...(action.queryEditor.id === state.unsavedQueryEditor.id ? 
alterInArr( @@ -522,12 +525,20 @@ export default function sqlLabReducer(state = {}, action) { }; }, [actions.QUERY_EDITOR_SET_SQL]() { + const { unsavedQueryEditor } = state; + if ( + unsavedQueryEditor?.id === action.queryEditor.id && + unsavedQueryEditor.sql === action.sql + ) { + return state; + } return { ...state, ...alterUnsavedQueryEditorState( state, { sql: action.sql, + ...(action.queryId && { latestQueryId: action.queryId }), }, action.queryEditor.id, ), @@ -566,6 +577,7 @@ export default function sqlLabReducer(state = {}, action) { selectedText: action.sql, }, action.queryEditor.id, + true, ), }; }, @@ -708,6 +720,9 @@ export default function sqlLabReducer(state = {}, action) { [actions.CREATE_DATASOURCE_FAILED]() { return { ...state, isDatasourceLoading: false, errorMessage: action.err }; }, + [actions.SET_EDITOR_TAB_LAST_UPDATE]() { + return { ...state, editorTabLastUpdatedAt: action.timestamp }; + }, }; if (action.type in actionHandlers) { return actionHandlers[action.type](); diff --git a/superset-frontend/src/SqlLab/types.ts b/superset-frontend/src/SqlLab/types.ts index 5ecd69293ca8b..6eb42718f0c70 100644 --- a/superset-frontend/src/SqlLab/types.ts +++ b/superset-frontend/src/SqlLab/types.ts @@ -29,7 +29,14 @@ export type QueryDictionary = { [id: string]: QueryResponse; }; +export enum QueryEditorVersion { + v1 = 1, +} + +export const LatestQueryEditorVersion = QueryEditorVersion.v1; + export interface QueryEditor { + version: QueryEditorVersion; id: string; dbId?: number; name: string; @@ -48,6 +55,7 @@ export interface QueryEditor { inLocalStorage?: boolean; northPercent?: number; southPercent?: number; + updatedAt?: number; } export type toastState = { @@ -86,7 +94,7 @@ export type SqlLabRootState = { errorMessage: string | null; unsavedQueryEditor: UnsavedQueryEditor; queryCostEstimates?: Record<string, QueryCostEstimate>; - editorTabLastUpdatedAt?: number; + editorTabLastUpdatedAt: number; }; localStorageUsageInKilobytes: number; messageToasts: 
toastState[]; diff --git a/superset-frontend/src/SqlLab/utils/emptyQueryResults.test.js b/superset-frontend/src/SqlLab/utils/emptyQueryResults.test.js index 9984e1efcaf06..f08fccbef7a53 100644 --- a/superset-frontend/src/SqlLab/utils/emptyQueryResults.test.js +++ b/superset-frontend/src/SqlLab/utils/emptyQueryResults.test.js @@ -83,10 +83,11 @@ describe('reduxStateToLocalStorageHelper', () => { }); it('should only return selected keys for query editor', () => { - const queryEditors = [defaultQueryEditor]; - expect(Object.keys(queryEditors[0])).toContain('schema'); + const queryEditors = [{ ...defaultQueryEditor, dummy: 'value' }]; + expect(Object.keys(queryEditors[0])).toContain('dummy'); const clearedQueryEditors = clearQueryEditors(queryEditors); - expect(Object.keys(clearedQueryEditors)[0]).not.toContain('schema'); + expect(Object.keys(clearedQueryEditors[0])).toContain('version'); + expect(Object.keys(clearedQueryEditors[0])).not.toContain('dummy'); }); }); diff --git a/superset-frontend/src/SqlLab/utils/reduxStateToLocalStorageHelper.js b/superset-frontend/src/SqlLab/utils/reduxStateToLocalStorageHelper.js index 281f08bcb366f..f82711362ddbf 100644 --- a/superset-frontend/src/SqlLab/utils/reduxStateToLocalStorageHelper.js +++ b/superset-frontend/src/SqlLab/utils/reduxStateToLocalStorageHelper.js @@ -26,6 +26,7 @@ import { } from '../constants'; const PERSISTENT_QUERY_EDITOR_KEYS = new Set([ + 'version', 'remoteId', 'autorun', 'dbId', diff --git a/superset-frontend/src/assets/images/doris.png b/superset-frontend/src/assets/images/doris.png new file mode 100644 index 0000000000000..4d88f2a36cf72 Binary files /dev/null and b/superset-frontend/src/assets/images/doris.png differ diff --git a/superset-frontend/src/components/AuditInfo/ModifiedInfo.test.tsx b/superset-frontend/src/components/AuditInfo/ModifiedInfo.test.tsx new file mode 100644 index 0000000000000..af9d6913d80d9 --- /dev/null +++ b/superset-frontend/src/components/AuditInfo/ModifiedInfo.test.tsx @@ 
-0,0 +1,42 @@ +import React from 'react'; +import { render, screen, waitFor } from 'spec/helpers/testing-library'; +import '@testing-library/jest-dom'; +import userEvent from '@testing-library/user-event'; + +import { ModifiedInfo } from '.'; + +const TEST_DATE = '2023-11-20'; +const USER = { + id: 1, + first_name: 'Foo', + last_name: 'Bar', +}; + +test('should render a tooltip when user is provided', async () => { + render(<ModifiedInfo user={USER} date={TEST_DATE} />); + + const dateElement = screen.getByTestId('audit-info-date'); + expect(dateElement).toBeInTheDocument(); + expect(screen.getByText(TEST_DATE)).toBeInTheDocument(); + expect(screen.queryByText('Modified by: Foo Bar')).not.toBeInTheDocument(); + userEvent.hover(dateElement); + const tooltip = await screen.findByRole('tooltip'); + expect(tooltip).toBeInTheDocument(); + expect(screen.getByText('Modified by: Foo Bar')).toBeInTheDocument(); +}); + +test('should render only the date if username is not provided', async () => { + render(<ModifiedInfo date={TEST_DATE} />); + + const dateElement = screen.getByTestId('audit-info-date'); + expect(dateElement).toBeInTheDocument(); + expect(screen.getByText(TEST_DATE)).toBeInTheDocument(); + userEvent.hover(dateElement); + await waitFor( + () => { + const tooltip = screen.queryByRole('tooltip'); + expect(tooltip).not.toBeInTheDocument(); + }, + { timeout: 1000 }, + ); +}); diff --git a/superset-frontend/src/components/AuditInfo/index.tsx b/superset-frontend/src/components/AuditInfo/index.tsx new file mode 100644 index 0000000000000..24223a1554a31 --- /dev/null +++ b/superset-frontend/src/components/AuditInfo/index.tsx @@ -0,0 +1,30 @@ +import React from 'react'; + +import Owner from 'src/types/Owner'; +import { Tooltip } from 'src/components/Tooltip'; +import getOwnerName from 'src/utils/getOwnerName'; +import { t } from '@superset-ui/core'; + +export type ModifiedInfoProps = { + user?: Owner; + date: string; +}; + +export const ModifiedInfo = ({ user, date }: 
ModifiedInfoProps) => { + const dateSpan = ( + <span className="no-wrap" data-test="audit-info-date"> + {date} + </span> + ); + + if (user) { + const userName = getOwnerName(user); + const title = t('Modified by: %s', userName); + return ( + <Tooltip title={title} placement="bottom"> + {dateSpan} + </Tooltip> + ); + } + return dateSpan; +}; diff --git a/superset-frontend/src/components/Chart/Chart.jsx b/superset-frontend/src/components/Chart/Chart.jsx index af90ae6b0a089..da9a81516f5e8 100644 --- a/superset-frontend/src/components/Chart/Chart.jsx +++ b/superset-frontend/src/components/Chart/Chart.jsx @@ -169,7 +169,7 @@ class Chart extends React.PureComponent { // Create chart with POST request this.props.actions.postChartFormData( this.props.formData, - this.props.force || getUrlParam(URL_PARAMS.force), // allow override via url params force=true + Boolean(this.props.force || getUrlParam(URL_PARAMS.force)), // allow override via url params force=true this.props.timeout, this.props.chartId, this.props.dashboardId, diff --git a/superset-frontend/src/components/Chart/DrillDetail/DrillDetailMenuItems.test.tsx b/superset-frontend/src/components/Chart/DrillDetail/DrillDetailMenuItems.test.tsx index 8a0f8dbfc5cd1..b57f061a3e057 100644 --- a/superset-frontend/src/components/Chart/DrillDetail/DrillDetailMenuItems.test.tsx +++ b/superset-frontend/src/components/Chart/DrillDetail/DrillDetailMenuItems.test.tsx @@ -19,6 +19,7 @@ import React from 'react'; import userEvent from '@testing-library/user-event'; import { render, screen, within } from 'spec/helpers/testing-library'; +import setupPlugins from 'src/setup/setupPlugins'; import { getMockStoreWithNativeFilters } from 'spec/fixtures/mockStore'; import chartQueries, { sliceId } from 'spec/fixtures/mockChartQueries'; import { BinaryQueryObjectFilterClause } from '@superset-ui/core'; @@ -241,6 +242,10 @@ const expectDrillToDetailByAll = async ( await expectDrillToDetailModal(menuItemName, filters); }; +beforeAll(() => { + 
setupPlugins(); +}); + test('dropdown menu for unsupported chart', async () => { renderMenu({ formData: unsupportedChartFormData }); await expectDrillToDetailEnabled(); diff --git a/superset-frontend/src/components/Chart/chartAction.js b/superset-frontend/src/components/Chart/chartAction.js index 9e5dc0eddde96..8cd3785ae5156 100644 --- a/superset-frontend/src/components/Chart/chartAction.js +++ b/superset-frontend/src/components/Chart/chartAction.js @@ -183,7 +183,7 @@ const v1ChartDataRequest = async ( const qs = {}; if (sliceId !== undefined) qs.form_data = `{"slice_id":${sliceId}}`; if (dashboardId !== undefined) qs.dashboard_id = dashboardId; - if (force !== false) qs.force = force; + if (force) qs.force = force; const allowDomainSharding = // eslint-disable-next-line camelcase @@ -269,9 +269,12 @@ export function runAnnotationQuery({ return Promise.resolve(); } - const granularity = fd.time_grain_sqla || fd.granularity; - fd.time_grain_sqla = granularity; - fd.granularity = granularity; + // In the original formData the `granularity` attribute represents the time grain (e.g. + // `P1D`), but in the request payload it corresponds to the name of the column where + // the time grain should be applied (e.g. `Date`), so we need to move things around. 
+ fd.time_grain_sqla = fd.time_grain_sqla || fd.granularity; + fd.granularity = fd.granularity_sqla; + const overridesKeys = Object.keys(annotation.overrides); if (overridesKeys.includes('since') || overridesKeys.includes('until')) { annotation.overrides = { diff --git a/superset-frontend/src/components/Chart/chartActions.test.js b/superset-frontend/src/components/Chart/chartActions.test.js index 65b008de62f52..b3a6fed9f5c15 100644 --- a/superset-frontend/src/components/Chart/chartActions.test.js +++ b/superset-frontend/src/components/Chart/chartActions.test.js @@ -21,6 +21,7 @@ import fetchMock from 'fetch-mock'; import sinon from 'sinon'; import * as chartlib from '@superset-ui/core'; +import { SupersetClient } from '@superset-ui/core'; import { LOG_EVENT } from 'src/logger/actions'; import * as exploreUtils from 'src/explore/exploreUtils'; import * as actions from 'src/components/Chart/chartAction'; @@ -51,7 +52,7 @@ describe('chart actions', () => { .callsFake(() => MOCK_URL); getChartDataUriStub = sinon .stub(exploreUtils, 'getChartDataUri') - .callsFake(() => URI(MOCK_URL)); + .callsFake(({ qs }) => URI(MOCK_URL).query(qs)); fakeMetadata = { useLegacyApi: true }; metadataRegistryStub = sinon .stub(chartlib, 'getChartMetadataRegistry') @@ -81,7 +82,7 @@ describe('chart actions', () => { }); it('should query with the built query', async () => { - const actionThunk = actions.postChartFormData({}); + const actionThunk = actions.postChartFormData({}, null); await actionThunk(dispatch); expect(fetchMock.calls(MOCK_URL)).toHaveLength(1); @@ -233,4 +234,70 @@ describe('chart actions', () => { expect(json.result[0].value.toString()).toEqual(expectedBigNumber); }); }); + + describe('runAnnotationQuery', () => { + const mockDispatch = jest.fn(); + const mockGetState = () => ({ + charts: { + chartKey: { + latestQueryFormData: { + time_grain_sqla: 'P1D', + granularity_sqla: 'Date', + }, + }, + }, + }); + + beforeEach(() => { + jest.clearAllMocks(); + }); + + it('should 
dispatch annotationQueryStarted and annotationQuerySuccess on successful query', async () => { + const annotation = { + name: 'Holidays', + annotationType: 'EVENT', + sourceType: 'NATIVE', + color: null, + opacity: '', + style: 'solid', + width: 1, + showMarkers: false, + hideLine: false, + value: 1, + overrides: { + time_range: null, + }, + show: true, + showLabel: false, + titleColumn: '', + descriptionColumns: [], + timeColumn: '', + intervalEndColumn: '', + }; + const key = undefined; + + const postSpy = jest.spyOn(SupersetClient, 'post'); + postSpy.mockImplementation(() => + Promise.resolve({ json: { result: [] } }), + ); + const buildV1ChartDataPayloadSpy = jest.spyOn( + exploreUtils, + 'buildV1ChartDataPayload', + ); + + const queryFunc = actions.runAnnotationQuery({ annotation, key }); + await queryFunc(mockDispatch, mockGetState); + + expect(buildV1ChartDataPayloadSpy).toHaveBeenCalledWith({ + formData: { + granularity: 'Date', + granularity_sqla: 'Date', + time_grain_sqla: 'P1D', + }, + force: false, + resultFormat: 'json', + resultType: 'full', + }); + }); + }); }); diff --git a/superset-frontend/src/components/DatabaseSelector/DatabaseSelector.test.tsx b/superset-frontend/src/components/DatabaseSelector/DatabaseSelector.test.tsx index 7635361d89339..874d22ea6bb2b 100644 --- a/superset-frontend/src/components/DatabaseSelector/DatabaseSelector.test.tsx +++ b/superset-frontend/src/components/DatabaseSelector/DatabaseSelector.test.tsx @@ -290,7 +290,13 @@ test('Sends the correct db when changing the database', async () => { test('Sends the correct schema when changing the schema', async () => { const props = createProps(); - render(<DatabaseSelector {...props} />, { useRedux: true, store }); + const { rerender } = render(<DatabaseSelector {...props} db={null} />, { + useRedux: true, + store, + }); + await waitFor(() => expect(fetchMock.calls(databaseApiRoute).length).toBe(1)); + rerender(<DatabaseSelector {...props} />); + 
expect(props.onSchemaChange).toBeCalledTimes(0); const select = screen.getByRole('combobox', { name: 'Select schema or type to search schemas', }); @@ -301,4 +307,5 @@ test('Sends the correct schema when changing the schema', async () => { await waitFor(() => expect(props.onSchemaChange).toHaveBeenCalledWith('information_schema'), ); + expect(props.onSchemaChange).toBeCalledTimes(1); }); diff --git a/superset-frontend/src/components/DatabaseSelector/index.tsx b/superset-frontend/src/components/DatabaseSelector/index.tsx index d17489a9c2273..7b4afd9af05aa 100644 --- a/superset-frontend/src/components/DatabaseSelector/index.tsx +++ b/superset-frontend/src/components/DatabaseSelector/index.tsx @@ -16,7 +16,7 @@ * specific language governing permissions and limitations * under the License. */ -import React, { ReactNode, useState, useMemo, useEffect } from 'react'; +import React, { ReactNode, useState, useMemo, useEffect, useRef } from 'react'; import { styled, SupersetClient, t } from '@superset-ui/core'; import rison from 'rison'; import { AsyncSelect, Select } from 'src/components'; @@ -133,6 +133,8 @@ export default function DatabaseSelector({ const [currentSchema, setCurrentSchema] = useState<SchemaOption | undefined>( schema ? 
{ label: schema, value: schema, title: schema } : undefined, ); + const schemaRef = useRef(schema); + schemaRef.current = schema; const { addSuccessToast } = useToasts(); const loadDatabases = useMemo( @@ -215,7 +217,7 @@ export default function DatabaseSelector({ function changeSchema(schema: SchemaOption | undefined) { setCurrentSchema(schema); - if (onSchemaChange) { + if (onSchemaChange && schema?.value !== schemaRef.current) { onSchemaChange(schema?.value); } } @@ -229,7 +231,9 @@ export default function DatabaseSelector({ onSuccess: (schemas, isFetched) => { if (schemas.length === 1) { changeSchema(schemas[0]); - } else if (!schemas.find(schemaOption => schema === schemaOption.value)) { + } else if ( + !schemas.find(schemaOption => schemaRef.current === schemaOption.value) + ) { changeSchema(undefined); } diff --git a/superset-frontend/src/components/Datasource/DatasourceEditor.jsx b/superset-frontend/src/components/Datasource/DatasourceEditor.jsx index 86b5c2277723f..751001297a92a 100644 --- a/superset-frontend/src/components/Datasource/DatasourceEditor.jsx +++ b/superset-frontend/src/components/Datasource/DatasourceEditor.jsx @@ -1114,7 +1114,7 @@ class DatasourceEditor extends React.PureComponent { <div css={{ width: 'calc(100% - 34px)', marginTop: -16 }}> <Field fieldKey="table_name" - label={t('Dataset name')} + label={t('Name')} control={ <TextControl controlId="table_name" diff --git a/superset-frontend/src/components/Dropdown/index.tsx b/superset-frontend/src/components/Dropdown/index.tsx index c40f479579d2e..1e2e03ceb24e8 100644 --- a/superset-frontend/src/components/Dropdown/index.tsx +++ b/superset-frontend/src/components/Dropdown/index.tsx @@ -104,12 +104,6 @@ interface ExtendedDropDownProps extends DropDownProps { ref?: RefObject<HTMLDivElement>; } -// @z-index-below-dashboard-header (100) - 1 = 99 export const NoAnimationDropdown = ( props: ExtendedDropDownProps & { children?: React.ReactNode }, -) => ( - <AntdDropdown - overlayStyle={{ zIndex: 
99, animationDuration: '0s' }} - {...props} - /> -); +) => <AntdDropdown overlayStyle={props.overlayStyle} {...props} />; diff --git a/superset-frontend/src/components/DynamicEditableTitle/index.tsx b/superset-frontend/src/components/DynamicEditableTitle/index.tsx index 86205bebc267e..670962de5fb5c 100644 --- a/superset-frontend/src/components/DynamicEditableTitle/index.tsx +++ b/superset-frontend/src/components/DynamicEditableTitle/index.tsx @@ -113,10 +113,7 @@ export const DynamicEditableTitle = ({ // then we can measure the width of that span to resize the input element useLayoutEffect(() => { if (sizerRef?.current) { - sizerRef.current.innerHTML = (currentTitle || placeholder).replace( - /\s/g, - ' ', - ); + sizerRef.current.textContent = currentTitle || placeholder; } }, [currentTitle, placeholder, sizerRef]); diff --git a/superset-frontend/src/components/Select/AsyncSelect.test.tsx b/superset-frontend/src/components/Select/AsyncSelect.test.tsx index c1442a6b70a1c..0bb24b474a0cc 100644 --- a/superset-frontend/src/components/Select/AsyncSelect.test.tsx +++ b/superset-frontend/src/components/Select/AsyncSelect.test.tsx @@ -868,6 +868,20 @@ test('fires onChange when clearing the selection in multiple mode', async () => expect(onChange).toHaveBeenCalledTimes(1); }); +test('fires onChange when pasting a selection', async () => { + const onChange = jest.fn(); + render(<AsyncSelect {...defaultProps} onChange={onChange} />); + await open(); + const input = getElementByClassName('.ant-select-selection-search-input'); + const paste = createEvent.paste(input, { + clipboardData: { + getData: () => OPTIONS[0].label, + }, + }); + fireEvent(input, paste); + expect(onChange).toHaveBeenCalledTimes(1); +}); + test('does not duplicate options when using numeric values', async () => { render( <AsyncSelect diff --git a/superset-frontend/src/components/Select/AsyncSelect.tsx b/superset-frontend/src/components/Select/AsyncSelect.tsx index 20de7bb5911c0..d102af74833ee 100644 --- 
a/superset-frontend/src/components/Select/AsyncSelect.tsx +++ b/superset-frontend/src/components/Select/AsyncSelect.tsx @@ -554,6 +554,7 @@ const AsyncSelect = forwardRef( ...values, ]); } + fireOnChange(); }; const shouldRenderChildrenOptions = useMemo( diff --git a/superset-frontend/src/components/Select/Select.test.tsx b/superset-frontend/src/components/Select/Select.test.tsx index a6b83075825be..2910353295187 100644 --- a/superset-frontend/src/components/Select/Select.test.tsx +++ b/superset-frontend/src/components/Select/Select.test.tsx @@ -985,6 +985,20 @@ test('fires onChange when clearing the selection in multiple mode', async () => expect(onChange).toHaveBeenCalledTimes(1); }); +test('fires onChange when pasting a selection', async () => { + const onChange = jest.fn(); + render(<Select {...defaultProps} onChange={onChange} />); + await open(); + const input = getElementByClassName('.ant-select-selection-search-input'); + const paste = createEvent.paste(input, { + clipboardData: { + getData: () => OPTIONS[0].label, + }, + }); + fireEvent(input, paste); + expect(onChange).toHaveBeenCalledTimes(1); +}); + test('does not duplicate options when using numeric values', async () => { render( <Select diff --git a/superset-frontend/src/components/Select/Select.tsx b/superset-frontend/src/components/Select/Select.tsx index 6ccc1e1715dd5..1e3bc73758cb1 100644 --- a/superset-frontend/src/components/Select/Select.tsx +++ b/superset-frontend/src/components/Select/Select.tsx @@ -571,6 +571,7 @@ const Select = forwardRef( ]); } } + fireOnChange(); }; return ( diff --git a/superset-frontend/src/dashboard/actions/dashboardState.js b/superset-frontend/src/dashboard/actions/dashboardState.js index dcf1020e6d0df..b461275d8c69d 100644 --- a/superset-frontend/src/dashboard/actions/dashboardState.js +++ b/superset-frontend/src/dashboard/actions/dashboardState.js @@ -611,9 +611,14 @@ export function setDirectPathToChild(path) { return { type: SET_DIRECT_PATH, path }; } +export 
const SET_ACTIVE_TAB = 'SET_ACTIVE_TAB'; +export function setActiveTab(tabId, prevTabId) { + return { type: SET_ACTIVE_TAB, tabId, prevTabId }; +} + export const SET_ACTIVE_TABS = 'SET_ACTIVE_TABS'; -export function setActiveTabs(tabId, prevTabId) { - return { type: SET_ACTIVE_TABS, tabId, prevTabId }; +export function setActiveTabs(activeTabs) { + return { type: SET_ACTIVE_TABS, activeTabs }; } export const SET_FOCUSED_FILTER_FIELD = 'SET_FOCUSED_FILTER_FIELD'; diff --git a/superset-frontend/src/dashboard/components/Dashboard.jsx b/superset-frontend/src/dashboard/components/Dashboard.jsx index 827f0f455d3d6..6e909f3b1527f 100644 --- a/superset-frontend/src/dashboard/components/Dashboard.jsx +++ b/superset-frontend/src/dashboard/components/Dashboard.jsx @@ -25,9 +25,8 @@ import Loading from 'src/components/Loading'; import getBootstrapData from 'src/utils/getBootstrapData'; import getChartIdsFromLayout from '../util/getChartIdsFromLayout'; import getLayoutComponentFromChartId from '../util/getLayoutComponentFromChartId'; -import DashboardBuilder from './DashboardBuilder/DashboardBuilder'; + import { - chartPropShape, slicePropShape, dashboardInfoPropShape, dashboardStatePropShape, @@ -53,7 +52,6 @@ const propTypes = { }).isRequired, dashboardInfo: dashboardInfoPropShape.isRequired, dashboardState: dashboardStatePropShape.isRequired, - charts: PropTypes.objectOf(chartPropShape).isRequired, slices: PropTypes.objectOf(slicePropShape).isRequired, activeFilters: PropTypes.object.isRequired, chartConfiguration: PropTypes.object, @@ -213,11 +211,6 @@ class Dashboard extends React.PureComponent { } } - // return charts in array - getAllCharts() { - return Object.values(this.props.charts); - } - applyFilters() { const { appliedFilters } = this; const { activeFilters, ownDataCharts } = this.props; @@ -288,11 +281,7 @@ class Dashboard extends React.PureComponent { if (this.context.loading) { return <Loading />; } - return ( - <> - <DashboardBuilder /> - </> - ); + return 
this.props.children; } } diff --git a/superset-frontend/src/dashboard/components/Dashboard.test.jsx b/superset-frontend/src/dashboard/components/Dashboard.test.jsx index 56a696f913140..a66eab37e37d7 100644 --- a/superset-frontend/src/dashboard/components/Dashboard.test.jsx +++ b/superset-frontend/src/dashboard/components/Dashboard.test.jsx @@ -21,7 +21,6 @@ import { shallow } from 'enzyme'; import sinon from 'sinon'; import Dashboard from 'src/dashboard/components/Dashboard'; -import DashboardBuilder from 'src/dashboard/components/DashboardBuilder/DashboardBuilder'; import { CHART_TYPE } from 'src/dashboard/util/componentTypes'; import newComponentFactory from 'src/dashboard/util/newComponentFactory'; @@ -63,8 +62,14 @@ describe('Dashboard', () => { loadStats: {}, }; + const ChildrenComponent = () => <div>Test</div>; + function setup(overrideProps) { - const wrapper = shallow(<Dashboard {...props} {...overrideProps} />); + const wrapper = shallow( + <Dashboard {...props} {...overrideProps}> + <ChildrenComponent /> + </Dashboard>, + ); return wrapper; } @@ -76,9 +81,9 @@ describe('Dashboard', () => { '3_country_name': { values: ['USA'], scope: [] }, }; - it('should render a DashboardBuilder', () => { + it('should render the children component', () => { const wrapper = setup(); - expect(wrapper.find(DashboardBuilder)).toExist(); + expect(wrapper.find(ChildrenComponent)).toExist(); }); describe('UNSAFE_componentWillReceiveProps', () => { diff --git a/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardBuilder.test.tsx b/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardBuilder.test.tsx index 7c3dd23392f80..02a3a49971c3e 100644 --- a/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardBuilder.test.tsx +++ b/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardBuilder.test.tsx @@ -25,7 +25,7 @@ import DashboardBuilder from 'src/dashboard/components/DashboardBuilder/Dashboar import useStoredSidebarWidth 
from 'src/components/ResizableSidebar/useStoredSidebarWidth'; import { fetchFaveStar, - setActiveTabs, + setActiveTab, setDirectPathToChild, } from 'src/dashboard/actions/dashboardState'; import { @@ -41,7 +41,7 @@ fetchMock.get('glob:*/csstemplateasyncmodelview/api/read', {}); jest.mock('src/dashboard/actions/dashboardState', () => ({ ...jest.requireActual('src/dashboard/actions/dashboardState'), fetchFaveStar: jest.fn(), - setActiveTabs: jest.fn(), + setActiveTab: jest.fn(), setDirectPathToChild: jest.fn(), })); jest.mock('src/components/ResizableSidebar/useStoredSidebarWidth'); @@ -90,7 +90,7 @@ describe('DashboardBuilder', () => { favStarStub = (fetchFaveStar as jest.Mock).mockReturnValue({ type: 'mock-action', }); - activeTabsStub = (setActiveTabs as jest.Mock).mockReturnValue({ + activeTabsStub = (setActiveTab as jest.Mock).mockReturnValue({ type: 'mock-action', }); (useStoredSidebarWidth as jest.Mock).mockImplementation(() => [ diff --git a/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardBuilder.tsx b/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardBuilder.tsx index a17b168374592..51de15c7a1fcd 100644 --- a/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardBuilder.tsx +++ b/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardBuilder.tsx @@ -86,72 +86,10 @@ import { import { getRootLevelTabsComponent, shouldFocusTabs } from './utils'; import DashboardContainer from './DashboardContainer'; import { useNativeFilters } from './state'; +import DashboardWrapper from './DashboardWrapper'; type DashboardBuilderProps = {}; -const StyledDiv = styled.div` - ${({ theme }) => css` - display: grid; - grid-template-columns: auto 1fr; - grid-template-rows: auto 1fr; - flex: 1; - /* Special cases */ - - /* A row within a column has inset hover menu */ - .dragdroppable-column .dragdroppable-row .hover-menu--left { - left: ${theme.gridUnit * -3}px; - background: ${theme.colors.grayscale.light5}; - 
border: 1px solid ${theme.colors.grayscale.light2}; - } - - .dashboard-component-tabs { - position: relative; - } - - /* A column within a column or tabs has inset hover menu */ - .dragdroppable-column .dragdroppable-column .hover-menu--top, - .dashboard-component-tabs .dragdroppable-column .hover-menu--top { - top: ${theme.gridUnit * -3}px; - background: ${theme.colors.grayscale.light5}; - border: 1px solid ${theme.colors.grayscale.light2}; - } - - /* move Tabs hover menu to top near actual Tabs */ - .dashboard-component-tabs > .hover-menu-container > .hover-menu--left { - top: 0; - transform: unset; - background: transparent; - } - - /* push Chart actions to upper right */ - .dragdroppable-column .dashboard-component-chart-holder .hover-menu--top, - .dragdroppable .dashboard-component-header .hover-menu--top { - right: ${theme.gridUnit * 2}px; - top: ${theme.gridUnit * 2}px; - background: transparent; - border: none; - transform: unset; - left: unset; - } - div:hover > .hover-menu-container .hover-menu, - .hover-menu-container .hover-menu:hover { - opacity: 1; - } - - p { - margin: 0 0 ${theme.gridUnit * 2}px 0; - } - - i.danger { - color: ${theme.colors.error.base}; - } - - i.warning { - color: ${theme.colors.alert.base}; - } - `} -`; - // @z-index-above-dashboard-charts + 1 = 11 const FiltersPanel = styled.div<{ width: number; hidden: boolean }>` grid-column: 1; @@ -317,7 +255,7 @@ const DashboardContentWrapper = styled.div` width: 100%; } - & > .empty-droptarget:first-child { + & > .empty-droptarget:first-child:not(.empty-droptarget--full) { height: ${theme.gridUnit * 4}px; top: -2px; z-index: 10; @@ -640,7 +578,7 @@ const DashboardBuilder: FC<DashboardBuilderProps> = () => { : theme.gridUnit * 8; return ( - <StyledDiv> + <DashboardWrapper> {showFilterBar && filterBarOrientation === FilterBarOrientation.VERTICAL && ( <> <ResizableSidebar @@ -749,7 +687,7 @@ const DashboardBuilder: FC<DashboardBuilderProps> = () => { `} /> )} - </StyledDiv> + 
</DashboardWrapper> ); }; diff --git a/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardContainer.tsx b/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardContainer.tsx index cc4e2db780e74..f3f214468e833 100644 --- a/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardContainer.tsx +++ b/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardContainer.tsx @@ -18,7 +18,7 @@ */ // ParentSize uses resize observer so the dashboard will update size // when its container size changes, due to e.g., builder side panel opening -import React, { FC, useCallback, useEffect, useMemo } from 'react'; +import React, { FC, useCallback, useEffect, useMemo, useRef } from 'react'; import { useDispatch, useSelector } from 'react-redux'; import { FeatureFlag, @@ -51,7 +51,7 @@ import { setColorScheme } from 'src/dashboard/actions/dashboardState'; import jsonStringify from 'json-stringify-pretty-compact'; import { NATIVE_FILTER_DIVIDER_PREFIX } from '../nativeFilters/FiltersConfigModal/utils'; import { findTabsWithChartsInScope } from '../nativeFilters/utils'; -import { getRootLevelTabIndex, getRootLevelTabsComponent } from './utils'; +import { getRootLevelTabsComponent } from './utils'; type DashboardContainerProps = { topLevelTabs?: LayoutItem; @@ -89,15 +89,18 @@ const DashboardContainer: FC<DashboardContainerProps> = ({ topLevelTabs }) => { Object.values(state.charts).map(chart => chart.id), ); + const prevTabIndexRef = useRef(); const tabIndex = useMemo(() => { const nextTabIndex = findTabIndexByComponentId({ currentComponent: getRootLevelTabsComponent(dashboardLayout), directPathToChild, }); - return nextTabIndex > -1 - ? nextTabIndex - : getRootLevelTabIndex(dashboardLayout, directPathToChild); + if (nextTabIndex === -1) { + return prevTabIndexRef.current ?? 
0; + } + prevTabIndexRef.current = nextTabIndex; + return nextTabIndex; }, [dashboardLayout, directPathToChild]); useEffect(() => { diff --git a/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardWrapper.test.tsx b/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardWrapper.test.tsx new file mode 100644 index 0000000000000..fb913b46273d4 --- /dev/null +++ b/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardWrapper.test.tsx @@ -0,0 +1,75 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +import React from 'react'; +import { fireEvent, render } from 'spec/helpers/testing-library'; +import { OptionControlLabel } from 'src/explore/components/controls/OptionControls'; + +import DashboardWrapper from './DashboardWrapper'; + +test('should render children', () => { + const { getByTestId } = render( + <DashboardWrapper> + <div data-test="mock-children" /> + </DashboardWrapper>, + { useRedux: true, useDnd: true }, + ); + expect(getByTestId('mock-children')).toBeInTheDocument(); +}); + +test('should update the style on dragging state', () => { + const defaultProps = { + label: <span>Test label</span>, + tooltipTitle: 'This is a tooltip title', + onRemove: jest.fn(), + onMoveLabel: jest.fn(), + onDropLabel: jest.fn(), + type: 'test', + index: 0, + }; + const { container, getByText } = render( + <DashboardWrapper> + <OptionControlLabel + {...defaultProps} + index={1} + label={<span>Label 1</span>} + /> + <OptionControlLabel + {...defaultProps} + index={2} + label={<span>Label 2</span>} + /> + </DashboardWrapper>, + { + useRedux: true, + useDnd: true, + initialState: { + dashboardState: { + editMode: true, + }, + }, + }, + ); + expect( + container.getElementsByClassName('dragdroppable--dragging'), + ).toHaveLength(0); + fireEvent.dragStart(getByText('Label 1')); + expect( + container.getElementsByClassName('dragdroppable--dragging'), + ).toHaveLength(1); +}); diff --git a/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardWrapper.tsx b/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardWrapper.tsx new file mode 100644 index 0000000000000..f39c7ed630277 --- /dev/null +++ b/superset-frontend/src/dashboard/components/DashboardBuilder/DashboardWrapper.tsx @@ -0,0 +1,128 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. 
The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +import React, { useEffect } from 'react'; +import { css, styled } from '@superset-ui/core'; +import { RootState } from 'src/dashboard/types'; +import { useSelector } from 'react-redux'; +import { useDragDropManager } from 'react-dnd'; +import classNames from 'classnames'; + +const StyledDiv = styled.div` + ${({ theme }) => css` + display: grid; + grid-template-columns: auto 1fr; + grid-template-rows: auto 1fr; + flex: 1; + /* Special cases */ + + &.dragdroppable--dragging + .dashboard-component-tabs-content + > .empty-droptarget.empty-droptarget--full { + height: 100%; + } + + /* A row within a column has inset hover menu */ + .dragdroppable-column .dragdroppable-row .hover-menu--left { + left: ${theme.gridUnit * -3}px; + background: ${theme.colors.grayscale.light5}; + border: 1px solid ${theme.colors.grayscale.light2}; + } + + .dashboard-component-tabs { + position: relative; + } + + /* A column within a column or tabs has inset hover menu */ + .dragdroppable-column .dragdroppable-column .hover-menu--top, + .dashboard-component-tabs .dragdroppable-column .hover-menu--top { + top: ${theme.gridUnit * -3}px; + background: ${theme.colors.grayscale.light5}; + border: 1px solid ${theme.colors.grayscale.light2}; + } + + /* move Tabs hover menu to top near actual Tabs */ + .dashboard-component-tabs > .hover-menu-container > .hover-menu--left { + top: 0; + transform: unset; + background: 
transparent; + } + + /* push Chart actions to upper right */ + .dragdroppable-column .dashboard-component-chart-holder .hover-menu--top, + .dragdroppable .dashboard-component-header .hover-menu--top { + right: ${theme.gridUnit * 2}px; + top: ${theme.gridUnit * 2}px; + background: transparent; + border: none; + transform: unset; + left: unset; + } + div:hover > .hover-menu-container .hover-menu, + .hover-menu-container .hover-menu:hover { + opacity: 1; + } + + p { + margin: 0 0 ${theme.gridUnit * 2}px 0; + } + + i.danger { + color: ${theme.colors.error.base}; + } + + i.warning { + color: ${theme.colors.alert.base}; + } + `} +`; + +type Props = {}; + +const DashboardWrapper: React.FC<Props> = ({ children }) => { + const editMode = useSelector<RootState, boolean>( + state => state.dashboardState.editMode, + ); + const dragDropManager = useDragDropManager(); + const [isDragged, setIsDragged] = React.useState( + dragDropManager.getMonitor().isDragging(), + ); + + useEffect(() => { + const monitor = dragDropManager.getMonitor(); + const unsub = monitor.subscribeToStateChange(() => { + setIsDragged(monitor.isDragging()); + }); + + return () => { + unsub(); + }; + }, [dragDropManager]); + + return ( + <StyledDiv + className={classNames({ + 'dragdroppable--dragging': editMode && isDragged, + })} + > + {children} + </StyledDiv> + ); +}; + +export default DashboardWrapper; diff --git a/superset-frontend/src/dashboard/components/DashboardBuilder/utils.ts b/superset-frontend/src/dashboard/components/DashboardBuilder/utils.ts index 50aa989c68610..8ba5405bf3079 100644 --- a/superset-frontend/src/dashboard/components/DashboardBuilder/utils.ts +++ b/superset-frontend/src/dashboard/components/DashboardBuilder/utils.ts @@ -21,7 +21,6 @@ import { DASHBOARD_ROOT_ID, } from 'src/dashboard/util/constants'; import { DashboardLayout } from 'src/dashboard/types'; -import findTabIndexByComponentId from 'src/dashboard/util/findTabIndexByComponentId'; export const getRootLevelTabsComponent = 
(dashboardLayout: DashboardLayout) => { const dashboardRoot = dashboardLayout[DASHBOARD_ROOT_ID]; @@ -38,15 +37,3 @@ export const shouldFocusTabs = ( // don't focus the tabs when we click on a tab event.target.className === 'ant-tabs-nav-wrap' || container.contains(event.target); - -export const getRootLevelTabIndex = ( - dashboardLayout: DashboardLayout, - directPathToChild: string[], -): number => - Math.max( - 0, - findTabIndexByComponentId({ - currentComponent: getRootLevelTabsComponent(dashboardLayout), - directPathToChild, - }), - ); diff --git a/superset-frontend/src/dashboard/components/DashboardGrid.jsx b/superset-frontend/src/dashboard/components/DashboardGrid.jsx index 601dbac4a4cdc..70cf65218f2c3 100644 --- a/superset-frontend/src/dashboard/components/DashboardGrid.jsx +++ b/superset-frontend/src/dashboard/components/DashboardGrid.jsx @@ -18,6 +18,7 @@ */ import React from 'react'; import PropTypes from 'prop-types'; +import classNames from 'classnames'; import { addAlpha, css, styled, t } from '@superset-ui/core'; import { EmptyStateBig } from 'src/components/EmptyState'; import { componentShape } from '../util/propShapes'; @@ -76,10 +77,14 @@ const GridContent = styled.div` & > .empty-droptarget:first-child { height: ${theme.gridUnit * 12}px; margin-top: ${theme.gridUnit * -6}px; - margin-bottom: ${theme.gridUnit * -6}px; } - & > .empty-droptarget:only-child { + & > .empty-droptarget:last-child { + height: ${theme.gridUnit * 12}px; + margin-top: ${theme.gridUnit * -6}px; + } + + & > .empty-droptarget.empty-droptarget--full:only-child { height: 80vh; } `} @@ -270,10 +275,14 @@ class DashboardGrid extends React.PureComponent { index={0} orientation="column" onDrop={this.handleTopDropTargetDrop} - className="empty-droptarget" + className={classNames({ + 'empty-droptarget': true, + 'empty-droptarget--full': + gridComponent?.children?.length === 0, + })} editMode > - {renderDraggableContentBottom} + {renderDraggableContentTop} </DragDroppable> )} 
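
Reviewer note: the DashboardGrid change above switches the drop-target `className` from a plain string to the object form of the `classnames` package, so the `empty-droptarget--full` modifier is applied only when the grid has no children. A minimal sketch of how that object form resolves — the `classNames` helper below is hand-rolled for illustration, not the real `classnames` library:

```typescript
// Resolve an object-form class spec to a space-separated class string,
// keeping only the keys whose value is truthy (as `classnames` does).
function classNames(spec: Record<string, boolean>): string {
  return Object.entries(spec)
    .filter(([, enabled]) => enabled)
    .map(([name]) => name)
    .join(' ');
}

// An empty grid gets the full-height drop-target modifier...
const emptyGrid = classNames({
  'empty-droptarget': true,
  'empty-droptarget--full': [].length === 0,
});
// ...while a grid with children keeps only the base class.
const populatedGrid = classNames({
  'empty-droptarget': true,
  'empty-droptarget--full': ['chart-1'].length === 0,
});
```

The same conditional-modifier pattern is used in the Tab.jsx hunk further down, keyed on `tabComponent.children.length === 0`.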
{gridComponent?.children?.map((id, index) => ( @@ -304,7 +313,7 @@ class DashboardGrid extends React.PureComponent { className="empty-droptarget" editMode > - {renderDraggableContentTop} + {renderDraggableContentBottom} </DragDroppable> )} {isResizing && diff --git a/superset-frontend/src/dashboard/components/FiltersBadge/index.tsx b/superset-frontend/src/dashboard/components/FiltersBadge/index.tsx index cb5d261a1b3a5..6dba29c6619b7 100644 --- a/superset-frontend/src/dashboard/components/FiltersBadge/index.tsx +++ b/superset-frontend/src/dashboard/components/FiltersBadge/index.tsx @@ -59,7 +59,7 @@ const StyledFilterCount = styled.div` vertical-align: middle; color: ${theme.colors.grayscale.base}; &:hover { - color: ${theme.colors.grayscale.light1} + color: ${theme.colors.grayscale.light1}; } } diff --git a/superset-frontend/src/dashboard/components/PropertiesModal/index.tsx b/superset-frontend/src/dashboard/components/PropertiesModal/index.tsx index 92d34a4faa5b8..3a1421e3805de 100644 --- a/superset-frontend/src/dashboard/components/PropertiesModal/index.tsx +++ b/superset-frontend/src/dashboard/components/PropertiesModal/index.tsx @@ -681,7 +681,7 @@ const PropertiesModal = ({ </Row> <Row gutter={16}> <Col xs={24} md={12}> - <FormItem label={t('Title')} name="title"> + <FormItem label={t('Name')} name="title"> <Input data-test="dashboard-title-input" type="text" diff --git a/superset-frontend/src/dashboard/components/SliceHeader/SliceHeader.test.tsx b/superset-frontend/src/dashboard/components/SliceHeader/SliceHeader.test.tsx index e16cab8daab86..f452e22ac82c8 100644 --- a/superset-frontend/src/dashboard/components/SliceHeader/SliceHeader.test.tsx +++ b/superset-frontend/src/dashboard/components/SliceHeader/SliceHeader.test.tsx @@ -19,6 +19,7 @@ import React from 'react'; import { Router } from 'react-router-dom'; import { createMemoryHistory } from 'history'; +import { getExtensionsRegistry } from '@superset-ui/core'; import { render, screen } from 
'spec/helpers/testing-library'; import userEvent from '@testing-library/user-event'; import SliceHeader from '.'; @@ -472,3 +473,15 @@ test('Correct actions to "SliceHeaderControls"', () => { userEvent.click(screen.getByTestId('handleToggleFullSize')); expect(props.handleToggleFullSize).toBeCalledTimes(1); }); + +test('Add extension to SliceHeader', () => { + const extensionsRegistry = getExtensionsRegistry(); + extensionsRegistry.set('dashboard.slice.header', () => ( + <div>This is an extension</div> + )); + + const props = createProps(); + render(<SliceHeader {...props} />, { useRedux: true, useRouter: true }); + + expect(screen.getByText('This is an extension')).toBeInTheDocument(); +}); diff --git a/superset-frontend/src/dashboard/components/SliceHeader/index.tsx b/superset-frontend/src/dashboard/components/SliceHeader/index.tsx index c9cb74a8aff5e..ea4f3b63ba8f3 100644 --- a/superset-frontend/src/dashboard/components/SliceHeader/index.tsx +++ b/superset-frontend/src/dashboard/components/SliceHeader/index.tsx @@ -24,7 +24,7 @@ import React, { useRef, useState, } from 'react'; -import { css, styled, t } from '@superset-ui/core'; +import { css, getExtensionsRegistry, styled, t } from '@superset-ui/core'; import { useUiConfig } from 'src/components/UiConfigContext'; import { Tooltip } from 'src/components/Tooltip'; import { useSelector } from 'react-redux'; @@ -38,6 +38,8 @@ import { RootState } from 'src/dashboard/types'; import { getSliceHeaderTooltip } from 'src/dashboard/util/getSliceHeaderTooltip'; import { DashboardPageIdContext } from 'src/dashboard/containers/DashboardPage'; +const extensionsRegistry = getExtensionsRegistry(); + type SliceHeaderProps = SliceHeaderControlsProps & { innerRef?: string; updateSliceName?: (arg0: string) => void; @@ -161,6 +163,7 @@ const SliceHeader: FC<SliceHeaderProps> = ({ width, height, }) => { + const SliceHeaderExtension = extensionsRegistry.get('dashboard.slice.header'); const uiConfig = useUiConfig(); const 
dashboardPageId = useContext(DashboardPageIdContext); const [headerTooltip, setHeaderTooltip] = useState<ReactNode | null>(null); @@ -239,6 +242,12 @@ const SliceHeader: FC<SliceHeaderProps> = ({ <div className="header-controls"> {!editMode && ( <> + {SliceHeaderExtension && ( + <SliceHeaderExtension + sliceId={slice.slice_id} + dashboardId={dashboardId} + /> + )} {crossFilterValue && ( <Tooltip placement="top" diff --git a/superset-frontend/src/dashboard/components/SliceHeaderControls/index.tsx b/superset-frontend/src/dashboard/components/SliceHeaderControls/index.tsx index 287d83692f9d2..17d5bdc83e05c 100644 --- a/superset-frontend/src/dashboard/components/SliceHeaderControls/index.tsx +++ b/superset-frontend/src/dashboard/components/SliceHeaderControls/index.tsx @@ -373,6 +373,12 @@ const SliceHeaderControls = (props: SliceHeaderControlsPropsWithRouter) => { ? t('Exit fullscreen') : t('Enter fullscreen'); + // @z-index-below-dashboard-header (100) - 1 = 99 for !isFullSize and 101 for isFullSize + const dropdownOverlayStyle = { + zIndex: isFullSize ? 101 : 99, + animationDuration: '0s', + }; + const menu = ( <Menu onClick={handleMenuClick} @@ -541,6 +547,7 @@ const SliceHeaderControls = (props: SliceHeaderControlsPropsWithRouter) => { )} <NoAnimationDropdown overlay={menu} + overlayStyle={dropdownOverlayStyle} trigger={['click']} placement="bottomRight" > diff --git a/superset-frontend/src/dashboard/components/SyncDashboardState/SyncDashboardState.test.tsx b/superset-frontend/src/dashboard/components/SyncDashboardState/SyncDashboardState.test.tsx new file mode 100644 index 0000000000000..1565a43e19657 --- /dev/null +++ b/superset-frontend/src/dashboard/components/SyncDashboardState/SyncDashboardState.test.tsx @@ -0,0 +1,34 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. 
The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +import React from 'react'; +import { render } from 'spec/helpers/testing-library'; +import { getItem, LocalStorageKeys } from 'src/utils/localStorageHelpers'; +import SyncDashboardState from '.'; + +test('stores the dashboard info in local storage', () => { + const testDashboardPageId = 'dashboardPageId'; + render(<SyncDashboardState dashboardPageId={testDashboardPageId} />, { + useRedux: true, + }); + expect(getItem(LocalStorageKeys.dashboard__explore_context, {})).toEqual({ + [testDashboardPageId]: expect.objectContaining({ + dashboardPageId: testDashboardPageId, + }), + }); +}); diff --git a/superset-frontend/src/dashboard/components/SyncDashboardState/index.tsx b/superset-frontend/src/dashboard/components/SyncDashboardState/index.tsx new file mode 100644 index 0000000000000..b25d243292254 --- /dev/null +++ b/superset-frontend/src/dashboard/components/SyncDashboardState/index.tsx @@ -0,0 +1,103 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License.
You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +import React, { useEffect } from 'react'; +import pick from 'lodash/pick'; +import { shallowEqual, useSelector } from 'react-redux'; +import { DashboardContextForExplore } from 'src/types/DashboardContextForExplore'; +import { + getItem, + LocalStorageKeys, + setItem, +} from 'src/utils/localStorageHelpers'; +import { RootState } from 'src/dashboard/types'; +import { getActiveFilters } from 'src/dashboard/util/activeDashboardFilters'; + +type Props = { dashboardPageId: string }; + +const EMPTY_OBJECT = {}; + +export const getDashboardContextLocalStorage = () => { + const dashboardsContexts = getItem( + LocalStorageKeys.dashboard__explore_context, + {}, + ); + // A new dashboard tab id is generated on each dashboard page opening. + // We mark ids as redundant when user leaves the dashboard, because they won't be reused. 
+ // Then we remove redundant dashboard contexts from local storage in order not to clutter it + return Object.fromEntries( + Object.entries(dashboardsContexts).filter( + ([, value]) => !value.isRedundant, + ), + ); +}; + +const updateDashboardTabLocalStorage = ( + dashboardPageId: string, + dashboardContext: DashboardContextForExplore, +) => { + const dashboardsContexts = getDashboardContextLocalStorage(); + setItem(LocalStorageKeys.dashboard__explore_context, { + ...dashboardsContexts, + [dashboardPageId]: dashboardContext, + }); +}; + +const SyncDashboardState: React.FC<Props> = ({ dashboardPageId }) => { + const dashboardContextForExplore = useSelector< + RootState, + DashboardContextForExplore + >( + ({ dashboardInfo, dashboardState, nativeFilters, dataMask }) => ({ + labelColors: dashboardInfo.metadata?.label_colors || EMPTY_OBJECT, + sharedLabelColors: + dashboardInfo.metadata?.shared_label_colors || EMPTY_OBJECT, + colorScheme: dashboardState?.colorScheme, + chartConfiguration: + dashboardInfo.metadata?.chart_configuration || EMPTY_OBJECT, + nativeFilters: Object.entries(nativeFilters.filters).reduce( + (acc, [key, filterValue]) => ({ + ...acc, + [key]: pick(filterValue, ['chartsInScope']), + }), + {}, + ), + dataMask, + dashboardId: dashboardInfo.id, + filterBoxFilters: getActiveFilters(), + dashboardPageId, + }), + shallowEqual, + ); + + useEffect(() => { + updateDashboardTabLocalStorage(dashboardPageId, dashboardContextForExplore); + return () => { + // mark tab id as redundant when dashboard unmounts - case when user opens + // Explore in the same tab + updateDashboardTabLocalStorage(dashboardPageId, { + ...dashboardContextForExplore, + isRedundant: true, + }); + }; + }, [dashboardContextForExplore, dashboardPageId]); + + return null; +}; + +export default SyncDashboardState; diff --git a/superset-frontend/src/dashboard/components/dnd/DragDroppable.jsx b/superset-frontend/src/dashboard/components/dnd/DragDroppable.jsx index 3bc9f4d299a00..6a49f9887550f 
100644 --- a/superset-frontend/src/dashboard/components/dnd/DragDroppable.jsx +++ b/superset-frontend/src/dashboard/components/dnd/DragDroppable.jsx @@ -90,6 +90,11 @@ const DragDroppableStyles = styled.div` z-index: 10; } + &.empty-droptarget--full > .drop-indicator--top { + height: 100%; + opacity: 0.3; + } + & { .drop-indicator { display: block; @@ -99,7 +104,7 @@ const DragDroppableStyles = styled.div` } .drop-indicator--top { - top: 0; + top: ${-theme.gridUnit - 2}px; left: 0; height: ${theme.gridUnit}px; width: 100%; @@ -107,7 +112,7 @@ const DragDroppableStyles = styled.div` } .drop-indicator--bottom { - top: 100%; + bottom: ${-theme.gridUnit - 2}px; left: 0; height: ${theme.gridUnit}px; width: 100%; @@ -116,7 +121,7 @@ const DragDroppableStyles = styled.div` .drop-indicator--right { top: 0; - left: 100%; + left: calc(100% - ${theme.gridUnit}px); height: 100%; width: ${theme.gridUnit}px; min-height: ${theme.gridUnit * 4}px; diff --git a/superset-frontend/src/dashboard/components/gridComponents/Tab.jsx b/superset-frontend/src/dashboard/components/gridComponents/Tab.jsx index 32ac77936c6ad..d1d08176baa93 100644 --- a/superset-frontend/src/dashboard/components/gridComponents/Tab.jsx +++ b/superset-frontend/src/dashboard/components/gridComponents/Tab.jsx @@ -18,6 +18,7 @@ */ import React from 'react'; import PropTypes from 'prop-types'; +import classNames from 'classnames'; import { bindActionCreators } from 'redux'; import { connect } from 'react-redux'; import { styled, t } from '@superset-ui/core'; @@ -173,7 +174,10 @@ class Tab extends React.PureComponent { depth={depth} onDrop={this.handleTopDropTargetDrop} editMode - className="empty-droptarget" + className={classNames({ + 'empty-droptarget': true, + 'empty-droptarget--full': tabComponent.children.length === 0, + })} > {renderDraggableContentTop} </DragDroppable> @@ -234,7 +238,7 @@ class Tab extends React.PureComponent { /> ))} {/* Make bottom of tab droppable */} - {editMode && ( + {editMode && 
tabComponent.children.length > 0 && ( <DragDroppable component={tabComponent} parentComponent={tabParentComponent} diff --git a/superset-frontend/src/dashboard/components/gridComponents/Tabs.jsx b/superset-frontend/src/dashboard/components/gridComponents/Tabs.jsx index 7d9a46b75df77..67f4b3c598bd9 100644 --- a/superset-frontend/src/dashboard/components/gridComponents/Tabs.jsx +++ b/superset-frontend/src/dashboard/components/gridComponents/Tabs.jsx @@ -51,7 +51,7 @@ const propTypes = { // actions (from DashboardComponent.jsx) logEvent: PropTypes.func.isRequired, - setActiveTabs: PropTypes.func, + setActiveTab: PropTypes.func, // grid related availableColumnCount: PropTypes.number, @@ -75,7 +75,7 @@ const defaultProps = { columnWidth: 0, activeTabs: [], directPathToChild: [], - setActiveTabs() {}, + setActiveTab() {}, onResizeStart() {}, onResize() {}, onResizeStop() {}, @@ -125,12 +125,12 @@ export class Tabs extends React.PureComponent { } componentDidMount() { - this.props.setActiveTabs(this.state.activeKey); + this.props.setActiveTab(this.state.activeKey); } componentDidUpdate(prevProps, prevState) { if (prevState.activeKey !== this.state.activeKey) { - this.props.setActiveTabs(this.state.activeKey, prevState.activeKey); + this.props.setActiveTab(this.state.activeKey, prevState.activeKey); } } diff --git a/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/FilterControls/FilterControl.tsx b/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/FilterControls/FilterControl.tsx index 37739e5370686..96f51f5359e13 100644 --- a/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/FilterControls/FilterControl.tsx +++ b/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/FilterControls/FilterControl.tsx @@ -49,7 +49,6 @@ const VerticalFilterControlTitle = styled.h4` const HorizontalFilterControlTitle = styled(VerticalFilterControlTitle)` font-weight: ${({ theme }) => theme.typography.weights.normal}; color: ${({ 
theme }) => theme.colors.grayscale.base}; - max-width: ${({ theme }) => theme.gridUnit * 15}px; ${truncationCSS}; `; diff --git a/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/FilterControls/FilterValue.tsx b/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/FilterControls/FilterValue.tsx index 5235edcdc353d..f44a1a1df6878 100644 --- a/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/FilterControls/FilterValue.tsx +++ b/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/FilterControls/FilterValue.tsx @@ -52,6 +52,7 @@ import { onFiltersRefreshSuccess, setDirectPathToChild, } from 'src/dashboard/actions/dashboardState'; +import { RESPONSIVE_WIDTH } from 'src/filters/components/common'; import { FAST_DEBOUNCE } from 'src/constants'; import { dispatchHoverAction, dispatchFocusAction } from './utils'; import { FilterControlProps } from './types'; @@ -322,7 +323,7 @@ const FilterValue: React.FC<FilterControlProps> = ({ ) : ( <SuperChart height={HEIGHT} - width="100%" + width={RESPONSIVE_WIDTH} showOverflow={showOverflow} formData={formData} displaySettings={displaySettings} diff --git a/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/index.tsx b/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/index.tsx index 546742c6dd73b..61014a6e0f8dc 100644 --- a/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/index.tsx +++ b/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/index.tsx @@ -44,7 +44,10 @@ import { getInitialDataMask } from 'src/dataMask/reducer'; import { URL_PARAMS } from 'src/constants'; import { getUrlParam } from 'src/utils/urlUtils'; import { useTabId } from 'src/hooks/useTabId'; +import { logEvent } from 'src/logger/actions'; +import { LOG_ACTIONS_CHANGE_DASHBOARD_FILTER } from 'src/logger/LogUtils'; import { FilterBarOrientation, RootState } from 'src/dashboard/types'; +import { UserWithPermissionsAndRoles } from 
'src/types/bootstrapTypes'; import { checkIsApplyDisabled } from './utils'; import { FiltersBarProps } from './types'; import { @@ -147,6 +150,10 @@ const FilterBar: React.FC<FiltersBarProps> = ({ const canEdit = useSelector<RootState, boolean>( ({ dashboardInfo }) => dashboardInfo.dash_edit_perm, ); + const user: UserWithPermissionsAndRoles = useSelector< + RootState, + UserWithPermissionsAndRoles + >(state => state.user); const [filtersInScope] = useSelectFiltersInScope(nativeFilterValues); @@ -218,11 +225,15 @@ const FilterBar: React.FC<FiltersBarProps> = ({ }, [dataMaskAppliedText, setDataMaskSelected]); useEffect(() => { - publishDataMask(history, dashboardId, updateKey, dataMaskApplied, tabId); + // embedded users can't persist filter combinations + if (user?.userId) { + publishDataMask(history, dashboardId, updateKey, dataMaskApplied, tabId); + } // eslint-disable-next-line react-hooks/exhaustive-deps }, [dashboardId, dataMaskAppliedText, history, updateKey, tabId]); const handleApply = useCallback(() => { + dispatch(logEvent(LOG_ACTIONS_CHANGE_DASHBOARD_FILTER, {})); const filterIds = Object.keys(dataMaskSelected); setUpdateKey(1); filterIds.forEach(filterId => { diff --git a/superset-frontend/src/dashboard/containers/Dashboard.ts b/superset-frontend/src/dashboard/containers/Dashboard.ts index 50e42fef24dce..5f9b29b95dd46 100644 --- a/superset-frontend/src/dashboard/containers/Dashboard.ts +++ b/superset-frontend/src/dashboard/containers/Dashboard.ts @@ -39,7 +39,6 @@ function mapStateToProps(state: RootState) { const { datasources, sliceEntities, - charts, dataMask, dashboardInfo, dashboardState, @@ -54,7 +53,6 @@ function mapStateToProps(state: RootState) { userId: dashboardInfo.userId, dashboardInfo, dashboardState, - charts, datasources, // filters prop: a map structure for all the active filter_box's values and scope in this dashboard, // for each filter field. 
map key is [chartId_column] diff --git a/superset-frontend/src/dashboard/containers/DashboardComponent.jsx b/superset-frontend/src/dashboard/containers/DashboardComponent.jsx index 08b7ed9f82d90..68478adb073fc 100644 --- a/superset-frontend/src/dashboard/containers/DashboardComponent.jsx +++ b/superset-frontend/src/dashboard/containers/DashboardComponent.jsx @@ -35,7 +35,7 @@ import { } from 'src/dashboard/actions/dashboardLayout'; import { setDirectPathToChild, - setActiveTabs, + setActiveTab, setFullSizeChartId, } from 'src/dashboard/actions/dashboardState'; @@ -109,7 +109,7 @@ function mapDispatchToProps(dispatch) { handleComponentDrop, setDirectPathToChild, setFullSizeChartId, - setActiveTabs, + setActiveTab, logEvent, }, dispatch, diff --git a/superset-frontend/src/dashboard/containers/DashboardPage.tsx b/superset-frontend/src/dashboard/containers/DashboardPage.tsx index aef0fb3b6e7c2..cf2098e547fcb 100644 --- a/superset-frontend/src/dashboard/containers/DashboardPage.tsx +++ b/superset-frontend/src/dashboard/containers/DashboardPage.tsx @@ -28,7 +28,6 @@ import { t, useTheme, } from '@superset-ui/core'; -import pick from 'lodash/pick'; import { useDispatch, useSelector } from 'react-redux'; import { useToasts } from 'src/components/MessageToasts/withToasts'; import Loading from 'src/components/Loading'; @@ -40,13 +39,8 @@ import { import { hydrateDashboard } from 'src/dashboard/actions/hydrate'; import { setDatasources } from 'src/dashboard/actions/datasources'; import injectCustomCss from 'src/dashboard/util/injectCustomCss'; -import setupPlugins from 'src/setup/setupPlugins'; -import { - getItem, - LocalStorageKeys, - setItem, -} from 'src/utils/localStorageHelpers'; +import { LocalStorageKeys, setItem } from 'src/utils/localStorageHelpers'; import { URL_PARAMS } from 'src/constants'; import { getUrlParam } from 'src/utils/urlUtils'; import { getFilterSets } from 'src/dashboard/actions/nativeFilters'; @@ -55,25 +49,27 @@ import { getFilterValue, 
getPermalinkValue, } from 'src/dashboard/components/nativeFilters/FilterBar/keyValue'; -import { DashboardContextForExplore } from 'src/types/DashboardContextForExplore'; +import DashboardContainer from 'src/dashboard/containers/Dashboard'; + import shortid from 'shortid'; import { RootState } from '../types'; -import { getActiveFilters } from '../util/activeDashboardFilters'; import { chartContextMenuStyles, filterCardPopoverStyle, headerStyles, } from '../styles'; +import SyncDashboardState, { + getDashboardContextLocalStorage, +} from '../components/SyncDashboardState'; export const DashboardPageIdContext = React.createContext(''); -setupPlugins(); -const DashboardContainer = React.lazy( +const DashboardBuilder = React.lazy( () => import( /* webpackChunkName: "DashboardContainer" */ /* webpackPreload: true */ - 'src/dashboard/containers/Dashboard' + 'src/dashboard/components/DashboardBuilder/DashboardBuilder' ), ); @@ -83,74 +79,15 @@ type PageProps = { idOrSlug: string; }; -const getDashboardContextLocalStorage = () => { - const dashboardsContexts = getItem( - LocalStorageKeys.dashboard__explore_context, - {}, - ); - // A new dashboard tab id is generated on each dashboard page opening. - // We mark ids as redundant when user leaves the dashboard, because they won't be reused. 
- // Then we remove redundant dashboard contexts from local storage in order not to clutter it - return Object.fromEntries( - Object.entries(dashboardsContexts).filter( - ([, value]) => !value.isRedundant, - ), - ); -}; - -const updateDashboardTabLocalStorage = ( - dashboardPageId: string, - dashboardContext: DashboardContextForExplore, -) => { - const dashboardsContexts = getDashboardContextLocalStorage(); - setItem(LocalStorageKeys.dashboard__explore_context, { - ...dashboardsContexts, - [dashboardPageId]: dashboardContext, - }); -}; - -const useSyncDashboardStateWithLocalStorage = () => { - const dashboardPageId = useMemo(() => shortid.generate(), []); - const dashboardContextForExplore = useSelector< - RootState, - DashboardContextForExplore - >(({ dashboardInfo, dashboardState, nativeFilters, dataMask }) => ({ - labelColors: dashboardInfo.metadata?.label_colors || {}, - sharedLabelColors: dashboardInfo.metadata?.shared_label_colors || {}, - colorScheme: dashboardState?.colorScheme, - chartConfiguration: dashboardInfo.metadata?.chart_configuration || {}, - nativeFilters: Object.entries(nativeFilters.filters).reduce( - (acc, [key, filterValue]) => ({ - ...acc, - [key]: pick(filterValue, ['chartsInScope']), - }), - {}, - ), - dataMask, - dashboardId: dashboardInfo.id, - filterBoxFilters: getActiveFilters(), - dashboardPageId, - })); - - useEffect(() => { - updateDashboardTabLocalStorage(dashboardPageId, dashboardContextForExplore); - return () => { - // mark tab id as redundant when dashboard unmounts - case when user opens - // Explore in the same tab - updateDashboardTabLocalStorage(dashboardPageId, { - ...dashboardContextForExplore, - isRedundant: true, - }); - }; - }, [dashboardContextForExplore, dashboardPageId]); - return dashboardPageId; -}; - export const DashboardPage: FC<PageProps> = ({ idOrSlug }: PageProps) => { const theme = useTheme(); const dispatch = useDispatch(); const history = useHistory(); - const dashboardPageId = 
useSyncDashboardStateWithLocalStorage(); + const dashboardPageId = useMemo(() => shortid.generate(), []); + const hasDashboardInfoInitiated = useSelector<RootState, Boolean>( + ({ dashboardInfo }) => + dashboardInfo && Object.keys(dashboardInfo).length > 0, + ); const { addDangerToast } = useToasts(); const { result: dashboard, error: dashboardApiError } = useDashboard(idOrSlug); @@ -284,7 +221,7 @@ export const DashboardPage: FC<PageProps> = ({ idOrSlug }: PageProps) => { }, [addDangerToast, datasets, datasetsApiError, dispatch]); if (error) throw error; // caught in error boundary - if (!readyToRender || !isDashboardHydrated.current) return <Loading />; + if (!readyToRender || !hasDashboardInfoInitiated) return <Loading />; return ( <> @@ -295,8 +232,11 @@ export const DashboardPage: FC<PageProps> = ({ idOrSlug }: PageProps) => { chartContextMenuStyles(theme), ]} /> + <SyncDashboardState dashboardPageId={dashboardPageId} /> <DashboardPageIdContext.Provider value={dashboardPageId}> - <DashboardContainer /> + <DashboardContainer> + <DashboardBuilder /> + </DashboardContainer> </DashboardPageIdContext.Provider> </> ); diff --git a/superset-frontend/src/dashboard/reducers/dashboardState.js b/superset-frontend/src/dashboard/reducers/dashboardState.js index 5d81cd8ac11f0..015cb9822c581 100644 --- a/superset-frontend/src/dashboard/reducers/dashboardState.js +++ b/superset-frontend/src/dashboard/reducers/dashboardState.js @@ -37,6 +37,7 @@ import { SET_DIRECT_PATH, SET_FOCUSED_FILTER_FIELD, UNSET_FOCUSED_FILTER_FIELD, + SET_ACTIVE_TAB, SET_ACTIVE_TABS, SET_FULL_SIZE_CHART_ID, ON_FILTERS_REFRESH, @@ -179,7 +180,7 @@ export default function dashboardStateReducer(state = {}, action) { directPathLastUpdated: Date.now(), }; }, - [SET_ACTIVE_TABS]() { + [SET_ACTIVE_TAB]() { const newActiveTabs = new Set(state.activeTabs); newActiveTabs.delete(action.prevTabId); newActiveTabs.add(action.tabId); @@ -188,6 +189,12 @@ export default function dashboardStateReducer(state = {}, 
action) { activeTabs: Array.from(newActiveTabs), }; }, + [SET_ACTIVE_TABS]() { + return { + ...state, + activeTabs: action.activeTabs, + }; + }, [SET_OVERRIDE_CONFIRM]() { return { ...state, diff --git a/superset-frontend/src/dashboard/reducers/dashboardState.test.ts b/superset-frontend/src/dashboard/reducers/dashboardState.test.ts index 274b26733ce1d..3a8adc6cbbdc7 100644 --- a/superset-frontend/src/dashboard/reducers/dashboardState.test.ts +++ b/superset-frontend/src/dashboard/reducers/dashboardState.test.ts @@ -18,21 +18,33 @@ */ import dashboardStateReducer from './dashboardState'; -import { setActiveTabs } from '../actions/dashboardState'; +import { setActiveTab, setActiveTabs } from '../actions/dashboardState'; describe('DashboardState reducer', () => { - it('SET_ACTIVE_TABS', () => { + it('SET_ACTIVE_TAB', () => { expect( - dashboardStateReducer({ activeTabs: [] }, setActiveTabs('tab1')), + dashboardStateReducer({ activeTabs: [] }, setActiveTab('tab1')), ).toEqual({ activeTabs: ['tab1'] }); expect( - dashboardStateReducer({ activeTabs: ['tab1'] }, setActiveTabs('tab1')), + dashboardStateReducer({ activeTabs: ['tab1'] }, setActiveTab('tab1')), ).toEqual({ activeTabs: ['tab1'] }); expect( dashboardStateReducer( { activeTabs: ['tab1'] }, - setActiveTabs('tab2', 'tab1'), + setActiveTab('tab2', 'tab1'), ), ).toEqual({ activeTabs: ['tab2'] }); }); + + it('SET_ACTIVE_TABS', () => { + expect( + dashboardStateReducer({ activeTabs: [] }, setActiveTabs(['tab1'])), + ).toEqual({ activeTabs: ['tab1'] }); + expect( + dashboardStateReducer( + { activeTabs: ['tab1', 'tab2'] }, + setActiveTabs(['tab3', 'tab4']), + ), + ).toEqual({ activeTabs: ['tab3', 'tab4'] }); + }); }); diff --git a/superset-frontend/src/dataMask/reducer.ts b/superset-frontend/src/dataMask/reducer.ts index 6e9a5fae5404a..f2163a54a44a0 100644 --- a/superset-frontend/src/dataMask/reducer.ts +++ b/superset-frontend/src/dataMask/reducer.ts @@ -56,7 +56,6 @@ export function getInitialDataMask( } return { 
...otherProps, - __cache: {}, extraFormData: {}, filterState: {}, ownState: {}, diff --git a/superset-frontend/src/embedded/index.tsx b/superset-frontend/src/embedded/index.tsx index 50c026fba8f93..27c80e9279750 100644 --- a/superset-frontend/src/embedded/index.tsx +++ b/superset-frontend/src/embedded/index.tsx @@ -23,6 +23,7 @@ import { makeApi, t, logging } from '@superset-ui/core'; import Switchboard from '@superset-ui/switchboard'; import getBootstrapData from 'src/utils/getBootstrapData'; import setupClient from 'src/setup/setupClient'; +import setupPlugins from 'src/setup/setupPlugins'; import { RootContextProviders } from 'src/views/RootContextProviders'; import { store, USER_LOADED } from 'src/views/store'; import ErrorBoundary from 'src/components/ErrorBoundary'; @@ -32,6 +33,8 @@ import ToastContainer from 'src/components/MessageToasts/ToastContainer'; import { UserWithPermissionsAndRoles } from 'src/types/bootstrapTypes'; import { embeddedApi } from './api'; +setupPlugins(); + const debugMode = process.env.WEBPACK_MODE === 'development'; const bootstrapData = getBootstrapData(); diff --git a/superset-frontend/src/explore/actions/exploreActions.test.js b/superset-frontend/src/explore/actions/exploreActions.test.js index 9dd53756800d1..54cf8f16c5c35 100644 --- a/superset-frontend/src/explore/actions/exploreActions.test.js +++ b/superset-frontend/src/explore/actions/exploreActions.test.js @@ -21,6 +21,63 @@ import { defaultState } from 'src/explore/store'; import exploreReducer from 'src/explore/reducers/exploreReducer'; import * as actions from 'src/explore/actions/exploreActions'; +const METRICS = [ + { + expressionType: 'SIMPLE', + column: { + advanced_data_type: null, + certification_details: null, + certified_by: null, + column_name: 'a', + description: null, + expression: null, + filterable: true, + groupby: true, + id: 1, + is_certified: false, + is_dttm: false, + python_date_format: null, + type: 'DOUBLE PRECISION', + type_generic: 0, + 
verbose_name: null, + warning_markdown: null, + }, + aggregate: 'SUM', + sqlExpression: null, + datasourceWarning: false, + hasCustomLabel: false, + label: 'SUM(a)', + optionName: 'metric_1a2b3c4d5f_1a2b3c4d5f', + }, + { + expressionType: 'SIMPLE', + column: { + advanced_data_type: null, + certification_details: null, + certified_by: null, + column_name: 'b', + description: null, + expression: null, + filterable: true, + groupby: true, + id: 2, + is_certified: false, + is_dttm: false, + python_date_format: null, + type: 'BIGINT', + type_generic: 0, + verbose_name: null, + warning_markdown: null, + }, + aggregate: 'AVG', + sqlExpression: null, + datasourceWarning: false, + hasCustomLabel: false, + label: 'AVG(b)', + optionName: 'metric_6g7h8i9j0k_6g7h8i9j0k', + }, +]; + describe('reducers', () => { it('Does not set a control value if control does not exist', () => { const newState = exploreReducer( @@ -37,4 +94,127 @@ describe('reducers', () => { expect(newState.controls.y_axis_format.value).toBe('$,.2f'); expect(newState.form_data.y_axis_format).toBe('$,.2f'); }); + it('Keeps the column config when metric column positions are swapped', () => { + const mockedState = { + ...defaultState, + controls: { + ...defaultState.controls, + metrics: { + ...defaultState.controls.metrics, + value: METRICS, + }, + column_config: { + ...defaultState.controls.column_config, + value: { + 'AVG(b)': { + currencyFormat: { + symbolPosition: 'prefix', + symbol: 'USD', + }, + }, + }, + }, + }, + form_data: { + ...defaultState.form_data, + metrics: METRICS, + column_config: { + 'AVG(b)': { + currencyFormat: { + symbolPosition: 'prefix', + symbol: 'USD', + }, + }, + }, + }, + }; + + const swappedMetrics = [METRICS[1], METRICS[0]]; + const newState = exploreReducer( + mockedState, + actions.setControlValue('metrics', swappedMetrics, []), + ); + + const expectedColumnConfig = { + 'AVG(b)': { + currencyFormat: { + symbolPosition: 'prefix', + symbol: 'USD', + }, + }, + }; + + 
expect(newState.controls.metrics.value).toStrictEqual(swappedMetrics); + expect(newState.form_data.metrics).toStrictEqual(swappedMetrics); + expect(newState.controls.column_config.value).toStrictEqual( + expectedColumnConfig, + ); + expect(newState.form_data.column_config).toStrictEqual( + expectedColumnConfig, + ); + }); + + it('Keeps the column config when metric column name is updated', () => { + const mockedState = { + ...defaultState, + controls: { + ...defaultState.controls, + metrics: { + ...defaultState.controls.metrics, + value: METRICS, + }, + column_config: { + ...defaultState.controls.column_config, + value: { + 'AVG(b)': { + currencyFormat: { + symbolPosition: 'prefix', + symbol: 'USD', + }, + }, + }, + }, + }, + form_data: { + ...defaultState.form_data, + metrics: METRICS, + column_config: { + 'AVG(b)': { + currencyFormat: { + symbolPosition: 'prefix', + symbol: 'USD', + }, + }, + }, + }, + }; + + const updatedMetrics = [ + METRICS[0], + { + ...METRICS[1], + hasCustomLabel: true, + label: 'AVG of b', + }, + ]; + + const newState = exploreReducer( + mockedState, + actions.setControlValue('metrics', updatedMetrics, []), + ); + + const expectedColumnConfig = { + 'AVG of b': { + currencyFormat: { + symbolPosition: 'prefix', + symbol: 'USD', + }, + }, + }; + expect(newState.controls.metrics.value).toStrictEqual(updatedMetrics); + expect(newState.form_data.metrics).toStrictEqual(updatedMetrics); + expect(newState.form_data.column_config).toStrictEqual( + expectedColumnConfig, + ); + }); }); diff --git a/superset-frontend/src/explore/components/controls/DndColumnSelectControl/ColumnSelectPopover.test.tsx b/superset-frontend/src/explore/components/controls/DndColumnSelectControl/ColumnSelectPopover.test.tsx new file mode 100644 index 0000000000000..e7ff7cd9a7565 --- /dev/null +++ b/superset-frontend/src/explore/components/controls/DndColumnSelectControl/ColumnSelectPopover.test.tsx @@ -0,0 +1,77 @@ +/** + * Licensed to the Apache Software Foundation (ASF) 
under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +import React from 'react'; +import { render, fireEvent } from '@testing-library/react'; +import '@testing-library/jest-dom/extend-expect'; +import { Provider } from 'react-redux'; +import configureMockStore from 'redux-mock-store'; +import thunk from 'redux-thunk'; +import { supersetTheme, ThemeProvider } from '@superset-ui/core'; +import ColumnSelectPopover from 'src/explore/components/controls/DndColumnSelectControl/ColumnSelectPopover'; + +const middlewares = [thunk]; +const mockStore = configureMockStore(middlewares); + +describe('ColumnSelectPopover - onTabChange function', () => { + it('updates adhocColumn when switching to sqlExpression tab with custom label', () => { + const mockColumns = [{ column_name: 'year' }]; + const mockOnClose = jest.fn(); + const mockOnChange = jest.fn(); + const mockGetCurrentTab = jest.fn(); + const mockSetDatasetModal = jest.fn(); + const mockSetLabel = jest.fn(); + + const store = mockStore({ explore: { datasource: { type: 'table' } } }); + + const { container, getByText } = render( + <Provider store={store}> + <ThemeProvider theme={supersetTheme}> + <ColumnSelectPopover + columns={mockColumns} + editedColumn={mockColumns[0]} + getCurrentTab={mockGetCurrentTab} + 
hasCustomLabel + isTemporal + label="Custom Label" + onChange={mockOnChange} + onClose={mockOnClose} + setDatasetModal={mockSetDatasetModal} + setLabel={mockSetLabel} + /> + </ThemeProvider> + </Provider>, + ); + + const sqlExpressionTab = container.querySelector( + '#adhoc-metric-edit-tabs-tab-sqlExpression', + ); + expect(sqlExpressionTab).not.toBeNull(); + fireEvent.click(sqlExpressionTab!); + expect(mockGetCurrentTab).toHaveBeenCalledWith('sqlExpression'); + + const saveButton = getByText('Save'); + fireEvent.click(saveButton); + expect(mockOnChange).toHaveBeenCalledWith({ + label: 'Custom Label', + sqlExpression: 'year', + expressionType: 'SQL', + }); + }); +}); diff --git a/superset-frontend/src/explore/components/controls/DndColumnSelectControl/ColumnSelectPopover.tsx b/superset-frontend/src/explore/components/controls/DndColumnSelectControl/ColumnSelectPopover.tsx index 4806e5394a3ae..96abf36484c0f 100644 --- a/superset-frontend/src/explore/components/controls/DndColumnSelectControl/ColumnSelectPopover.tsx +++ b/superset-frontend/src/explore/components/controls/DndColumnSelectControl/ColumnSelectPopover.tsx @@ -68,6 +68,7 @@ interface ColumnSelectPopoverProps { editedColumn?: ColumnMeta | AdhocColumn; onChange: (column: ColumnMeta | AdhocColumn) => void; onClose: () => void; + hasCustomLabel: boolean; setLabel: (title: string) => void; getCurrentTab: (tab: string) => void; label: string; @@ -93,13 +94,14 @@ const getInitialColumnValues = ( const ColumnSelectPopover = ({ columns, editedColumn, + getCurrentTab, + hasCustomLabel, + isTemporal, + label, onChange, onClose, setDatasetModal, setLabel, - getCurrentTab, - label, - isTemporal, }: ColumnSelectPopoverProps) => { const datasourceType = useSelector<ExplorePageState, string | undefined>( state => state.explore.datasource.type, @@ -117,6 +119,7 @@ const ColumnSelectPopover = ({ const [selectedSimpleColumn, setSelectedSimpleColumn] = useState< ColumnMeta | undefined >(initialSimpleColumn); + const 
[selectedTab, setSelectedTab] = useState<string | null>(null); const [resizeButton, width, height] = useResizeButton( POPOVER_INITIAL_WIDTH, @@ -188,7 +191,34 @@ const ColumnSelectPopover = ({ useEffect(() => { getCurrentTab(defaultActiveTabKey); - }, [defaultActiveTabKey, getCurrentTab]); + setSelectedTab(defaultActiveTabKey); + }, [defaultActiveTabKey, getCurrentTab, setSelectedTab]); + + useEffect(() => { + /* if the adhoc column is not set (because it was never edited) but the + * tab is selected and the label has changed, then we need to set the + * adhoc column manually */ + if ( + adhocColumn === undefined && + selectedTab === 'sqlExpression' && + hasCustomLabel + ) { + const sqlExpression = + selectedSimpleColumn?.column_name || + selectedCalculatedColumn?.expression || + ''; + setAdhocColumn({ label, sqlExpression, expressionType: 'SQL' }); + } + }, [ + adhocColumn, + defaultActiveTabKey, + hasCustomLabel, + getCurrentTab, + label, + selectedCalculatedColumn, + selectedSimpleColumn, + selectedTab, + ]); const onSave = useCallback(() => { if (adhocColumn && adhocColumn.label !== label) { @@ -225,6 +255,7 @@ const ColumnSelectPopover = ({ const onTabChange = useCallback( tab => { getCurrentTab(tab); + setSelectedTab(tab); // @ts-ignore sqlEditorRef.current?.editor.focus(); }, diff --git a/superset-frontend/src/explore/components/controls/DndColumnSelectControl/ColumnSelectPopoverTrigger.tsx b/superset-frontend/src/explore/components/controls/DndColumnSelectControl/ColumnSelectPopoverTrigger.tsx index 4340317f04d11..341d91e616cc8 100644 --- a/superset-frontend/src/explore/components/controls/DndColumnSelectControl/ColumnSelectPopoverTrigger.tsx +++ b/superset-frontend/src/explore/components/controls/DndColumnSelectControl/ColumnSelectPopoverTrigger.tsx @@ -103,6 +103,7 @@ const ColumnSelectPopoverTrigger = ({ setDatasetModal={setDatasetModal} onClose={handleClosePopover} onChange={onColumnEdit} + hasCustomLabel={hasCustomLabel} label={popoverLabel} 
setLabel={setPopoverLabel} getCurrentTab={getCurrentTab} @@ -114,6 +115,7 @@ const ColumnSelectPopoverTrigger = ({ columns, editedColumn, getCurrentTab, + hasCustomLabel, handleClosePopover, isTemporal, onColumnEdit, @@ -121,10 +123,13 @@ const ColumnSelectPopoverTrigger = ({ ], ); - const onLabelChange = useCallback((e: any) => { - setPopoverLabel(e.target.value); - setHasCustomLabel(true); - }, []); + const onLabelChange = useCallback( + (e: any) => { + setPopoverLabel(e.target.value); + setHasCustomLabel(true); + }, + [setPopoverLabel, setHasCustomLabel], + ); const popoverTitle = useMemo( () => ( diff --git a/superset-frontend/src/explore/components/controls/SelectControl.jsx b/superset-frontend/src/explore/components/controls/SelectControl.jsx index 166382a15ca78..7881fc6858bec 100644 --- a/superset-frontend/src/explore/components/controls/SelectControl.jsx +++ b/superset-frontend/src/explore/components/controls/SelectControl.jsx @@ -215,7 +215,7 @@ export default class SelectControl extends React.PureComponent { const getValue = () => { const currentValue = - value || + value ?? (this.props.default !== undefined ? 
this.props.default : undefined); // safety check - the value is intended to be undefined but null was used diff --git a/superset-frontend/src/explore/components/controls/VizTypeControl/VizTypeGallery.tsx b/superset-frontend/src/explore/components/controls/VizTypeControl/VizTypeGallery.tsx index 2563dba01cb7a..2d14376516931 100644 --- a/superset-frontend/src/explore/components/controls/VizTypeControl/VizTypeGallery.tsx +++ b/superset-frontend/src/explore/components/controls/VizTypeControl/VizTypeGallery.tsx @@ -849,10 +849,18 @@ export default function VizTypeGallery(props: VizTypeGalleryProps) { grid-area: examples-header; `} > - {!!selectedVizMetadata?.exampleGallery?.length && t('Examples')} + {t('Examples')} </SectionTitle> <Examples> - {(selectedVizMetadata?.exampleGallery || []).map(example => ( + {(selectedVizMetadata?.exampleGallery?.length + ? selectedVizMetadata.exampleGallery + : [ + { + url: selectedVizMetadata?.thumbnail, + caption: selectedVizMetadata?.name, + }, + ] + ).map(example => ( <img key={example.url} src={example.url} diff --git a/superset-frontend/src/explore/reducers/exploreReducer.js b/superset-frontend/src/explore/reducers/exploreReducer.js index d5565a0dad5eb..1797c57637d2d 100644 --- a/superset-frontend/src/explore/reducers/exploreReducer.js +++ b/superset-frontend/src/explore/reducers/exploreReducer.js @@ -115,7 +115,12 @@ export default function exploreReducer(state = {}, action) { // need to update column config as well to keep the previous config. 
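The SelectControl change above swaps `value || default` for `value ?? default`. As a minimal sketch of why that matters (hypothetical helpers, not Superset code): `||` treats every falsy value — `0`, `''`, `false`, `NaN` — as "missing", while `??` only falls back on `null` or `undefined`, so an explicitly selected `0` survives.

```typescript
// Hypothetical helpers (not Superset code) contrasting the two fallbacks.
// `||` discards every falsy value; `??` only falls back on null/undefined,
// which is the behaviour the SelectControl fix relies on.
function withOrFallback(value: number | null | undefined, def: number): number {
  return value || def; // old behaviour: loses an explicit 0
}

function withNullishFallback(value: number | null | undefined, def: number): number {
  return value ?? def; // new behaviour: keeps an explicit 0
}

console.log(withOrFallback(0, 10)); // 10 — the user's explicit 0 is discarded
console.log(withNullishFallback(0, 10)); // 0 — the explicit 0 survives
console.log(withNullishFallback(null, 10)); // 10 — a genuine "no value" still falls back
```

The same pitfall applies to empty strings and `false`, which is why nullish coalescing is the safer default for control values.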
if (controlName === 'metrics' && old_metrics_data && new_column_config) { value.forEach((item, index) => { + const itemExist = old_metrics_data.some( + oldItem => oldItem?.label === item?.label, + ); + if ( + !itemExist && item?.label !== old_metrics_data[index]?.label && !!new_column_config[old_metrics_data[index]?.label] ) { diff --git a/superset-frontend/src/features/alerts/AlertReportModal.tsx b/superset-frontend/src/features/alerts/AlertReportModal.tsx index 571c7b1b2a815..e371efcf1b929 100644 --- a/superset-frontend/src/features/alerts/AlertReportModal.tsx +++ b/superset-frontend/src/features/alerts/AlertReportModal.tsx @@ -152,6 +152,10 @@ const StyledModal = styled(Modal)` } `; +const StyledTooltip = styled(InfoTooltipWithTrigger)` + margin-left: ${({ theme }) => theme.gridUnit}px; +`; + const StyledIcon = (theme: SupersetTheme) => css` margin: auto ${theme.gridUnit * 2}px auto 0; color: ${theme.colors.grayscale.base}; @@ -397,10 +401,12 @@ export const TRANSLATIONS = { ALERT_CONDITION_TEXT: t('Alert condition'), DATABASE_TEXT: t('Database'), SQL_QUERY_TEXT: t('SQL Query'), + SQL_QUERY_TOOLTIP: t( + 'The result of this query should be a numeric-esque value', + ), TRIGGER_ALERT_IF_TEXT: t('Trigger Alert If...'), CONDITION_TEXT: t('Condition'), VALUE_TEXT: t('Value'), - VALUE_TOOLTIP: t('Threshold value should be double precision number'), REPORT_SCHEDULE_TEXT: t('Report schedule'), ALERT_CONDITION_SCHEDULE_TEXT: t('Alert condition schedule'), TIMEZONE_TEXT: t('Timezone'), @@ -1284,6 +1290,7 @@ const AlertReportModal: FunctionComponent<AlertReportModalProps> = ({ <StyledInputContainer> <div className="control-label"> {TRANSLATIONS.SQL_QUERY_TEXT} + <StyledTooltip tooltip={TRANSLATIONS.SQL_QUERY_TOOLTIP} /> <span className="required">*</span> </div> <TextAreaControl @@ -1319,10 +1326,7 @@ const AlertReportModal: FunctionComponent<AlertReportModalProps> = ({ </StyledInputContainer> <StyledInputContainer> <div className="control-label"> - 
{TRANSLATIONS.VALUE_TEXT}{' '} - <InfoTooltipWithTrigger - tooltip={TRANSLATIONS.VALUE_TOOLTIP} - /> + {TRANSLATIONS.VALUE_TEXT} <span className="required">*</span> </div> <div className="input-container"> diff --git a/superset-frontend/src/features/annotations/AnnotationModal.tsx b/superset-frontend/src/features/annotations/AnnotationModal.tsx index a5c5aa9c31a68..dd1107dfba672 100644 --- a/superset-frontend/src/features/annotations/AnnotationModal.tsx +++ b/superset-frontend/src/features/annotations/AnnotationModal.tsx @@ -287,7 +287,7 @@ const AnnotationModal: FunctionComponent<AnnotationModalProps> = ({ </StyledAnnotationTitle> <AnnotationContainer> <div className="control-label"> - {t('Annotation name')} + {t('Name')} <span className="required">*</span> </div> <input diff --git a/superset-frontend/src/features/cssTemplates/CssTemplateModal.tsx b/superset-frontend/src/features/cssTemplates/CssTemplateModal.tsx index 73bbfe7555fc3..bd3c5b13a6b34 100644 --- a/superset-frontend/src/features/cssTemplates/CssTemplateModal.tsx +++ b/superset-frontend/src/features/cssTemplates/CssTemplateModal.tsx @@ -105,6 +105,9 @@ const CssTemplateModal: FunctionComponent<CssTemplateModalProps> = ({ const update_id = currentCssTemplate.id; delete currentCssTemplate.id; delete currentCssTemplate.created_by; + delete currentCssTemplate.changed_by; + delete currentCssTemplate.changed_on_delta_humanized; + updateResource(update_id, currentCssTemplate).then(response => { if (!response) { return; @@ -235,7 +238,7 @@ const CssTemplateModal: FunctionComponent<CssTemplateModalProps> = ({ </StyledCssTemplateTitle> <TemplateContainer> <div className="control-label"> - {t('CSS template name')} + {t('Name')} <span className="required">*</span> </div> <input diff --git a/superset-frontend/src/features/cssTemplates/types.ts b/superset-frontend/src/features/cssTemplates/types.ts index 1bb5b2e6593d3..5e7e1af97ae46 100644 --- a/superset-frontend/src/features/cssTemplates/types.ts +++ 
b/superset-frontend/src/features/cssTemplates/types.ts @@ -1,3 +1,5 @@ +import Owner from 'src/types/Owner'; + /** * Licensed to the Apache Software Foundation (ASF) under one * or more contributor license agreements. See the NOTICE file @@ -16,17 +18,12 @@ * specific language governing permissions and limitations * under the License. */ -type CreatedByUser = { - id: number; - first_name: string; - last_name: string; -}; - export type TemplateObject = { id?: number; changed_on_delta_humanized?: string; created_on?: string; - created_by?: CreatedByUser; + changed_by?: Owner; + created_by?: Owner; css?: string; template_name: string; }; diff --git a/superset-frontend/src/features/databases/DatabaseModal/ExtraOptions.tsx b/superset-frontend/src/features/databases/DatabaseModal/ExtraOptions.tsx index 207e197cd40d9..45706da5868da 100644 --- a/superset-frontend/src/features/databases/DatabaseModal/ExtraOptions.tsx +++ b/superset-frontend/src/features/databases/DatabaseModal/ExtraOptions.tsx @@ -202,7 +202,7 @@ const ExtraOptions = ({ /> </div> </StyledInputContainer> - <StyledInputContainer> + <StyledInputContainer css={no_margin_bottom}> <div className="input-container"> <IndeterminateCheckbox id="disable_data_preview" @@ -220,6 +220,22 @@ const ExtraOptions = ({ /> </div> </StyledInputContainer> + <StyledInputContainer> + <div className="input-container"> + <IndeterminateCheckbox + id="expand_rows" + indeterminate={false} + checked={!!extraJson?.schema_options?.expand_rows} + onChange={onExtraInputChange} + labelText={t('Enable row expansion in schemas')} + /> + <InfoTooltip + tooltip={t( + 'For Trino, describe full schemas of nested ROW types, expanding them with dotted paths', + )} + /> + </div> + </StyledInputContainer> </StyledExpandableForm> </StyledInputContainer> </Collapse.Panel> @@ -541,7 +557,7 @@ const ExtraOptions = ({ </div> <div className="input-container" data-test="version-spinbutton-test"> <input - type="number" + type="text" name="version" 
placeholder={t('Version number')} onChange={onExtraInputChange} @@ -550,8 +566,8 @@ const ExtraOptions = ({ </div> <div className="helper"> {t( - 'Specify the database version. This should be used with ' + - 'Presto in order to enable query cost estimation.', + 'Specify the database version. This is used with Presto for query cost ' + + 'estimation, and Dremio for syntax changes, among others.', )} </div> </StyledInputContainer> diff --git a/superset-frontend/src/features/databases/DatabaseModal/index.test.tsx b/superset-frontend/src/features/databases/DatabaseModal/index.test.tsx index bcd9fbe694706..ba443e0099457 100644 --- a/superset-frontend/src/features/databases/DatabaseModal/index.test.tsx +++ b/superset-frontend/src/features/databases/DatabaseModal/index.test.tsx @@ -674,7 +674,7 @@ describe('DatabaseModal', () => { const exposeInSQLLabCheckbox = screen.getByRole('checkbox', { name: /expose database in sql lab/i, }); - // This is both the checkbox and it's respective SVG + // This is both the checkbox and its respective SVG // const exposeInSQLLabCheckboxSVG = checkboxOffSVGs[0].parentElement; const exposeInSQLLabText = screen.getByText( /expose database in sql lab/i, @@ -721,6 +721,13 @@ describe('DatabaseModal', () => { /Disable SQL Lab data preview queries/i, ); + const enableRowExpansionCheckbox = screen.getByRole('checkbox', { + name: /enable row expansion in schemas/i, + }); + const enableRowExpansionText = screen.getByText( + /enable row expansion in schemas/i, + ); + // ---------- Assertions ---------- const visibleComponents = [ closeButton, @@ -737,6 +744,7 @@ describe('DatabaseModal', () => { checkboxOffSVGs[2], checkboxOffSVGs[3], checkboxOffSVGs[4], + checkboxOffSVGs[5], tooltipIcons[0], tooltipIcons[1], tooltipIcons[2], @@ -744,6 +752,7 @@ describe('DatabaseModal', () => { tooltipIcons[4], tooltipIcons[5], tooltipIcons[6], + tooltipIcons[7], exposeInSQLLabText, allowCTASText, allowCVASText, @@ -754,6 +763,7 @@ describe('DatabaseModal', () => { 
enableQueryCostEstimationText, allowDbExplorationText, disableSQLLabDataPreviewQueriesText, + enableRowExpansionText, ]; // These components exist in the DOM but are not visible const invisibleComponents = [ @@ -764,6 +774,7 @@ describe('DatabaseModal', () => { enableQueryCostEstimationCheckbox, allowDbExplorationCheckbox, disableSQLLabDataPreviewQueriesCheckbox, + enableRowExpansionCheckbox, ]; visibleComponents.forEach(component => { expect(component).toBeVisible(); @@ -771,8 +782,8 @@ describe('DatabaseModal', () => { invisibleComponents.forEach(component => { expect(component).not.toBeVisible(); }); - expect(checkboxOffSVGs).toHaveLength(5); - expect(tooltipIcons).toHaveLength(7); + expect(checkboxOffSVGs).toHaveLength(6); + expect(tooltipIcons).toHaveLength(8); }); test('renders the "Advanced" - PERFORMANCE tab correctly', async () => { diff --git a/superset-frontend/src/features/databases/DatabaseModal/index.tsx b/superset-frontend/src/features/databases/DatabaseModal/index.tsx index 0c1ac56369692..18c93f2bf462f 100644 --- a/superset-frontend/src/features/databases/DatabaseModal/index.tsx +++ b/superset-frontend/src/features/databases/DatabaseModal/index.tsx @@ -307,6 +307,18 @@ export function dbReducer( }), }; } + if (action.payload.name === 'expand_rows') { + return { + ...trimmedState, + extra: JSON.stringify({ + ...extraJson, + schema_options: { + ...extraJson?.schema_options, + [action.payload.name]: !!action.payload.value, + }, + }), + }; + } return { ...trimmedState, extra: JSON.stringify({ diff --git a/superset-frontend/src/features/databases/types.ts b/superset-frontend/src/features/databases/types.ts index e138a9143669e..1d616fa13c053 100644 --- a/superset-frontend/src/features/databases/types.ts +++ b/superset-frontend/src/features/databases/types.ts @@ -226,5 +226,8 @@ export interface ExtraJson { table_cache_timeout?: number; // in Performance }; // No field, holds schema and table timeout schemas_allowed_for_file_upload?: string[]; // in 
Security + schema_options?: { + expand_rows?: boolean; + }; version?: string; } diff --git a/superset-frontend/src/features/home/Menu.tsx b/superset-frontend/src/features/home/Menu.tsx index 56a2fd611ec95..67b72fc515b41 100644 --- a/superset-frontend/src/features/home/Menu.tsx +++ b/superset-frontend/src/features/home/Menu.tsx @@ -24,7 +24,7 @@ import { getUrlParam } from 'src/utils/urlUtils'; import { Row, Col, Grid } from 'src/components'; import { MainNav as DropdownMenu, MenuMode } from 'src/components/Menu'; import { Tooltip } from 'src/components/Tooltip'; -import { Link, useLocation } from 'react-router-dom'; +import { NavLink, useLocation } from 'react-router-dom'; import { GenericLink } from 'src/components/GenericLink/GenericLink'; import Icons from 'src/components/Icons'; import { useUiConfig } from 'src/components/UiConfigContext'; @@ -154,6 +154,29 @@ const globalStyles = (theme: SupersetTheme) => css` margin-left: ${theme.gridUnit * 1.75}px; } } + .ant-menu-item-selected { + background-color: transparent; + &:not(.ant-menu-item-active) { + color: inherit; + border-bottom-color: transparent; + & > a { + color: inherit; + } + } + } + .ant-menu-horizontal > .ant-menu-item:has(> .is-active) { + color: ${theme.colors.primary.base}; + border-bottom-color: ${theme.colors.primary.base}; + & > a { + color: ${theme.colors.primary.base}; + } + } + .ant-menu-vertical > .ant-menu-item:has(> .is-active) { + background-color: ${theme.colors.primary.light5}; + & > a { + color: ${theme.colors.primary.base}; + } + } `; const { SubMenu } = DropdownMenu; @@ -226,9 +249,9 @@ export function Menu({ if (url && isFrontendRoute) { return ( <DropdownMenu.Item key={label} role="presentation"> - <Link role="button" to={url}> + <NavLink role="button" to={url} activeClassName="is-active"> {label} - </Link> + </NavLink> </DropdownMenu.Item> ); } @@ -253,7 +276,13 @@ export function Menu({ return ( <DropdownMenu.Item key={`${child.label}`}> {child.isFrontendRoute ? 
( - <Link to={child.url || ''}>{child.label}</Link> + <NavLink + to={child.url || ''} + exact + activeClassName="is-active" + > + {child.label} + </NavLink> ) : ( <a href={child.url}>{child.label}</a> )} diff --git a/superset-frontend/src/features/reports/ReportModal/HeaderReportDropdown/index.tsx b/superset-frontend/src/features/reports/ReportModal/HeaderReportDropdown/index.tsx index b38d44a710b78..84d19bae69504 100644 --- a/superset-frontend/src/features/reports/ReportModal/HeaderReportDropdown/index.tsx +++ b/superset-frontend/src/features/reports/ReportModal/HeaderReportDropdown/index.tsx @@ -191,6 +191,12 @@ export default function HeaderReportDropDown({ const showReportSubMenu = report && setShowReportSubMenu && canAddReports(); + // @z-index-below-dashboard-header (100) - 1 = 99 + const dropdownOverlayStyle = { + zIndex: 99, + animationDuration: '0s', + }; + useEffect(() => { if (showReportSubMenu) { setShowReportSubMenu(true); @@ -288,6 +294,7 @@ export default function HeaderReportDropDown({ <> <NoAnimationDropdown overlay={menu()} + overlayStyle={dropdownOverlayStyle} trigger={['click']} getPopupContainer={(triggerNode: any) => triggerNode.closest('.action-button') diff --git a/superset-frontend/src/features/rls/RowLevelSecurityModal.tsx b/superset-frontend/src/features/rls/RowLevelSecurityModal.tsx index d7e7af7126b92..d14d48d0e51dc 100644 --- a/superset-frontend/src/features/rls/RowLevelSecurityModal.tsx +++ b/superset-frontend/src/features/rls/RowLevelSecurityModal.tsx @@ -385,10 +385,10 @@ function RowLevelSecurityModal(props: RowLevelSecurityModalProps) { <StyledInputContainer> <div className="control-label"> - {t('Tables')} <span className="required">*</span> + {t('Datasets')} <span className="required">*</span> <InfoTooltip tooltip={t( - 'These are the tables this filter will be applied to.', + 'These are the datasets this filter will be applied to.', )} /> </div> diff --git a/superset-frontend/src/features/tags/TagModal.test.tsx 
b/superset-frontend/src/features/tags/TagModal.test.tsx index 5f4fd4e2b9348..99b7a3365e4f0 100644 --- a/superset-frontend/src/features/tags/TagModal.test.tsx +++ b/superset-frontend/src/features/tags/TagModal.test.tsx @@ -56,10 +56,12 @@ test('renders correctly in edit mode', () => { changed_on_delta_humanized: '', created_on_delta_humanized: '', created_by: { + id: 1, first_name: 'joe', last_name: 'smith', }, changed_by: { + id: 2, first_name: 'tom', last_name: 'brown', }, diff --git a/superset-frontend/src/features/tags/TagModal.tsx b/superset-frontend/src/features/tags/TagModal.tsx index 4339d69130792..5057c8441d399 100644 --- a/superset-frontend/src/features/tags/TagModal.tsx +++ b/superset-frontend/src/features/tags/TagModal.tsx @@ -26,7 +26,7 @@ import { Input } from 'antd'; import { Divider } from 'src/components'; import Button from 'src/components/Button'; import { Tag } from 'src/views/CRUD/types'; -import { fetchObjects } from 'src/features/tags/tags'; +import { fetchObjectsByTagIds } from 'src/features/tags/tags'; const StyledModalBody = styled.div` .ant-select-dropdown { @@ -88,6 +88,14 @@ const TagModal: React.FC<TagModalProps> = ({ setSavedQueriesToTag([]); }; + const clearTagForm = () => { + setTagName(''); + setDescription(''); + setDashboardsToTag([]); + setChartsToTag([]); + setSavedQueriesToTag([]); + }; + useEffect(() => { const resourceMap: { [key: string]: TaggableResourceOption[] } = { [TaggableResources.Dashboard]: [], @@ -107,8 +115,8 @@ const TagModal: React.FC<TagModalProps> = ({ }; clearResources(); if (isEditMode) { - fetchObjects( - { tags: editTag.name, types: null }, + fetchObjectsByTagIds( + { tagIds: [editTag.id], types: null }, (data: Tag[]) => { data.forEach(updateResourceOptions); setDashboardsToTag(resourceMap[TaggableResources.Dashboard]); @@ -225,7 +233,9 @@ const TagModal: React.FC<TagModalProps> = ({ }) .then(({ json = {} }) => { refreshData(); + clearTagForm(); addSuccessToast(t('Tag updated')); + onHide(); }) .catch(err 
=> { addDangerToast(err.message || 'Error Updating Tag'); @@ -241,24 +251,19 @@ const TagModal: React.FC<TagModalProps> = ({ }) .then(({ json = {} }) => { refreshData(); + clearTagForm(); addSuccessToast(t('Tag created')); + onHide(); }) .catch(err => addDangerToast(err.message || 'Error Creating Tag')); } - onHide(); }; return ( <Modal title={modalTitle} onHide={() => { - if (clearOnHide) { - setTagName(''); - setDescription(''); - setDashboardsToTag([]); - setChartsToTag([]); - setSavedQueriesToTag([]); - } + if (clearOnHide) clearTagForm(); onHide(); }} show={show} diff --git a/superset-frontend/src/features/tags/tags.ts b/superset-frontend/src/features/tags/tags.ts index 45c4e88fc56e6..db172681cb90f 100644 --- a/superset-frontend/src/features/tags/tags.ts +++ b/superset-frontend/src/features/tags/tags.ts @@ -194,3 +194,20 @@ export function fetchObjects( .then(({ json }) => callback(json.result)) .catch(response => error(response)); } + +export function fetchObjectsByTagIds( + { + tagIds = [], + types, + }: { tagIds: number[] | undefined; types: string | null }, + callback: (json: JsonObject) => void, + error: (response: Response) => void, +) { + let url = `/api/v1/tag/get_objects/?tagIds=${tagIds}`; + if (types) { + url += `&types=${types}`; + } + SupersetClient.get({ endpoint: url }) + .then(({ json }) => callback(json.result)) + .catch(response => error(response)); +} diff --git a/superset-frontend/src/filters/components/Select/SelectFilterPlugin.test.tsx b/superset-frontend/src/filters/components/Select/SelectFilterPlugin.test.tsx index c035f81c01b89..99e6259871430 100644 --- a/superset-frontend/src/filters/components/Select/SelectFilterPlugin.test.tsx +++ b/superset-frontend/src/filters/components/Select/SelectFilterPlugin.test.tsx @@ -91,15 +91,6 @@ describe('SelectFilterPlugin', () => { test('Add multiple values with first render', async () => { getWrapper(); expect(setDataMask).toHaveBeenCalledWith({ - extraFormData: {}, - filterState: { - value: 
['boy'], - }, - }); - expect(setDataMask).toHaveBeenCalledWith({ - __cache: { - value: ['boy'], - }, extraFormData: { filters: [ { @@ -118,9 +109,6 @@ describe('SelectFilterPlugin', () => { userEvent.click(screen.getByTitle('girl')); expect(await screen.findByTitle(/girl/i)).toBeInTheDocument(); expect(setDataMask).toHaveBeenCalledWith({ - __cache: { - value: ['boy'], - }, extraFormData: { filters: [ { @@ -146,9 +134,6 @@ describe('SelectFilterPlugin', () => { }), ); expect(setDataMask).toHaveBeenCalledWith({ - __cache: { - value: ['boy'], - }, extraFormData: { adhoc_filters: [ { @@ -174,9 +159,6 @@ describe('SelectFilterPlugin', () => { }), ); expect(setDataMask).toHaveBeenCalledWith({ - __cache: { - value: ['boy'], - }, extraFormData: {}, filterState: { label: undefined, @@ -191,9 +173,6 @@ describe('SelectFilterPlugin', () => { expect(await screen.findByTitle('girl')).toBeInTheDocument(); userEvent.click(screen.getByTitle('girl')); expect(setDataMask).toHaveBeenCalledWith({ - __cache: { - value: ['boy'], - }, extraFormData: { filters: [ { @@ -216,9 +195,6 @@ describe('SelectFilterPlugin', () => { expect(await screen.findByRole('combobox')).toBeInTheDocument(); userEvent.click(screen.getByTitle(NULL_STRING)); expect(setDataMask).toHaveBeenLastCalledWith({ - __cache: { - value: ['boy'], - }, extraFormData: { filters: [ { diff --git a/superset-frontend/src/filters/components/Select/SelectFilterPlugin.tsx b/superset-frontend/src/filters/components/Select/SelectFilterPlugin.tsx index 7d8ab55fb5571..a4b9f5b05efaf 100644 --- a/superset-frontend/src/filters/components/Select/SelectFilterPlugin.tsx +++ b/superset-frontend/src/filters/components/Select/SelectFilterPlugin.tsx @@ -37,7 +37,6 @@ import { Select } from 'src/components'; import { SLOW_DEBOUNCE } from 'src/constants'; import { hasOption, propertyComparator } from 'src/components/Select/utils'; import { FilterBarOrientation } from 'src/dashboard/types'; -import { uniqWith, isEqual } from 'lodash'; import { 
PluginFilterSelectProps, SelectValue } from './types'; import { FilterPluginStyle, StatusMessage, StyledFormItem } from '../common'; import { getDataRecordFormatter, getSelectExtraFormData } from '../../utils'; @@ -46,15 +45,11 @@ type DataMaskAction = | { type: 'ownState'; ownState: JsonObject } | { type: 'filterState'; - __cache: JsonObject; extraFormData: ExtraFormData; filterState: { value: SelectValue; label?: string }; }; -function reducer( - draft: DataMask & { __cache?: JsonObject }, - action: DataMaskAction, -) { +function reducer(draft: DataMask, action: DataMaskAction) { switch (action.type) { case 'ownState': draft.ownState = { @@ -63,10 +58,18 @@ function reducer( }; return draft; case 'filterState': - draft.extraFormData = action.extraFormData; - // eslint-disable-next-line no-underscore-dangle - draft.__cache = action.__cache; - draft.filterState = { ...draft.filterState, ...action.filterState }; + if ( + JSON.stringify(draft.extraFormData) !== + JSON.stringify(action.extraFormData) + ) { + draft.extraFormData = action.extraFormData; + } + if ( + JSON.stringify(draft.filterState) !== JSON.stringify(action.filterState) + ) { + draft.filterState = { ...draft.filterState, ...action.filterState }; + } + return draft; default: return draft; @@ -130,7 +133,6 @@ export default function PluginFilterSelect(props: PluginFilterSelectProps) { const suffix = inverseSelection && values?.length ? 
t(' (excluded)') : ''; dispatchDataMask({ type: 'filterState', - __cache: filterState, extraFormData: getSelectExtraFormData( col, values, @@ -219,16 +221,13 @@ export default function PluginFilterSelect(props: PluginFilterSelectProps) { }, [filterState.validateMessage, filterState.validateStatus]); const uniqueOptions = useMemo(() => { - const allOptions = [...data]; - return uniqWith(allOptions, isEqual).map(row => { - const [value] = groupby.map(col => row[col]); - return { - label: labelFormatter(value, datatype), - value, - isNewOption: false, - }; - }); - }, [data, datatype, groupby, labelFormatter]); + const allOptions = new Set([...data.map(el => el[col])]); + return [...allOptions].map((value: string) => ({ + label: labelFormatter(value, datatype), + value, + isNewOption: false, + })); + }, [data, datatype, col, labelFormatter]); const options = useMemo(() => { if (search && !multiSelect && !hasOption(search, uniqueOptions, true)) { diff --git a/superset-frontend/src/filters/components/common.ts b/superset-frontend/src/filters/components/common.ts index af1fe9c791761..cb6d7f22f14be 100644 --- a/superset-frontend/src/filters/components/common.ts +++ b/superset-frontend/src/filters/components/common.ts @@ -20,9 +20,11 @@ import { styled } from '@superset-ui/core'; import { PluginFilterStylesProps } from './types'; import FormItem from '../../components/Form/FormItem'; +export const RESPONSIVE_WIDTH = 0; + export const FilterPluginStyle = styled.div<PluginFilterStylesProps>` min-height: ${({ height }) => height}px; - width: ${({ width }) => width}px; + width: ${({ width }) => (width === RESPONSIVE_WIDTH ? 
'100%' : `${width}px`)}; `; export const StyledFormItem = styled(FormItem)` diff --git a/superset-frontend/src/hooks/apiResources/dashboards.ts b/superset-frontend/src/hooks/apiResources/dashboards.ts index b21cc668c06a1..61896ba1309dc 100644 --- a/superset-frontend/src/hooks/apiResources/dashboards.ts +++ b/superset-frontend/src/hooks/apiResources/dashboards.ts @@ -31,6 +31,7 @@ export const useDashboard = (idOrSlug: string | number) => (dashboard.json_metadata && JSON.parse(dashboard.json_metadata)) || {}, position_data: dashboard.position_json && JSON.parse(dashboard.position_json), + owners: dashboard.owners || [], }), ); diff --git a/superset-frontend/src/hooks/apiResources/sqlEditorTabs.test.ts b/superset-frontend/src/hooks/apiResources/sqlEditorTabs.test.ts new file mode 100644 index 0000000000000..d0f2230f13d90 --- /dev/null +++ b/superset-frontend/src/hooks/apiResources/sqlEditorTabs.test.ts @@ -0,0 +1,99 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +import fetchMock from 'fetch-mock'; +import { act, renderHook } from '@testing-library/react-hooks'; +import { + createWrapper, + defaultStore as store, +} from 'spec/helpers/testing-library'; +import { api } from 'src/hooks/apiResources/queryApi'; +import { LatestQueryEditorVersion } from 'src/SqlLab/types'; +import { useUpdateSqlEditorTabMutation } from './sqlEditorTabs'; + +const expectedQueryEditor = { + version: LatestQueryEditorVersion, + id: '123', + dbId: 456, + name: 'tab 1', + sql: 'SELECT * from example_table', + schema: 'my_schema', + templateParams: '{"a": 1, "v": "str"}', + queryLimit: 1000, + remoteId: null, + autorun: false, + hideLeftBar: false, + updatedAt: Date.now(), +}; + +afterEach(() => { + fetchMock.reset(); + act(() => { + store.dispatch(api.util.resetApiState()); + }); +}); + +test('puts api request with formData', async () => { + const tabStateMutationApiRoute = `glob:*/tabstateview/${expectedQueryEditor.id}`; + fetchMock.put(tabStateMutationApiRoute, 200); + const { result, waitFor } = renderHook( + () => useUpdateSqlEditorTabMutation(), + { + wrapper: createWrapper({ + useRedux: true, + store, + }), + }, + ); + act(() => { + result.current[0]({ + queryEditor: expectedQueryEditor, + }); + }); + await waitFor(() => + expect(fetchMock.calls(tabStateMutationApiRoute).length).toBe(1), + ); + const formData = fetchMock.calls(tabStateMutationApiRoute)[0][1] + ?.body as FormData; + expect(formData.get('database_id')).toBe(`${expectedQueryEditor.dbId}`); + expect(formData.get('schema')).toBe( + JSON.stringify(`${expectedQueryEditor.schema}`), + ); + expect(formData.get('sql')).toBe( + JSON.stringify(`${expectedQueryEditor.sql}`), + ); + expect(formData.get('label')).toBe( + JSON.stringify(`${expectedQueryEditor.name}`), + ); + expect(formData.get('query_limit')).toBe(`${expectedQueryEditor.queryLimit}`); + expect(formData.has('latest_query_id')).toBe(false); + expect(formData.get('template_params')).toBe( + 
JSON.stringify(`${expectedQueryEditor.templateParams}`), + ); + expect(formData.get('hide_left_bar')).toBe( + `${expectedQueryEditor.hideLeftBar}`, + ); + expect(formData.get('extra_json')).toBe( + JSON.stringify( + JSON.stringify({ + updatedAt: expectedQueryEditor.updatedAt, + version: LatestQueryEditorVersion, + }), + ), + ); +}); diff --git a/superset-frontend/src/hooks/apiResources/sqlEditorTabs.ts b/superset-frontend/src/hooks/apiResources/sqlEditorTabs.ts new file mode 100644 index 0000000000000..71e0cf2936e5a --- /dev/null +++ b/superset-frontend/src/hooks/apiResources/sqlEditorTabs.ts @@ -0,0 +1,70 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +import { pickBy } from 'lodash'; +import { QueryEditor, LatestQueryEditorVersion } from 'src/SqlLab/types'; +import { api, JsonResponse } from './queryApi'; + +export type EditorMutationParams = { + queryEditor: QueryEditor; + extra?: Record<string, any>; +}; + +const sqlEditorApi = api.injectEndpoints({ + endpoints: builder => ({ + updateSqlEditorTab: builder.mutation<JsonResponse, EditorMutationParams>({ + query: ({ + queryEditor: { + version = LatestQueryEditorVersion, + id, + dbId, + schema, + queryLimit, + sql, + name, + latestQueryId, + hideLeftBar, + templateParams, + autorun, + updatedAt, + }, + extra, + }) => ({ + method: 'PUT', + endpoint: encodeURI(`/tabstateview/${id}`), + postPayload: pickBy( + { + database_id: dbId, + schema, + sql, + label: name, + query_limit: queryLimit, + latest_query_id: latestQueryId, + template_params: templateParams, + hide_left_bar: hideLeftBar, + autorun, + extra_json: JSON.stringify({ updatedAt, version, ...extra }), + }, + value => value !== undefined, + ), + }), + }), + }), +}); + +export const { useUpdateSqlEditorTabMutation } = sqlEditorApi; diff --git a/superset-frontend/src/hooks/apiResources/sqlLab.ts b/superset-frontend/src/hooks/apiResources/sqlLab.ts index 123db414e2681..16e8ffde6c609 100644 --- a/superset-frontend/src/hooks/apiResources/sqlLab.ts +++ b/superset-frontend/src/hooks/apiResources/sqlLab.ts @@ -50,7 +50,7 @@ export type InitialState = { template_params: string | null; hide_left_bar?: boolean; saved_query: { id: number } | null; - extra_json?: object; + extra_json?: Record<string, any>; }; databases: object[]; queries: Record< diff --git a/superset-frontend/src/hooks/useDebounceValue.ts b/superset-frontend/src/hooks/useDebounceValue.ts index 711b2dbd5a98c..862c83770779d 100644 --- a/superset-frontend/src/hooks/useDebounceValue.ts +++ b/superset-frontend/src/hooks/useDebounceValue.ts @@ -19,8 +19,8 @@ import { useState, useEffect } from 'react'; import { FAST_DEBOUNCE } from 'src/constants'; 
-export function useDebounceValue(value: string, delay = FAST_DEBOUNCE) { - const [debouncedValue, setDebouncedValue] = useState(value); +export function useDebounceValue<T>(value: T, delay = FAST_DEBOUNCE) { + const [debouncedValue, setDebouncedValue] = useState<T>(value); useEffect(() => { const handler: NodeJS.Timeout = setTimeout(() => { diff --git a/superset-frontend/src/pages/AlertReportList/index.tsx b/superset-frontend/src/pages/AlertReportList/index.tsx index b0cd0a46226db..c6d14d186f100 100644 --- a/superset-frontend/src/pages/AlertReportList/index.tsx +++ b/superset-frontend/src/pages/AlertReportList/index.tsx @@ -53,6 +53,8 @@ import { isUserAdmin } from 'src/dashboard/util/permissionUtils'; import Owner from 'src/types/Owner'; import AlertReportModal from 'src/features/alerts/AlertReportModal'; import { AlertObject, AlertState } from 'src/features/alerts/types'; +import { ModifiedInfo } from 'src/components/AuditInfo'; +import { QueryObjectColumns } from 'src/views/CRUD/types'; const extensionsRegistry = getExtensionsRegistry(); @@ -303,18 +305,6 @@ function AlertList({ disableSortBy: true, size: 'xl', }, - { - Cell: ({ - row: { - original: { created_by }, - }, - }: any) => - created_by ? 
`${created_by.first_name} ${created_by.last_name}` : '', - Header: t('Created by'), - id: 'created_by', - disableSortBy: true, - size: 'xl', - }, { Cell: ({ row: { @@ -329,10 +319,13 @@ function AlertList({ { Cell: ({ row: { - original: { changed_on_delta_humanized: changedOn }, + original: { + changed_on_delta_humanized: changedOn, + changed_by: changedBy, + }, }, - }: any) => <span className="no-wrap">{changedOn}</span>, - Header: t('Modified'), + }: any) => <ModifiedInfo date={changedOn} user={changedBy} />, + Header: t('Last modified'), accessor: 'changed_on_delta_humanized', size: 'xl', }, @@ -407,6 +400,10 @@ function AlertList({ disableSortBy: true, size: 'xl', }, + { + accessor: QueryObjectColumns.changed_by, + hidden: true, + }, ], [canDelete, canEdit, isReportEnabled, toggleActive], ); @@ -448,6 +445,13 @@ function AlertList({ const filters: Filters = useMemo( () => [ + { + Header: t('Name'), + key: 'search', + id: 'name', + input: 'search', + operator: FilterOperator.contains, + }, { Header: t('Owner'), key: 'owner', @@ -465,23 +469,6 @@ function AlertList({ ), paginate: true, }, - { - Header: t('Created by'), - key: 'created_by', - id: 'created_by', - input: 'select', - operator: FilterOperator.relationOneMany, - unfilteredLabel: 'All', - fetchSelects: createFetchRelated( - 'report', - 'created_by', - createErrorHandler(errMsg => - t('An error occurred while fetching created by values: %s', errMsg), - ), - user, - ), - paginate: true, - }, { Header: t('Status'), key: 'status', @@ -504,11 +491,24 @@ function AlertList({ ], }, { - Header: t('Search'), - key: 'search', - id: 'name', - input: 'search', - operator: FilterOperator.contains, + Header: t('Modified by'), + key: 'changed_by', + id: 'changed_by', + input: 'select', + operator: FilterOperator.relationOneMany, + unfilteredLabel: t('All'), + fetchSelects: createFetchRelated( + 'report', + 'changed_by', + createErrorHandler(errMsg => + t( + 'An error occurred while fetching dataset datasource values: 
%s', + errMsg, + ), + ), + user, + ), + paginate: true, + }, ], [], diff --git a/superset-frontend/src/pages/AllEntities/index.tsx b/superset-frontend/src/pages/AllEntities/index.tsx index ca815795d6fbb..b94cab846dfe4 100644 --- a/superset-frontend/src/pages/AllEntities/index.tsx +++ b/superset-frontend/src/pages/AllEntities/index.tsx @@ -33,8 +33,9 @@ import { PageHeaderWithActions } from 'src/components/PageHeaderWithActions'; import { Tag } from 'src/views/CRUD/types'; import TagModal from 'src/features/tags/TagModal'; import withToasts, { useToasts } from 'src/components/MessageToasts/withToasts'; -import { fetchObjects, fetchSingleTag } from 'src/features/tags/tags'; +import { fetchObjectsByTagIds, fetchSingleTag } from 'src/features/tags/tags'; import Loading from 'src/components/Loading'; +import getOwnerName from 'src/utils/getOwnerName'; interface TaggedObject { id: number; @@ -132,7 +133,7 @@ function AllEntities() { const owner: Owner = { type: MetadataType.OWNER, - createdBy: `${tag?.created_by.first_name} ${tag?.created_by.last_name}`, + createdBy: getOwnerName(tag?.created_by), createdOn: tag?.created_on_delta_humanized || '', }; items.push(owner); @@ -140,14 +141,18 @@ const lastModified: LastModified = { type: MetadataType.LAST_MODIFIED, value: tag?.changed_on_delta_humanized || '', - modifiedBy: `${tag?.changed_by.first_name} ${tag?.changed_by.last_name}`, + modifiedBy: getOwnerName(tag?.changed_by), }; items.push(lastModified); const fetchTaggedObjects = () => { setLoading(true); - fetchObjects( - { tags: tag?.name || '', types: null }, + if (!tag) { + addDangerToast('Error: tag object is not referenced!'); + return; + } + fetchObjectsByTagIds( + { tagIds: [tag.id], types: null }, (data: TaggedObject[]) => { const objects = { dashboard: [], chart: [], query: [] }; data.forEach(function (object) { diff --git a/superset-frontend/src/pages/AnnotationLayerList/index.tsx
b/superset-frontend/src/pages/AnnotationLayerList/index.tsx index fc909538c0d94..fff5743b5ab90 100644 --- a/superset-frontend/src/pages/AnnotationLayerList/index.tsx +++ b/superset-frontend/src/pages/AnnotationLayerList/index.tsx @@ -21,7 +21,6 @@ import React, { useMemo, useState } from 'react'; import rison from 'rison'; import { t, SupersetClient } from '@superset-ui/core'; import { Link, useHistory } from 'react-router-dom'; -import moment from 'moment'; import { useListViewResource } from 'src/views/CRUD/hooks'; import { createFetchRelated, createErrorHandler } from 'src/views/CRUD/utils'; import withToasts from 'src/components/MessageToasts/withToasts'; @@ -36,9 +35,10 @@ import DeleteModal from 'src/components/DeleteModal'; import ConfirmStatusChange from 'src/components/ConfirmStatusChange'; import AnnotationLayerModal from 'src/features/annotationLayers/AnnotationLayerModal'; import { AnnotationLayerObject } from 'src/features/annotationLayers/types'; +import { ModifiedInfo } from 'src/components/AuditInfo'; +import { QueryObjectColumns } from 'src/views/CRUD/types'; const PAGE_SIZE = 25; -const MOMENT_FORMAT = 'MMM DD, YYYY'; interface AnnotationLayersListProps { addDangerToast: (msg: string) => void; @@ -156,65 +156,16 @@ function AnnotationLayersList({ { Cell: ({ row: { - original: { changed_on: changedOn }, + original: { + changed_on_delta_humanized: changedOn, + changed_by: changedBy, + }, }, - }: any) => { - const date = new Date(changedOn); - const utc = new Date( - Date.UTC( - date.getFullYear(), - date.getMonth(), - date.getDate(), - date.getHours(), - date.getMinutes(), - date.getSeconds(), - date.getMilliseconds(), - ), - ); - - return moment(utc).format(MOMENT_FORMAT); - }, + }: any) => <ModifiedInfo date={changedOn} user={changedBy} />, Header: t('Last modified'), accessor: 'changed_on', size: 'xl', }, - { - Cell: ({ - row: { - original: { created_on: createdOn }, - }, - }: any) => { - const date = new Date(createdOn); - const utc = new Date( 
- Date.UTC( - date.getFullYear(), - date.getMonth(), - date.getDate(), - date.getHours(), - date.getMinutes(), - date.getSeconds(), - date.getMilliseconds(), - ), - ); - - return moment(utc).format(MOMENT_FORMAT); - }, - Header: t('Created on'), - accessor: 'created_on', - size: 'xl', - }, - { - accessor: 'created_by', - disableSortBy: true, - Header: t('Created by'), - Cell: ({ - row: { - original: { created_by: createdBy }, - }, - }: any) => - createdBy ? `${createdBy.first_name} ${createdBy.last_name}` : '', - size: 'xl', - }, { Cell: ({ row: { original } }: any) => { const handleEdit = () => handleAnnotationLayerEdit(original); @@ -249,6 +200,10 @@ function AnnotationLayersList({ hidden: !canEdit && !canDelete, size: 'xl', }, + { + accessor: QueryObjectColumns.changed_by, + hidden: true, + }, ], [canDelete, canCreate], ); @@ -280,15 +235,22 @@ function AnnotationLayersList({ const filters: Filters = useMemo( () => [ { - Header: t('Created by'), - key: 'created_by', - id: 'created_by', + Header: t('Name'), + key: 'search', + id: 'name', + input: 'search', + operator: FilterOperator.contains, + }, + { + Header: t('Changed by'), + key: 'changed_by', + id: 'changed_by', input: 'select', operator: FilterOperator.relationOneMany, unfilteredLabel: t('All'), fetchSelects: createFetchRelated( 'annotation_layer', - 'created_by', + 'changed_by', createErrorHandler(errMsg => t( 'An error occurred while fetching dataset datasource values: %s', @@ -299,13 +261,6 @@ function AnnotationLayersList({ ), paginate: true, }, - { - Header: t('Search'), - key: 'search', - id: 'name', - input: 'search', - operator: FilterOperator.contains, - }, ], [], ); diff --git a/superset-frontend/src/pages/AnnotationList/index.tsx b/superset-frontend/src/pages/AnnotationList/index.tsx index 980a18ba72e49..e04b48080f32a 100644 --- a/superset-frontend/src/pages/AnnotationList/index.tsx +++ b/superset-frontend/src/pages/AnnotationList/index.tsx @@ -154,7 +154,7 @@ function AnnotationList({ () => [ { 
accessor: 'short_descr', - Header: t('Label'), + Header: t('Name'), }, { accessor: 'long_descr', diff --git a/superset-frontend/src/pages/ChartList/index.tsx b/superset-frontend/src/pages/ChartList/index.tsx index d13113158e778..cbda387681b23 100644 --- a/superset-frontend/src/pages/ChartList/index.tsx +++ b/superset-frontend/src/pages/ChartList/index.tsx @@ -29,7 +29,6 @@ import { import React, { useState, useMemo, useCallback } from 'react'; import rison from 'rison'; import { uniqBy } from 'lodash'; -import moment from 'moment'; import { useSelector } from 'react-redux'; import { createErrorHandler, @@ -65,15 +64,16 @@ import Tag from 'src/types/TagType'; import { Tooltip } from 'src/components/Tooltip'; import Icons from 'src/components/Icons'; import { nativeFilterGate } from 'src/dashboard/components/nativeFilters/utils'; -import setupPlugins from 'src/setup/setupPlugins'; import InfoTooltip from 'src/components/InfoTooltip'; import CertifiedBadge from 'src/components/CertifiedBadge'; import { GenericLink } from 'src/components/GenericLink/GenericLink'; -import Owner from 'src/types/Owner'; import { loadTags } from 'src/components/Tags/utils'; +import FacePile from 'src/components/FacePile'; import ChartCard from 'src/features/charts/ChartCard'; import { UserWithPermissionsAndRoles } from 'src/types/bootstrapTypes'; import { findPermission } from 'src/utils/findPermission'; +import { ModifiedInfo } from 'src/components/AuditInfo'; +import { QueryObjectColumns } from 'src/views/CRUD/types'; const FlexRowContainer = styled.div` align-items: center; @@ -105,7 +105,6 @@ const CONFIRM_OVERWRITE_MESSAGE = t( 'sure you want to overwrite?', ); -setupPlugins(); const registry = getChartMetadataRegistry(); const createFetchDatasets = async ( @@ -245,10 +244,6 @@ function ChartList(props: ChartListProps) { }); setPreparingExport(true); }; - const changedByName = (lastSavedBy: Owner) => - lastSavedBy?.first_name - ? 
`${lastSavedBy?.first_name} ${lastSavedBy?.last_name}` - : null; function handleBulkChartDelete(chartsToDelete: Chart[]) { SupersetClient.delete({ @@ -366,7 +361,7 @@ function ChartList(props: ChartListProps) { )} </FlexRowContainer> ), - Header: t('Chart'), + Header: t('Name'), accessor: 'slice_name', }, { @@ -375,7 +370,7 @@ function ChartList(props: ChartListProps) { original: { viz_type: vizType }, }, }: any) => registry.get(vizType)?.name || vizType, - Header: t('Visualization type'), + Header: t('Type'), accessor: 'viz_type', size: 'xxl', }, @@ -438,44 +433,27 @@ function ChartList(props: ChartListProps) { { Cell: ({ row: { - original: { last_saved_by: lastSavedBy }, + original: { owners = [] }, }, - }: any) => <>{changedByName(lastSavedBy)}</>, - Header: t('Modified by'), - accessor: 'last_saved_by.first_name', + }: any) => <FacePile users={owners} />, + Header: t('Owners'), + accessor: 'owners', + disableSortBy: true, size: 'xl', }, { Cell: ({ row: { - original: { last_saved_at: lastSavedAt }, + original: { + changed_on_delta_humanized: changedOn, + changed_by: changedBy, + }, }, - }: any) => ( - <span className="no-wrap"> - {lastSavedAt ? moment.utc(lastSavedAt).fromNow() : null} - </span> - ), + }: any) => <ModifiedInfo date={changedOn} user={changedBy} />, Header: t('Last modified'), accessor: 'last_saved_at', size: 'xl', }, - { - accessor: 'owners', - hidden: true, - disableSortBy: true, - }, - { - Cell: ({ - row: { - original: { created_by: createdBy }, - }, - }: any) => - createdBy ? 
`${createdBy.first_name} ${createdBy.last_name}` : '', - Header: t('Created by'), - accessor: 'created_by', - disableSortBy: true, - size: 'xl', - }, { Cell: ({ row: { original } }: any) => { const handleDelete = () => @@ -563,6 +541,10 @@ function ChartList(props: ChartListProps) { disableSortBy: true, hidden: !canEdit && !canDelete, }, + { + accessor: QueryObjectColumns.changed_by, + hidden: true, + }, ], [ userId, @@ -597,58 +579,14 @@ function ChartList(props: ChartListProps) { const filters: Filters = useMemo(() => { const filters_list = [ { - Header: t('Search'), + Header: t('Name'), key: 'search', id: 'slice_name', input: 'search', operator: FilterOperator.chartAllText, }, { - Header: t('Owner'), - key: 'owner', - id: 'owners', - input: 'select', - operator: FilterOperator.relationManyMany, - unfilteredLabel: t('All'), - fetchSelects: createFetchRelated( - 'chart', - 'owners', - createErrorHandler(errMsg => - addDangerToast( - t( - 'An error occurred while fetching chart owners values: %s', - errMsg, - ), - ), - ), - props.user, - ), - paginate: true, - }, - { - Header: t('Created by'), - key: 'created_by', - id: 'created_by', - input: 'select', - operator: FilterOperator.relationOneMany, - unfilteredLabel: t('All'), - fetchSelects: createFetchRelated( - 'chart', - 'created_by', - createErrorHandler(errMsg => - addDangerToast( - t( - 'An error occurred while fetching chart created by values: %s', - errMsg, - ), - ), - ), - props.user, - ), - paginate: true, - }, - { - Header: t('Chart type'), + Header: t('Type'), key: 'viz_type', id: 'viz_type', input: 'select', @@ -683,8 +621,43 @@ function ChartList(props: ChartListProps) { fetchSelects: createFetchDatasets, paginate: true, }, + ...(isFeatureEnabled(FeatureFlag.TAGGING_SYSTEM) && canReadTag + ? 
[ + { + Header: t('Tag'), + key: 'tags', + id: 'tags', + input: 'select', + operator: FilterOperator.chartTags, + unfilteredLabel: t('All'), + fetchSelects: loadTags, + }, + ] + : []), { - Header: t('Dashboards'), + Header: t('Owner'), + key: 'owner', + id: 'owners', + input: 'select', + operator: FilterOperator.relationManyMany, + unfilteredLabel: t('All'), + fetchSelects: createFetchRelated( + 'chart', + 'owners', + createErrorHandler(errMsg => + addDangerToast( + t( + 'An error occurred while fetching chart owners values: %s', + errMsg, + ), + ), + ), + props.user, + ), + paginate: true, + }, + { + Header: t('Dashboard'), key: 'dashboards', id: 'dashboards', input: 'select', @@ -707,18 +680,27 @@ function ChartList(props: ChartListProps) { { label: t('No'), value: false }, ], }, - ] as Filters; - if (isFeatureEnabled(FeatureFlag.TAGGING_SYSTEM) && canReadTag) { - filters_list.push({ - Header: t('Tags'), - key: 'tags', - id: 'tags', + { + Header: t('Modified by'), + key: 'changed_by', + id: 'changed_by', input: 'select', - operator: FilterOperator.chartTags, + operator: FilterOperator.relationOneMany, unfilteredLabel: t('All'), - fetchSelects: loadTags, - }); - } + fetchSelects: createFetchRelated( + 'chart', + 'changed_by', + createErrorHandler(errMsg => + t( + 'An error occurred while fetching dataset datasource values: %s', + errMsg, + ), + ), + props.user, + ), + paginate: true, + }, + ] as Filters; return filters_list; }, [addDangerToast, favoritesFilter, props.user]); diff --git a/superset-frontend/src/pages/CssTemplateList/index.tsx b/superset-frontend/src/pages/CssTemplateList/index.tsx index f777f8e743ee4..b77217b22f7eb 100644 --- a/superset-frontend/src/pages/CssTemplateList/index.tsx +++ b/superset-frontend/src/pages/CssTemplateList/index.tsx @@ -21,13 +21,11 @@ import React, { useMemo, useState } from 'react'; import { t, SupersetClient } from '@superset-ui/core'; import rison from 'rison'; -import moment from 'moment'; import { useListViewResource } 
from 'src/views/CRUD/hooks'; -import { createFetchRelated, createErrorHandler } from 'src/views/CRUD/utils'; +import { createErrorHandler, createFetchRelated } from 'src/views/CRUD/utils'; import withToasts from 'src/components/MessageToasts/withToasts'; import SubMenu, { SubMenuProps } from 'src/features/home/SubMenu'; import DeleteModal from 'src/components/DeleteModal'; -import { Tooltip } from 'src/components/Tooltip'; import ConfirmStatusChange from 'src/components/ConfirmStatusChange'; import ActionsBar, { ActionProps } from 'src/components/ListView/ActionsBar'; import ListView, { @@ -37,6 +35,8 @@ import ListView, { } from 'src/components/ListView'; import CssTemplateModal from 'src/features/cssTemplates/CssTemplateModal'; import { TemplateObject } from 'src/features/cssTemplates/types'; +import { ModifiedInfo } from 'src/components/AuditInfo'; +import { QueryObjectColumns } from 'src/views/CRUD/types'; const PAGE_SIZE = 25; @@ -138,66 +138,12 @@ function CssTemplatesList({ changed_by: changedBy, }, }, - }: any) => { - let name = 'null'; - - if (changedBy) { - name = `${changedBy.first_name} ${changedBy.last_name}`; - } - - return ( - <Tooltip - id="allow-run-async-header-tooltip" - title={t('Last modified by %s', name)} - placement="right" - > - <span>{changedOn}</span> - </Tooltip> - ); - }, + }: any) => <ModifiedInfo date={changedOn} user={changedBy} />, Header: t('Last modified'), accessor: 'changed_on_delta_humanized', size: 'xl', disableSortBy: true, }, - { - Cell: ({ - row: { - original: { created_on: createdOn }, - }, - }: any) => { - const date = new Date(createdOn); - const utc = new Date( - Date.UTC( - date.getFullYear(), - date.getMonth(), - date.getDate(), - date.getHours(), - date.getMinutes(), - date.getSeconds(), - date.getMilliseconds(), - ), - ); - - return moment(utc).fromNow(); - }, - Header: t('Created on'), - accessor: 'created_on', - size: 'xl', - disableSortBy: true, - }, - { - accessor: 'created_by', - disableSortBy: true, - Header: 
t('Created by'), - Cell: ({ - row: { - original: { created_by: createdBy }, - }, - }: any) => - createdBy ? `${createdBy.first_name} ${createdBy.last_name}` : '', - size: 'xl', - }, { Cell: ({ row: { original } }: any) => { const handleEdit = () => handleCssTemplateEdit(original); @@ -232,6 +178,10 @@ function CssTemplatesList({ hidden: !canEdit && !canDelete, size: 'xl', }, + { + accessor: QueryObjectColumns.changed_by, + hidden: true, + }, ], [canDelete, canCreate], ); @@ -270,15 +220,22 @@ function CssTemplatesList({ const filters: Filters = useMemo( () => [ { - Header: t('Created by'), - key: 'created_by', - id: 'created_by', + Header: t('Name'), + key: 'search', + id: 'template_name', + input: 'search', + operator: FilterOperator.contains, + }, + { + Header: t('Modified by'), + key: 'changed_by', + id: 'changed_by', input: 'select', operator: FilterOperator.relationOneMany, unfilteredLabel: t('All'), fetchSelects: createFetchRelated( 'css_template', - 'created_by', + 'changed_by', createErrorHandler(errMsg => t( 'An error occurred while fetching dataset datasource values: %s', @@ -289,13 +246,6 @@ function CssTemplatesList({ ), paginate: true, }, - { - Header: t('Search'), - key: 'search', - id: 'template_name', - input: 'search', - operator: FilterOperator.contains, - }, ], [], ); diff --git a/superset-frontend/src/pages/DashboardList/index.tsx b/superset-frontend/src/pages/DashboardList/index.tsx index 6542d85129722..e82b70185991e 100644 --- a/superset-frontend/src/pages/DashboardList/index.tsx +++ b/superset-frontend/src/pages/DashboardList/index.tsx @@ -57,13 +57,17 @@ import { Tooltip } from 'src/components/Tooltip'; import ImportModelsModal from 'src/components/ImportModal/index'; import Dashboard from 'src/dashboard/containers/Dashboard'; -import { Dashboard as CRUDDashboard } from 'src/views/CRUD/types'; +import { + Dashboard as CRUDDashboard, + QueryObjectColumns, +} from 'src/views/CRUD/types'; import CertifiedBadge from 
'src/components/CertifiedBadge'; import { loadTags } from 'src/components/Tags/utils'; import DashboardCard from 'src/features/dashboards/DashboardCard'; import { DashboardStatus } from 'src/features/dashboards/types'; import { UserWithPermissionsAndRoles } from 'src/types/bootstrapTypes'; import { findPermission } from 'src/utils/findPermission'; +import { ModifiedInfo } from 'src/components/AuditInfo'; const PAGE_SIZE = 25; const PASSWORDS_NEEDED_MESSAGE = t( @@ -108,11 +112,7 @@ const Actions = styled.div` `; function DashboardList(props: DashboardListProps) { - const { - addDangerToast, - addSuccessToast, - user: { userId }, - } = props; + const { addDangerToast, addSuccessToast, user } = props; const { roles } = useSelector<any, UserWithPermissionsAndRoles>( state => state.user, @@ -178,7 +178,7 @@ function DashboardList(props: DashboardListProps) { }; // TODO: Fix usage of localStorage keying on the user id - const userKey = dangerouslyGetItemDoNotUse(userId?.toString(), null); + const userKey = dangerouslyGetItemDoNotUse(user?.userId?.toString(), null); const canCreate = hasPerm('can_write'); const canEdit = hasPerm('can_write'); @@ -274,7 +274,7 @@ function DashboardList(props: DashboardListProps) { original: { id }, }, }: any) => - userId && ( + user?.userId && ( <FaveStar itemId={id} saveFaveStar={saveFavoriteStatus} @@ -285,7 +285,7 @@ function DashboardList(props: DashboardListProps) { id: 'id', disableSortBy: true, size: 'xs', - hidden: !userId, + hidden: !user?.userId, }, { Cell: ({ @@ -310,9 +310,20 @@ function DashboardList(props: DashboardListProps) { {dashboardTitle} </Link> ), - Header: t('Title'), + Header: t('Name'), accessor: 'dashboard_title', }, + { + Cell: ({ + row: { + original: { status }, + }, + }: any) => + status === DashboardStatus.PUBLISHED ? 
t('Published') : t('Draft'), + Header: t('Status'), + accessor: 'published', + size: 'xl', + }, { Cell: ({ row: { @@ -341,55 +352,25 @@ function DashboardList(props: DashboardListProps) { { Cell: ({ row: { - original: { changed_by_name: changedByName }, - }, - }: any) => <>{changedByName}</>, - Header: t('Modified by'), - accessor: 'changed_by.first_name', - size: 'xl', - }, - { - Cell: ({ - row: { - original: { status }, - }, - }: any) => - status === DashboardStatus.PUBLISHED ? t('Published') : t('Draft'), - Header: t('Status'), - accessor: 'published', - size: 'xl', - }, - { - Cell: ({ - row: { - original: { changed_on_delta_humanized: changedOn }, - }, - }: any) => <span className="no-wrap">{changedOn}</span>, - Header: t('Modified'), - accessor: 'changed_on_delta_humanized', - size: 'xl', - }, - { - Cell: ({ - row: { - original: { created_by: createdBy }, + original: { owners = [] }, }, - }: any) => - createdBy ? `${createdBy.first_name} ${createdBy.last_name}` : '', - Header: t('Created by'), - accessor: 'created_by', + }: any) => <FacePile users={owners} />, + Header: t('Owners'), + accessor: 'owners', disableSortBy: true, size: 'xl', }, { Cell: ({ row: { - original: { owners = [] }, + original: { + changed_on_delta_humanized: changedOn, + changed_by: changedBy, + }, }, - }: any) => <FacePile users={owners} />, - Header: t('Owners'), - accessor: 'owners', - disableSortBy: true, + }: any) => <ModifiedInfo date={changedOn} user={changedBy} />, + Header: t('Last modified'), + accessor: 'changed_on_delta_humanized', size: 'xl', }, { @@ -475,9 +456,13 @@ function DashboardList(props: DashboardListProps) { hidden: !canEdit && !canDelete && !canExport, disableSortBy: true, }, + { + accessor: QueryObjectColumns.changed_by, + hidden: true, + }, ], [ - userId, + user?.userId, canEdit, canDelete, canExport, @@ -509,12 +494,37 @@ function DashboardList(props: DashboardListProps) { const filters: Filters = useMemo(() => { const filters_list = [ { - Header: t('Search'), + 
Header: t('Name'), key: 'search', id: 'dashboard_title', input: 'search', operator: FilterOperator.titleOrSlug, }, + { + Header: t('Status'), + key: 'published', + id: 'published', + input: 'select', + operator: FilterOperator.equals, + unfilteredLabel: t('Any'), + selects: [ + { label: t('Published'), value: true }, + { label: t('Draft'), value: false }, + ], + }, + ...(isFeatureEnabled(FeatureFlag.TAGGING_SYSTEM) && canReadTag + ? [ + { + Header: t('Tag'), + key: 'tags', + id: 'tags', + input: 'select', + operator: FilterOperator.dashboardTags, + unfilteredLabel: t('All'), + fetchSelects: loadTags, + }, + ] + : []), { Header: t('Owner'), key: 'owner', @@ -537,41 +547,7 @@ function DashboardList(props: DashboardListProps) { ), paginate: true, }, - { - Header: t('Created by'), - key: 'created_by', - id: 'created_by', - input: 'select', - operator: FilterOperator.relationOneMany, - unfilteredLabel: t('All'), - fetchSelects: createFetchRelated( - 'dashboard', - 'created_by', - createErrorHandler(errMsg => - addDangerToast( - t( - 'An error occurred while fetching dashboard created by values: %s', - errMsg, - ), - ), - ), - props.user, - ), - paginate: true, - }, - { - Header: t('Status'), - key: 'published', - id: 'published', - input: 'select', - operator: FilterOperator.equals, - unfilteredLabel: t('Any'), - selects: [ - { label: t('Published'), value: true }, - { label: t('Draft'), value: false }, - ], - }, - ...(userId ? [favoritesFilter] : []), + ...(user?.userId ? 
[favoritesFilter] : []), { Header: t('Certified'), key: 'certified', @@ -585,18 +561,27 @@ function DashboardList(props: DashboardListProps) { { label: t('No'), value: false }, ], }, - ] as Filters; - if (isFeatureEnabled(FeatureFlag.TAGGING_SYSTEM) && canReadTag) { - filters_list.push({ - Header: t('Tags'), - key: 'tags', - id: 'tags', + { + Header: t('Modified by'), + key: 'changed_by', + id: 'changed_by', input: 'select', - operator: FilterOperator.dashboardTags, + operator: FilterOperator.relationOneMany, unfilteredLabel: t('All'), - fetchSelects: loadTags, - }); - } + fetchSelects: createFetchRelated( + 'dashboard', + 'changed_by', + createErrorHandler(errMsg => + t( + 'An error occurred while fetching dataset datasource values: %s', + errMsg, + ), + ), + user, + ), + paginate: true, + }, + ] as Filters; return filters_list; }, [addDangerToast, favoritesFilter, props.user]); @@ -632,7 +617,7 @@ function DashboardList(props: DashboardListProps) { ? userKey.thumbnails : isFeatureEnabled(FeatureFlag.THUMBNAILS) } - userId={userId} + userId={user?.userId} loading={loading} openDashboardEditModal={openDashboardEditModal} saveFavoriteStatus={saveFavoriteStatus} @@ -646,7 +631,7 @@ function DashboardList(props: DashboardListProps) { favoriteStatus, hasPerm, loading, - userId, + user?.userId, saveFavoriteStatus, userKey, ], @@ -743,7 +728,7 @@ function DashboardList(props: DashboardListProps) { addSuccessToast, addDangerToast, undefined, - userId, + user?.userId, ); setDashboardToDelete(null); }} diff --git a/superset-frontend/src/pages/DatabaseList/DatabaseList.test.jsx b/superset-frontend/src/pages/DatabaseList/DatabaseList.test.jsx index fd989b50d2270..b1bfb245d37d1 100644 --- a/superset-frontend/src/pages/DatabaseList/DatabaseList.test.jsx +++ b/superset-frontend/src/pages/DatabaseList/DatabaseList.test.jsx @@ -218,7 +218,7 @@ describe('Admin DatabaseList', () => { await waitForComponentToPaint(wrapper); expect(fetchMock.lastCall()[0]).toMatchInlineSnapshot( - 
`"http://localhost/api/v1/database/?q=(filters:!((col:expose_in_sqllab,opr:eq,value:!t),(col:allow_run_async,opr:eq,value:!f),(col:database_name,opr:ct,value:fooo)),order_column:changed_on_delta_humanized,order_direction:desc,page:0,page_size:25)"`, + `"http://localhost/api/v1/database/?q=(filters:!((col:database_name,opr:ct,value:fooo),(col:expose_in_sqllab,opr:eq,value:!t),(col:allow_run_async,opr:eq,value:!f)),order_column:changed_on_delta_humanized,order_direction:desc,page:0,page_size:25)"`, ); }); diff --git a/superset-frontend/src/pages/DatabaseList/index.tsx b/superset-frontend/src/pages/DatabaseList/index.tsx index d2308bd117f61..8c98392aca93e 100644 --- a/superset-frontend/src/pages/DatabaseList/index.tsx +++ b/superset-frontend/src/pages/DatabaseList/index.tsx @@ -32,7 +32,11 @@ import { LocalStorageKeys, setItem } from 'src/utils/localStorageHelpers'; import Loading from 'src/components/Loading'; import { useListViewResource } from 'src/views/CRUD/hooks'; -import { createErrorHandler, uploadUserPerms } from 'src/views/CRUD/utils'; +import { + createErrorHandler, + createFetchRelated, + uploadUserPerms, +} from 'src/views/CRUD/utils'; import withToasts from 'src/components/MessageToasts/withToasts'; import SubMenu, { SubMenuProps } from 'src/features/home/SubMenu'; import DeleteModal from 'src/components/DeleteModal'; @@ -48,6 +52,8 @@ import { UserWithPermissionsAndRoles } from 'src/types/bootstrapTypes'; import type { MenuObjectProps } from 'src/types/bootstrapTypes'; import DatabaseModal from 'src/features/databases/DatabaseModal'; import { DatabaseObject } from 'src/features/databases/types'; +import { ModifiedInfo } from 'src/components/AuditInfo'; +import { QueryObjectColumns } from 'src/views/CRUD/types'; const extensionsRegistry = getExtensionsRegistry(); const DatabaseDeleteRelatedExtension = extensionsRegistry.get( @@ -67,6 +73,11 @@ interface DatabaseDeleteObject extends DatabaseObject { interface DatabaseListProps { addDangerToast: (msg: 
string) => void; addSuccessToast: (msg: string) => void; + user: { + userId: string | number; + firstName: string; + lastName: string; + }; } const IconCheck = styled(Icons.Check)` @@ -90,7 +101,11 @@ function BooleanDisplay({ value }: { value: Boolean }) { return value ? <IconCheck /> : <IconCancelX />; } -function DatabaseList({ addDangerToast, addSuccessToast }: DatabaseListProps) { +function DatabaseList({ + addDangerToast, + addSuccessToast, + user, +}: DatabaseListProps) { const { state: { loading, @@ -105,7 +120,7 @@ function DatabaseList({ addDangerToast, addSuccessToast }: DatabaseListProps) { t('database'), addDangerToast, ); - const user = useSelector<any, UserWithPermissionsAndRoles>( + const fullUser = useSelector<any, UserWithPermissionsAndRoles>( state => state.user, ); const showDatabaseModal = getUrlParam(URL_PARAMS.showDatabaseModal); @@ -123,11 +138,11 @@ function DatabaseList({ addDangerToast, addSuccessToast }: DatabaseListProps) { null, ); const [allowUploads, setAllowUploads] = useState<boolean>(false); - const isAdmin = isUserAdmin(user); + const isAdmin = isUserAdmin(fullUser); const showUploads = allowUploads || isAdmin; const [preparingExport, setPreparingExport] = useState<boolean>(false); - const { roles } = user; + const { roles } = fullUser; const { CSV_EXTENSIONS, COLUMNAR_EXTENSIONS, @@ -313,7 +328,7 @@ function DatabaseList({ addDangerToast, addSuccessToast }: DatabaseListProps) { () => [ { accessor: 'database_name', - Header: t('Database'), + Header: t('Name'), }, { accessor: 'backend', @@ -380,23 +395,14 @@ function DatabaseList({ addDangerToast, addSuccessToast }: DatabaseListProps) { size: 'md', }, { - accessor: 'created_by', - disableSortBy: true, - Header: t('Created by'), Cell: ({ row: { - original: { created_by: createdBy }, + original: { + changed_by: changedBy, + changed_on_delta_humanized: changedOn, + }, }, - }: any) => - createdBy ? 
`${createdBy.first_name} ${createdBy.last_name}` : '', - size: 'xl', - }, - { - Cell: ({ - row: { - original: { changed_on_delta_humanized: changedOn }, - }, - }: any) => changedOn, + }: any) => <ModifiedInfo date={changedOn} user={changedBy} />, Header: t('Last modified'), accessor: 'changed_on_delta_humanized', size: 'xl', @@ -470,12 +476,23 @@ function DatabaseList({ addDangerToast, addSuccessToast }: DatabaseListProps) { hidden: !canEdit && !canDelete, disableSortBy: true, }, + { + accessor: QueryObjectColumns.changed_by, + hidden: true, + }, ], [canDelete, canEdit, canExport], ); const filters: Filters = useMemo( () => [ + { + Header: t('Name'), + key: 'search', + id: 'database_name', + input: 'search', + operator: FilterOperator.contains, + }, { Header: t('Expose in SQL Lab'), key: 'expose_in_sql_lab', @@ -509,11 +526,24 @@ function DatabaseList({ addDangerToast, addSuccessToast }: DatabaseListProps) { ], }, { - Header: t('Search'), - key: 'search', - id: 'database_name', - input: 'search', - operator: FilterOperator.contains, + Header: t('Modified by'), + key: 'changed_by', + id: 'changed_by', + input: 'select', + operator: FilterOperator.relationOneMany, + unfilteredLabel: t('All'), + fetchSelects: createFetchRelated( + 'database', + 'changed_by', + createErrorHandler(errMsg => + t( + 'An error occurred while fetching dataset datasource values: %s', + errMsg, + ), + ), + user, + ), + paginate: true, }, ], [], diff --git a/superset-frontend/src/pages/DatasetList/DatasetList.test.tsx b/superset-frontend/src/pages/DatasetList/DatasetList.test.tsx index 916dd0615bb8b..c316001bb46e4 100644 --- a/superset-frontend/src/pages/DatasetList/DatasetList.test.tsx +++ b/superset-frontend/src/pages/DatasetList/DatasetList.test.tsx @@ -285,56 +285,41 @@ describe('RTL', () => { }); describe('Prevent unsafe URLs', () => { + const columnCount = 8; + const exploreUrlIndex = 1; + const getTdIndex = (rowNumber: number): number => + rowNumber * columnCount + exploreUrlIndex; + 
const mockedProps = {}; let wrapper: any; it('Check prevent unsafe is on renders relative links', async () => { - const tdColumnsNumber = 9; useSelectorMock.mockReturnValue(true); wrapper = await mountAndWait(mockedProps); const tdElements = wrapper.find(ListView).find('td'); - expect( - tdElements - .at(0 * tdColumnsNumber + 1) - .find('a') - .prop('href'), - ).toBe('/https://www.google.com?0'); - expect( - tdElements - .at(1 * tdColumnsNumber + 1) - .find('a') - .prop('href'), - ).toBe('/https://www.google.com?1'); - expect( - tdElements - .at(2 * tdColumnsNumber + 1) - .find('a') - .prop('href'), - ).toBe('/https://www.google.com?2'); + expect(tdElements.at(getTdIndex(0)).find('a').prop('href')).toBe( + '/https://www.google.com?0', + ); + expect(tdElements.at(getTdIndex(1)).find('a').prop('href')).toBe( + '/https://www.google.com?1', + ); + expect(tdElements.at(getTdIndex(2)).find('a').prop('href')).toBe( + '/https://www.google.com?2', + ); }); it('Check prevent unsafe is off renders absolute links', async () => { - const tdColumnsNumber = 9; useSelectorMock.mockReturnValue(false); wrapper = await mountAndWait(mockedProps); const tdElements = wrapper.find(ListView).find('td'); - expect( - tdElements - .at(0 * tdColumnsNumber + 1) - .find('a') - .prop('href'), - ).toBe('https://www.google.com?0'); - expect( - tdElements - .at(1 * tdColumnsNumber + 1) - .find('a') - .prop('href'), - ).toBe('https://www.google.com?1'); - expect( - tdElements - .at(2 * tdColumnsNumber + 1) - .find('a') - .prop('href'), - ).toBe('https://www.google.com?2'); + expect(tdElements.at(getTdIndex(0)).find('a').prop('href')).toBe( + 'https://www.google.com?0', + ); + expect(tdElements.at(getTdIndex(1)).find('a').prop('href')).toBe( + 'https://www.google.com?1', + ); + expect(tdElements.at(getTdIndex(2)).find('a').prop('href')).toBe( + 'https://www.google.com?2', + ); }); }); diff --git a/superset-frontend/src/pages/DatasetList/index.tsx b/superset-frontend/src/pages/DatasetList/index.tsx 
index d86d7a7b0ffd5..8a39cb0463e2b 100644 --- a/superset-frontend/src/pages/DatasetList/index.tsx +++ b/superset-frontend/src/pages/DatasetList/index.tsx @@ -70,6 +70,8 @@ import { } from 'src/features/datasets/constants'; import DuplicateDatasetModal from 'src/features/datasets/DuplicateDatasetModal'; import { useSelector } from 'react-redux'; +import { ModifiedInfo } from 'src/components/AuditInfo'; +import { QueryObjectColumns } from 'src/views/CRUD/types'; const extensionsRegistry = getExtensionsRegistry(); const DatasetDeleteRelatedExtension = extensionsRegistry.get( @@ -380,26 +382,6 @@ const DatasetList: FunctionComponent<DatasetListProps> = ({ accessor: 'schema', size: 'lg', }, - { - Cell: ({ - row: { - original: { changed_on_delta_humanized: changedOn }, - }, - }: any) => <span className="no-wrap">{changedOn}</span>, - Header: t('Modified'), - accessor: 'changed_on_delta_humanized', - size: 'xl', - }, - { - Cell: ({ - row: { - original: { changed_by_name: changedByName }, - }, - }: any) => changedByName, - Header: t('Modified by'), - accessor: 'changed_by.first_name', - size: 'xl', - }, { accessor: 'database', disableSortBy: true, @@ -416,6 +398,19 @@ const DatasetList: FunctionComponent<DatasetListProps> = ({ disableSortBy: true, size: 'lg', }, + { + Cell: ({ + row: { + original: { + changed_on_delta_humanized: changedOn, + changed_by: changedBy, + }, + }, + }: any) => <ModifiedInfo date={changedOn} user={changedBy} />, + Header: t('Last modified'), + accessor: 'changed_on_delta_humanized', + size: 'xl', + }, { accessor: 'sql', hidden: true, @@ -515,6 +510,10 @@ const DatasetList: FunctionComponent<DatasetListProps> = ({ hidden: !canEdit && !canDelete && !canDuplicate, disableSortBy: true, }, + { + accessor: QueryObjectColumns.changed_by, + hidden: true, + }, ], [canEdit, canDelete, canExport, openDatasetEditModal, canDuplicate, user], ); @@ -522,31 +521,23 @@ const DatasetList: FunctionComponent<DatasetListProps> = ({ const filterTypes: Filters = 
useMemo( () => [ { - Header: t('Search'), + Header: t('Name'), key: 'search', id: 'table_name', input: 'search', operator: FilterOperator.contains, }, { - Header: t('Owner'), - key: 'owner', - id: 'owners', + Header: t('Type'), + key: 'sql', + id: 'sql', input: 'select', - operator: FilterOperator.relationManyMany, + operator: FilterOperator.datasetIsNullOrEmpty, unfilteredLabel: 'All', - fetchSelects: createFetchRelated( - 'dataset', - 'owners', - createErrorHandler(errMsg => - t( - 'An error occurred while fetching dataset owner values: %s', - errMsg, - ), - ), - user, - ), - paginate: true, + selects: [ + { label: t('Virtual'), value: false }, + { label: t('Physical'), value: true }, + ], }, { Header: t('Database'), @@ -581,16 +572,24 @@ const DatasetList: FunctionComponent<DatasetListProps> = ({ paginate: true, }, { - Header: t('Type'), - key: 'sql', - id: 'sql', + Header: t('Owner'), + key: 'owner', + id: 'owners', input: 'select', - operator: FilterOperator.datasetIsNullOrEmpty, + operator: FilterOperator.relationManyMany, unfilteredLabel: 'All', - selects: [ - { label: t('Virtual'), value: false }, - { label: t('Physical'), value: true }, - ], + fetchSelects: createFetchRelated( + 'dataset', + 'owners', + createErrorHandler(errMsg => + t( + 'An error occurred while fetching dataset owner values: %s', + errMsg, + ), + ), + user, + ), + paginate: true, }, { Header: t('Certified'), @@ -605,6 +604,26 @@ const DatasetList: FunctionComponent<DatasetListProps> = ({ { label: t('No'), value: false }, ], }, + { + Header: t('Modified by'), + key: 'changed_by', + id: 'changed_by', + input: 'select', + operator: FilterOperator.relationOneMany, + unfilteredLabel: t('All'), + fetchSelects: createFetchRelated( + 'dataset', + 'changed_by', + createErrorHandler(errMsg => + t( + 'An error occurred while fetching dataset datasource values: %s', + errMsg, + ), + ), + user, + ), + paginate: true, + }, ], [user], ); diff --git 
a/superset-frontend/src/pages/QueryHistoryList/index.tsx b/superset-frontend/src/pages/QueryHistoryList/index.tsx index 63e916e399299..94b646d9e4350 100644 --- a/superset-frontend/src/pages/QueryHistoryList/index.tsx +++ b/superset-frontend/src/pages/QueryHistoryList/index.tsx @@ -34,6 +34,7 @@ import { } from 'src/views/CRUD/utils'; import withToasts from 'src/components/MessageToasts/withToasts'; import { useListViewResource } from 'src/views/CRUD/hooks'; +import Label from 'src/components/Label'; import SubMenu, { SubMenuProps } from 'src/features/home/SubMenu'; import Popover from 'src/components/Popover'; import { commonMenuData } from 'src/features/home/commonMenuData'; @@ -52,6 +53,7 @@ import { QueryObject, QueryObjectColumns } from 'src/views/CRUD/types'; import Icons from 'src/components/Icons'; import QueryPreviewModal from 'src/features/queries/QueryPreviewModal'; import { addSuccessToast } from 'src/components/MessageToasts/actions'; +import getOwnerName from 'src/utils/getOwnerName'; const PAGE_SIZE = 25; const SQL_PREVIEW_MAX_LINES = 4; @@ -88,6 +90,11 @@ const StyledPopoverItem = styled.div` color: ${({ theme }) => theme.colors.grayscale.dark2}; `; +const TimerLabel = styled(Label)` + text-align: left; + font-family: ${({ theme }) => theme.typography.families.monospace}; +`; + function QueryList({ addDangerToast }: QueryListProps) { const { state: { loading, resourceCount: queryCount, resourceCollection: queries }, @@ -204,7 +211,7 @@ function QueryList({ addDangerToast }: QueryListProps) { size: 'xl', Cell: ({ row: { - original: { start_time, end_time }, + original: { start_time }, }, }: any) => { const startMoment = moment.utc(start_time).local(); @@ -218,19 +225,25 @@ function QueryList({ addDangerToast }: QueryListProps) { {formattedStartTimeData[1]} </> ); - - return end_time ? 
( - <Tooltip - title={t( - 'Duration: %s', - moment(moment.utc(end_time - start_time)).format(TIME_WITH_MS), - )} - placement="bottom" - > - <span>{formattedStartTime}</span> - </Tooltip> - ) : ( - formattedStartTime + return formattedStartTime; + }, + }, + { + Header: t('Duration'), + size: 'xl', + Cell: ({ + row: { + original: { status, start_time, end_time }, + }, + }: any) => { + const timerType = status === QueryState.FAILED ? 'danger' : status; + const timerTime = end_time + ? moment(moment.utc(end_time - start_time)).format(TIME_WITH_MS) + : '00:00:00.000'; + return ( + <TimerLabel type={timerType} role="timer"> + {timerTime} + </TimerLabel> ); }, }, @@ -299,7 +312,7 @@ function QueryList({ addDangerToast }: QueryListProps) { row: { original: { user }, }, - }: any) => (user ? `${user.first_name} ${user.last_name}` : ''), + }: any) => getOwnerName(user), }, { accessor: QueryObjectColumns.user, diff --git a/superset-frontend/src/pages/RowLevelSecurityList/RowLevelSecurityList.test.tsx b/superset-frontend/src/pages/RowLevelSecurityList/RowLevelSecurityList.test.tsx index a4621ed10eada..6721f73add1fd 100644 --- a/superset-frontend/src/pages/RowLevelSecurityList/RowLevelSecurityList.test.tsx +++ b/superset-frontend/src/pages/RowLevelSecurityList/RowLevelSecurityList.test.tsx @@ -187,8 +187,8 @@ describe('RuleList RTL', () => { const searchFilters = screen.queryAllByTestId('filters-search'); expect(searchFilters).toHaveLength(2); - const typeFilter = await screen.findByTestId('filters-select'); - expect(typeFilter).toBeInTheDocument(); + const typeFilter = screen.queryAllByTestId('filters-select'); + expect(typeFilter).toHaveLength(2); }); it('renders correct list columns', async () => { @@ -201,7 +201,7 @@ describe('RuleList RTL', () => { const fitlerTypeColumn = await within(table).findByText('Filter Type'); const groupKeyColumn = await within(table).findByText('Group Key'); const clauseColumn = await within(table).findByText('Clause'); - const modifiedColumn = 
await within(table).findByText('Modified'); + const modifiedColumn = await within(table).findByText('Last modified'); const actionsColumn = await within(table).findByText('Actions'); expect(nameColumn).toBeInTheDocument(); diff --git a/superset-frontend/src/pages/RowLevelSecurityList/index.tsx b/superset-frontend/src/pages/RowLevelSecurityList/index.tsx index 3c1e3b8aae865..bef42284d0b76 100644 --- a/superset-frontend/src/pages/RowLevelSecurityList/index.tsx +++ b/superset-frontend/src/pages/RowLevelSecurityList/index.tsx @@ -33,7 +33,9 @@ import rison from 'rison'; import { useListViewResource } from 'src/views/CRUD/hooks'; import RowLevelSecurityModal from 'src/features/rls/RowLevelSecurityModal'; import { RLSObject } from 'src/features/rls/types'; -import { createErrorHandler } from 'src/views/CRUD/utils'; +import { createErrorHandler, createFetchRelated } from 'src/views/CRUD/utils'; +import { ModifiedInfo } from 'src/components/AuditInfo'; +import { QueryObjectColumns } from 'src/views/CRUD/types'; const Actions = styled.div` color: ${({ theme }) => theme.colors.grayscale.base}; @@ -43,7 +45,7 @@ interface RLSProps { addDangerToast: (msg: string) => void; addSuccessToast: (msg: string) => void; user: { - userId?: string | number; + userId: string | number; firstName: string; lastName: string; }; @@ -146,10 +148,13 @@ function RowLevelSecurityList(props: RLSProps) { { Cell: ({ row: { - original: { changed_on_delta_humanized: changedOn }, + original: { + changed_on_delta_humanized: changedOn, + changed_by: changedBy, + }, }, - }: any) => <span className="no-wrap">{changedOn}</span>, - Header: t('Modified'), + }: any) => <ModifiedInfo date={changedOn} user={changedBy} />, + Header: t('Last modified'), accessor: 'changed_on_delta_humanized', size: 'xl', }, @@ -218,6 +223,10 @@ function RowLevelSecurityList(props: RLSProps) { hidden: !canEdit && !canWrite && !canExport, disableSortBy: true, }, + { + accessor: QueryObjectColumns.changed_by, + hidden: true, + }, ], [ 
user.userId, @@ -270,6 +279,26 @@ function RowLevelSecurityList(props: RLSProps) { input: 'search', operator: FilterOperator.startsWith, }, + { + Header: t('Modified by'), + key: 'changed_by', + id: 'changed_by', + input: 'select', + operator: FilterOperator.relationOneMany, + unfilteredLabel: t('All'), + fetchSelects: createFetchRelated( + 'rowlevelsecurity', + 'changed_by', + createErrorHandler(errMsg => + t( + 'An error occurred while fetching RLS modified by values: %s', + errMsg, + ), + ), + user, + ), + paginate: true, + }, ], [user], ); diff --git a/superset-frontend/src/pages/SavedQueryList/index.tsx b/superset-frontend/src/pages/SavedQueryList/index.tsx index 3ee62c2ce6533..d48ffef8c90c3 100644 --- a/superset-frontend/src/pages/SavedQueryList/index.tsx +++ b/superset-frontend/src/pages/SavedQueryList/index.tsx @@ -18,20 +18,19 @@ */ import { - isFeatureEnabled, FeatureFlag, + isFeatureEnabled, styled, SupersetClient, t, } from '@superset-ui/core'; -import React, { useState, useMemo, useCallback } from 'react'; +import React, { useCallback, useMemo, useState } from 'react'; import { Link, useHistory } from 'react-router-dom'; import rison from 'rison'; -import moment from 'moment'; import { - createFetchRelated, - createFetchDistinct, createErrorHandler, + createFetchDistinct, + createFetchRelated, } from 'src/views/CRUD/utils'; import { useSelector } from 'react-redux'; import Popover from 'src/components/Popover'; @@ -39,11 +38,11 @@ import withToasts from 'src/components/MessageToasts/withToasts'; import { useListViewResource } from 'src/views/CRUD/hooks'; import ConfirmStatusChange from 'src/components/ConfirmStatusChange'; import handleResourceExport from 'src/utils/export'; -import SubMenu, { SubMenuProps, ButtonProps } from 'src/features/home/SubMenu'; +import SubMenu, { ButtonProps, SubMenuProps } from 'src/features/home/SubMenu'; import ListView, { - ListViewProps, - Filters, FilterOperator, + Filters, + ListViewProps, } from
'src/components/ListView'; import Loading from 'src/components/Loading'; import DeleteModal from 'src/components/DeleteModal'; @@ -51,15 +50,14 @@ import ActionsBar, { ActionProps } from 'src/components/ListView/ActionsBar'; import { TagsList } from 'src/components/Tags'; import { Tooltip } from 'src/components/Tooltip'; import { commonMenuData } from 'src/features/home/commonMenuData'; -import { SavedQueryObject } from 'src/views/CRUD/types'; +import { QueryObjectColumns, SavedQueryObject } from 'src/views/CRUD/types'; import copyTextToClipboard from 'src/utils/copy'; import Tag from 'src/types/TagType'; import ImportModelsModal from 'src/components/ImportModal/index'; +import { ModifiedInfo } from 'src/components/AuditInfo'; +import { loadTags } from 'src/components/Tags/utils'; import Icons from 'src/components/Icons'; -import { - BootstrapUser, - UserWithPermissionsAndRoles, -} from 'src/types/bootstrapTypes'; +import { UserWithPermissionsAndRoles } from 'src/types/bootstrapTypes'; import SavedQueryPreviewModal from 'src/features/queries/SavedQueryPreviewModal'; import { findPermission } from 'src/utils/findPermission'; @@ -80,7 +78,11 @@ const CONFIRM_OVERWRITE_MESSAGE = t( interface SavedQueryListProps { addDangerToast: (msg: string) => void; addSuccessToast: (msg: string) => void; - user: BootstrapUser; + user: { + userId: string | number; + firstName: string; + lastName: string; + }; } const StyledTableLabel = styled.div` @@ -99,6 +101,7 @@ const StyledPopoverItem = styled.div` function SavedQueryList({ addDangerToast, addSuccessToast, + user, }: SavedQueryListProps) { const { state: { @@ -348,41 +351,6 @@ function SavedQueryList({ size: 'xl', disableSortBy: true, }, - { - Cell: ({ - row: { - original: { created_on: createdOn }, - }, - }: any) => { - const date = new Date(createdOn); - const utc = new Date( - Date.UTC( - date.getFullYear(), - date.getMonth(), - date.getDate(), - date.getHours(), - date.getMinutes(), - date.getSeconds(), - 
date.getMilliseconds(), - ), - ); - - return moment(utc).fromNow(); - }, - Header: t('Created on'), - accessor: 'created_on', - size: 'xl', - }, - { - Cell: ({ - row: { - original: { changed_on_delta_humanized: changedOn }, - }, - }: any) => changedOn, - Header: t('Modified'), - accessor: 'changed_on_delta_humanized', - size: 'xl', - }, { Cell: ({ row: { @@ -397,6 +365,19 @@ function SavedQueryList({ disableSortBy: true, hidden: !isFeatureEnabled(FeatureFlag.TAGGING_SYSTEM), }, + { + Cell: ({ + row: { + original: { + changed_by: changedBy, + changed_on_delta_humanized: changedOn, + }, + }, + }: any) => <ModifiedInfo user={changedBy} date={changedOn} />, + Header: t('Last modified'), + accessor: 'changed_on_delta_humanized', + size: 'xl', + }, { Cell: ({ row: { original } }: any) => { const handlePreview = () => { @@ -452,12 +433,23 @@ function SavedQueryList({ id: 'actions', disableSortBy: true, }, + { + accessor: QueryObjectColumns.changed_by, + hidden: true, + }, ], [canDelete, canEdit, canExport, copyQueryLink, handleSavedQueryPreview], ); const filters: Filters = useMemo( () => [ + { + Header: t('Name'), + id: 'label', + key: 'search', + input: 'search', + operator: FilterOperator.allText, + }, { Header: t('Database'), key: 'database', @@ -497,28 +489,42 @@ function SavedQueryList({ ), paginate: true, }, - + ...((isFeatureEnabled(FeatureFlag.TAGGING_SYSTEM) && canReadTag + ? 
[ + { + Header: t('Tag'), + id: 'tags', + key: 'tags', + input: 'select', + operator: FilterOperator.savedQueryTags, + fetchSelects: loadTags, + }, + ] : []) as Filters), { - Header: t('Search'), - id: 'label', - key: 'search', - input: 'search', - operator: FilterOperator.allText, + Header: t('Modified by'), + key: 'changed_by', + id: 'changed_by', + input: 'select', + operator: FilterOperator.relationOneMany, + unfilteredLabel: t('All'), + fetchSelects: createFetchRelated( + 'saved_query', + 'changed_by', + createErrorHandler(errMsg => + t( + 'An error occurred while fetching saved query modified by values: %s', + errMsg, + ), + ), + user, + ), + paginate: true, }, ], [addDangerToast], ); - if (isFeatureEnabled(FeatureFlag.TAGGING_SYSTEM) && canReadTag) { - filters.push({ - Header: t('Tags'), - id: 'tags', - key: 'tags', - input: 'search', - operator: FilterOperator.savedQueryTags, - }); - } - return ( <> <SubMenu {...menuData} /> diff --git a/superset-frontend/src/pages/SqlLab/index.tsx b/superset-frontend/src/pages/SqlLab/index.tsx index e9f84f1b1d646..3f19b54c29511 100644 --- a/superset-frontend/src/pages/SqlLab/index.tsx +++ b/superset-frontend/src/pages/SqlLab/index.tsx @@ -18,7 +18,7 @@ */ import React, { useEffect } from 'react'; import { useDispatch, useSelector } from 'react-redux'; -import { css } from '@superset-ui/core'; +import { css, isFeatureEnabled, FeatureFlag } from '@superset-ui/core'; import { useSqlLabInitialState } from 'src/hooks/apiResources/sqlLab'; import type { InitialState } from 'src/hooks/apiResources/sqlLab'; import { resetState } from 'src/SqlLab/actions/sqlLab'; @@ -27,16 +27,17 @@ import type { SqlLabRootState } from 'src/SqlLab/types'; import { SqlLabGlobalStyles } from 'src/SqlLab//SqlLabGlobalStyles'; import App from 'src/SqlLab/components/App'; import Loading from 'src/components/Loading'; +import EditorAutoSync from 'src/SqlLab/components/EditorAutoSync'; import useEffectEvent from 'src/hooks/useEffectEvent'; import {
LocationProvider } from './LocationContext'; export default function SqlLab() { - const editorTabLastUpdatedAt = useSelector<SqlLabRootState, number>( - state => state.sqlLab.editorTabLastUpdatedAt || 0, + const lastInitializedAt = useSelector<SqlLabRootState, number>( + state => state.sqlLab.queriesLastUpdate || 0, ); const { data, isLoading, isError, error, fulfilledTimeStamp } = useSqlLabInitialState(); - const shouldInitialize = editorTabLastUpdatedAt <= (fulfilledTimeStamp || 0); + const shouldInitialize = lastInitializedAt <= (fulfilledTimeStamp || 0); const dispatch = useDispatch(); const initBootstrapData = useEffectEvent( @@ -72,6 +73,9 @@ export default function SqlLab() { > <SqlLabGlobalStyles /> <App /> + {isFeatureEnabled(FeatureFlag.SQLLAB_BACKEND_PERSISTENCE) && ( + <EditorAutoSync /> + )} </div> </LocationProvider> ); diff --git a/superset-frontend/src/pages/Tags/index.tsx b/superset-frontend/src/pages/Tags/index.tsx index a66d7c7b61b0d..b0c998c3f32f8 100644 --- a/superset-frontend/src/pages/Tags/index.tsx +++ b/superset-frontend/src/pages/Tags/index.tsx @@ -19,9 +19,9 @@ import React, { useMemo, useState } from 'react'; import { isFeatureEnabled, FeatureFlag, t } from '@superset-ui/core'; import { - createFetchRelated, - createErrorHandler, Actions, + createErrorHandler, + createFetchRelated, } from 'src/views/CRUD/utils'; import { useListViewResource, useFavoriteStatus } from 'src/views/CRUD/hooks'; import ConfirmStatusChange from 'src/components/ConfirmStatusChange'; @@ -35,13 +35,13 @@ import { dangerouslyGetItemDoNotUse } from 'src/utils/localStorageHelpers'; import withToasts from 'src/components/MessageToasts/withToasts'; import Icons from 'src/components/Icons'; import { Tooltip } from 'src/components/Tooltip'; -import FacePile from 'src/components/FacePile'; import { Link } from 'react-router-dom'; import { deleteTags } from 'src/features/tags/tags'; import { Tag as AntdTag } from 'antd'; -import { Tag } from 'src/views/CRUD/types'; +import 
{ QueryObjectColumns, Tag } from 'src/views/CRUD/types'; import TagModal from 'src/features/tags/TagModal'; import FaveStar from 'src/components/FaveStar'; +import { ModifiedInfo } from 'src/components/AuditInfo'; const PAGE_SIZE = 25; @@ -56,11 +56,8 @@ interface TagListProps { } function TagList(props: TagListProps) { - const { - addDangerToast, - addSuccessToast, - user: { userId }, - } = props; + const { addDangerToast, addSuccessToast, user } = props; + const { userId } = user; const { state: { @@ -162,24 +159,16 @@ function TagList(props: TagListProps) { { Cell: ({ row: { - original: { changed_on_delta_humanized: changedOn }, + original: { + changed_on_delta_humanized: changedOn, + changed_by: changedBy, + }, }, - }: any) => <span className="no-wrap">{changedOn}</span>, - Header: t('Modified'), + }: any) => <ModifiedInfo date={changedOn} user={changedBy} />, + Header: t('Last modified'), accessor: 'changed_on_delta_humanized', size: 'xl', }, - { - Cell: ({ - row: { - original: { created_by: createdBy }, - }, - }: any) => (createdBy ? 
<FacePile users={[createdBy]} /> : ''), - Header: t('Created by'), - accessor: 'created_by', - disableSortBy: true, - size: 'xl', - }, { Cell: ({ row: { original } }: any) => { const handleEdit = () => handleTagEdit(original); @@ -238,6 +227,10 @@ function TagList(props: TagListProps) { hidden: !canDelete, disableSortBy: true, }, + { + accessor: QueryObjectColumns.changed_by, + hidden: true, + }, ], [userId, canDelete, refreshData, addSuccessToast, addDangerToast], ); @@ -245,32 +238,31 @@ const filters: Filters = useMemo(() => { const filters_list = [ { - Header: t('Created by'), - id: 'created_by', + Header: t('Name'), + id: 'name', + input: 'search', + operator: FilterOperator.contains, + }, + { + Header: t('Modified by'), + key: 'changed_by', + id: 'changed_by', input: 'select', operator: FilterOperator.relationOneMany, unfilteredLabel: t('All'), fetchSelects: createFetchRelated( 'tag', - 'created_by', + 'changed_by', createErrorHandler(errMsg => - addDangerToast( - t( - 'An error occurred while fetching tag created by values: %s', - errMsg, - ), + t( + 'An error occurred while fetching tag modified by values: %s', + errMsg, ), ), - props.user, + user, ), paginate: true, }, - { - Header: t('Search'), - id: 'name', - input: 'search', - operator: FilterOperator.contains, - }, ] as Filters; return filters_list; }, [addDangerToast, props.user]); @@ -361,7 +353,7 @@ function TagList(props: TagListProps) { className="tags-list-view" columns={columns} count={tagCount} - data={tags.filter(tag => !tag.name.includes(':'))} + data={tags} disableBulkSelect={toggleBulkSelect} refreshData={refreshData} emptyState={emptyState} diff --git a/superset-frontend/src/setup/setupClient.ts b/superset-frontend/src/setup/setupClient.ts index 80ce6b54bb8c0..c6f2399436bc8 100644 --- a/superset-frontend/src/setup/setupClient.ts +++ b/superset-frontend/src/setup/setupClient.ts @@ -18,13 +18,18 @@ */ import { SupersetClient, logging, ClientConfig }
from '@superset-ui/core'; import parseCookie from 'src/utils/parseCookie'; +import getBootstrapData from 'src/utils/getBootstrapData'; + +const bootstrapData = getBootstrapData(); function getDefaultConfiguration(): ClientConfig { const csrfNode = document.querySelector<HTMLInputElement>('#csrf_token'); const csrfToken = csrfNode?.value; // when using flask-jwt-extended csrf is set in cookies - const cookieCSRFToken = parseCookie().csrf_access_token || ''; + const jwtAccessCsrfCookieName = + bootstrapData.common.conf.JWT_ACCESS_CSRF_COOKIE_NAME; + const cookieCSRFToken = parseCookie()[jwtAccessCsrfCookieName] || ''; return { protocol: ['http:', 'https:'].includes(window?.location?.protocol) diff --git a/superset-frontend/src/types/dom-to-image-more.d.ts b/superset-frontend/src/types/dom-to-image-more.d.ts index c5a93de757438..374a41bb04570 100644 --- a/superset-frontend/src/types/dom-to-image-more.d.ts +++ b/superset-frontend/src/types/dom-to-image-more.d.ts @@ -18,20 +18,6 @@ */ declare module 'dom-to-image-more' { - export interface Options { - filter?: ((node: Node) => boolean) | undefined; - bgcolor?: string | undefined; - width?: number | undefined; - height?: number | undefined; - style?: {} | undefined; - quality?: number | undefined; - imagePlaceholder?: string | undefined; - cacheBust?: boolean | undefined; - } - - class DomToImageMore { - static toJpeg(node: Node, options?: Options): Promise<string>; - } - - export default DomToImageMore; + import domToImage = require('dom-to-image-more'); + export = domToImage; } diff --git a/superset-frontend/src/utils/downloadAsImage.ts b/superset-frontend/src/utils/downloadAsImage.ts index 79373cc76aade..a6f50926bcb72 100644 --- a/superset-frontend/src/utils/downloadAsImage.ts +++ b/superset-frontend/src/utils/downloadAsImage.ts @@ -62,7 +62,7 @@ export default function downloadAsImage( if (typeof node.className === 'string') { return ( node.className !== 'mapboxgl-control-container' && - 
!node.className.includes('ant-dropdown') + !node.className.includes('header-controls') ); } return true; @@ -70,17 +70,16 @@ export default function downloadAsImage( return domToImage .toJpeg(elementToPrint, { - quality: 1, bgcolor: supersetTheme.colors.grayscale.light4, filter, }) - .then(dataUrl => { + .then((dataUrl: string) => { const link = document.createElement('a'); link.download = `${generateFileStem(description)}.jpg`; link.href = dataUrl; link.click(); }) - .catch(e => { + .catch((e: Error) => { console.error('Creating image failed', e); }); }; diff --git a/superset-frontend/src/utils/getOwnerName.test.ts b/superset-frontend/src/utils/getOwnerName.test.ts new file mode 100644 index 0000000000000..a4a25e57b24ed --- /dev/null +++ b/superset-frontend/src/utils/getOwnerName.test.ts @@ -0,0 +1,29 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +import getOwnerName from './getOwnerName'; + +test('render owner name correctly', () => { + expect(getOwnerName({ id: 1, first_name: 'Foo', last_name: 'Bar' })).toEqual( + 'Foo Bar', + ); +}); + +test('return empty string for undefined owner', () => { + expect(getOwnerName(undefined)).toEqual(''); +}); diff --git a/superset-frontend/src/utils/getOwnerName.ts b/superset-frontend/src/utils/getOwnerName.ts new file mode 100644 index 0000000000000..2534c45f2cbb1 --- /dev/null +++ b/superset-frontend/src/utils/getOwnerName.ts @@ -0,0 +1,26 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +import Owner from 'src/types/Owner'; + +export default function getOwnerName(owner?: Owner): string { + if (!owner) { + return ''; + } + return `${owner.first_name} ${owner.last_name}`; +} diff --git a/superset-frontend/src/views/CRUD/types.ts b/superset-frontend/src/views/CRUD/types.ts index 5a53b57696f53..2fff111b47c9e 100644 --- a/superset-frontend/src/views/CRUD/types.ts +++ b/superset-frontend/src/views/CRUD/types.ts @@ -112,6 +112,7 @@ export interface QueryObject { export enum QueryObjectColumns { id = 'id', changed_on = 'changed_on', + changed_by = 'changed_by', database = 'database', database_name = 'database.database_name', schema = 'schema', @@ -138,17 +139,11 @@ export type ImportResourceName = export interface Tag { changed_on_delta_humanized: string; - changed_by: { - first_name: string; - last_name: string; - }; + changed_by: Owner; created_on_delta_humanized: string; name: string; id: number; - created_by: { - first_name: string; - last_name: string; - }; + created_by: Owner; description: string; type: string; } diff --git a/superset-frontend/src/views/store.ts b/superset-frontend/src/views/store.ts index 55df81c588b5f..a9c3a9eb13d81 100644 --- a/superset-frontend/src/views/store.ts +++ b/superset-frontend/src/views/store.ts @@ -38,7 +38,6 @@ import logger from 'src/middleware/loggerMiddleware'; import saveModal from 'src/explore/reducers/saveModalReducer'; import explore from 'src/explore/reducers/exploreReducer'; import exploreDatasources from 'src/explore/reducers/datasourcesReducer'; -import { FeatureFlag, isFeatureEnabled } from '@superset-ui/core'; import { persistSqlLabStateEnhancer } from 'src/SqlLab/middlewares/persistSqlLabStateEnhancer'; import sqlLabReducer from 'src/SqlLab/reducers/sqlLab'; @@ -167,9 +166,7 @@ export function setupStore({ }, middleware: getMiddleware, devTools: process.env.WEBPACK_MODE === 'development' && !disableDebugger, - ...(!isFeatureEnabled(FeatureFlag.SQLLAB_BACKEND_PERSISTENCE) && { - enhancers: 
[persistSqlLabStateEnhancer as StoreEnhancer], - }), + enhancers: [persistSqlLabStateEnhancer as StoreEnhancer], ...overrides, }); } diff --git a/superset-websocket/package-lock.json b/superset-websocket/package-lock.json index 6684792649805..e1c6232e9974f 100644 --- a/superset-websocket/package-lock.json +++ b/superset-websocket/package-lock.json @@ -9,8 +9,8 @@ "version": "0.0.1", "license": "Apache-2.0", "dependencies": { - "@types/lodash": "^4.14.200", - "cookie": "^0.5.0", + "@types/lodash": "^4.14.202", + "cookie": "^0.6.0", "hot-shots": "^10.0.0", "ioredis": "^4.28.0", "jsonwebtoken": "^9.0.2", @@ -20,16 +20,16 @@ "ws": "^8.14.2" }, "devDependencies": { - "@types/cookie": "^0.5.3", + "@types/cookie": "^0.5.4", "@types/ioredis": "^4.27.8", "@types/jest": "^27.0.2", - "@types/jsonwebtoken": "^9.0.4", - "@types/node": "^20.8.10", - "@types/uuid": "^9.0.6", - "@types/ws": "^8.5.7", + "@types/jsonwebtoken": "^9.0.5", + "@types/node": "^20.9.4", + "@types/uuid": "^9.0.7", + "@types/ws": "^8.5.10", "@typescript-eslint/eslint-plugin": "^5.61.0", "@typescript-eslint/parser": "^5.62.0", - "eslint": "^8.53.0", + "eslint": "^8.54.0", "eslint-config-prettier": "^9.0.0", "jest": "^27.3.1", "prettier": "^3.0.3", @@ -829,9 +829,9 @@ } }, "node_modules/@eslint/js": { - "version": "8.53.0", - "resolved": "https://registry.npmjs.org/@eslint/js/-/js-8.53.0.tgz", - "integrity": "sha512-Kn7K8dx/5U6+cT1yEhpX1w4PCSg0M+XyRILPgvwcEBjerFWCwQj5sbr3/VmxqV0JGHCBCzyd6LxypEuehypY1w==", + "version": "8.54.0", + "resolved": "https://registry.npmjs.org/@eslint/js/-/js-8.54.0.tgz", + "integrity": "sha512-ut5V+D+fOoWPgGGNj83GGjnntO39xDy6DWxO0wb7Jp3DcMX0TfIqdzHF85VTQkerdyGmuuMD9AKAo5KiNlf/AQ==", "dev": true, "engines": { "node": "^12.22.0 || ^14.17.0 || >=16.0.0" @@ -1351,9 +1351,9 @@ } }, "node_modules/@types/cookie": { - "version": "0.5.3", - "resolved": "https://registry.npmjs.org/@types/cookie/-/cookie-0.5.3.tgz", - "integrity": 
"sha512-SLg07AS9z1Ab2LU+QxzU8RCmzsja80ywjf/t5oqw+4NSH20gIGlhLOrBDm1L3PBWzPa4+wkgFQVZAjE6Ioj2ug==", + "version": "0.5.4", + "resolved": "https://registry.npmjs.org/@types/cookie/-/cookie-0.5.4.tgz", + "integrity": "sha512-7z/eR6O859gyWIAjuvBWFzNURmf2oPBmJlfVWkwehU5nzIyjwBsTh7WMmEEV4JFnHuQ3ex4oyTvfKzcyJVDBNA==", "dev": true }, "node_modules/@types/graceful-fs": { @@ -1415,23 +1415,23 @@ "dev": true }, "node_modules/@types/jsonwebtoken": { - "version": "9.0.4", - "resolved": "https://registry.npmjs.org/@types/jsonwebtoken/-/jsonwebtoken-9.0.4.tgz", - "integrity": "sha512-8UYapdmR0QlxgvJmyE8lP7guxD0UGVMfknsdtCFZh4ovShdBl3iOI4zdvqBHrB/IS+xUj3PSx73Qkey1fhWz+g==", + "version": "9.0.5", + "resolved": "https://registry.npmjs.org/@types/jsonwebtoken/-/jsonwebtoken-9.0.5.tgz", + "integrity": "sha512-VRLSGzik+Unrup6BsouBeHsf4d1hOEgYWTm/7Nmw1sXoN1+tRly/Gy/po3yeahnP4jfnQWWAhQAqcNfH7ngOkA==", "dev": true, "dependencies": { "@types/node": "*" } }, "node_modules/@types/lodash": { - "version": "4.14.200", - "resolved": "https://registry.npmjs.org/@types/lodash/-/lodash-4.14.200.tgz", - "integrity": "sha512-YI/M/4HRImtNf3pJgbF+W6FrXovqj+T+/HpENLTooK9PnkacBsDpeP3IpHab40CClUfhNmdM2WTNP2sa2dni5Q==" + "version": "4.14.202", + "resolved": "https://registry.npmjs.org/@types/lodash/-/lodash-4.14.202.tgz", + "integrity": "sha512-OvlIYQK9tNneDlS0VN54LLd5uiPCBOp7gS5Z0f1mjoJYBrtStzgmJBxONW3U6OZqdtNzZPmn9BS/7WI7BFFcFQ==" }, "node_modules/@types/node": { - "version": "20.8.10", - "resolved": "https://registry.npmjs.org/@types/node/-/node-20.8.10.tgz", - "integrity": "sha512-TlgT8JntpcbmKUFzjhsyhGfP2fsiz1Mv56im6enJ905xG1DAYesxJaeSbGqQmAw8OWPdhyJGhGSQGKRNJ45u9w==", + "version": "20.9.4", + "resolved": "https://registry.npmjs.org/@types/node/-/node-20.9.4.tgz", + "integrity": "sha512-wmyg8HUhcn6ACjsn8oKYjkN/zUzQeNtMy44weTJSM6p4MMzEOuKbA3OjJ267uPCOW7Xex9dyrNTful8XTQYoDA==", "dev": true, "dependencies": { "undici-types": "~5.26.4" @@ -1456,15 +1456,15 @@ "dev": true }, "node_modules/@types/uuid": { - 
"version": "9.0.6", - "resolved": "https://registry.npmjs.org/@types/uuid/-/uuid-9.0.6.tgz", - "integrity": "sha512-BT2Krtx4xaO6iwzwMFUYvWBWkV2pr37zD68Vmp1CDV196MzczBRxuEpD6Pr395HAgebC/co7hOphs53r8V7jew==", + "version": "9.0.7", + "resolved": "https://registry.npmjs.org/@types/uuid/-/uuid-9.0.7.tgz", + "integrity": "sha512-WUtIVRUZ9i5dYXefDEAI7sh9/O7jGvHg7Df/5O/gtH3Yabe5odI3UWopVR1qbPXQtvOxWu3mM4XxlYeZtMWF4g==", "dev": true }, "node_modules/@types/ws": { - "version": "8.5.7", - "resolved": "https://registry.npmjs.org/@types/ws/-/ws-8.5.7.tgz", - "integrity": "sha512-6UrLjiDUvn40CMrAubXuIVtj2PEfKDffJS7ychvnPU44j+KVeXmdHHTgqcM/dxLUTHxlXHiFM8Skmb8ozGdTnQ==", + "version": "8.5.10", + "resolved": "https://registry.npmjs.org/@types/ws/-/ws-8.5.10.tgz", + "integrity": "sha512-vmQSUcfalpIq0R9q7uTo2lXs6eGIpt9wtnLdMv9LVpIjCA/+ufZRozlVoVelIYixx1ugCBKDhn89vnsEGOCx9A==", "dev": true, "dependencies": { "@types/node": "*" @@ -2320,9 +2320,9 @@ } }, "node_modules/cookie": { - "version": "0.5.0", - "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.5.0.tgz", - "integrity": "sha512-YZ3GUyn/o8gfKJlnlX7g7xq4gyO6OSuhGPKaaGssGB2qgDUS0gPgtTvoyZLTt9Ab6dC4hfc9dV5arkvc/OCmrw==", + "version": "0.6.0", + "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.6.0.tgz", + "integrity": "sha512-U71cyTamuh1CRNCfpGY6to28lxvNwPG4Guz/EVjgf3Jmzv0vlDp1atT9eS5dDjMYHucpHbWns6Lwf3BKz6svdw==", "engines": { "node": ">= 0.6" } @@ -2604,15 +2604,15 @@ } }, "node_modules/eslint": { - "version": "8.53.0", - "resolved": "https://registry.npmjs.org/eslint/-/eslint-8.53.0.tgz", - "integrity": "sha512-N4VuiPjXDUa4xVeV/GC/RV3hQW9Nw+Y463lkWaKKXKYMvmRiRDAtfpuPFLN+E1/6ZhyR8J2ig+eVREnYgUsiag==", + "version": "8.54.0", + "resolved": "https://registry.npmjs.org/eslint/-/eslint-8.54.0.tgz", + "integrity": "sha512-NY0DfAkM8BIZDVl6PgSa1ttZbx3xHgJzSNJKYcQglem6CppHyMhRIQkBVSSMaSRnLhig3jsDbEzOjwCVt4AmmA==", "dev": true, "dependencies": { "@eslint-community/eslint-utils": "^4.2.0", "@eslint-community/regexpp": 
"^4.6.1", "@eslint/eslintrc": "^2.1.3", - "@eslint/js": "8.53.0", + "@eslint/js": "8.54.0", "@humanwhocodes/config-array": "^0.11.13", "@humanwhocodes/module-importer": "^1.0.1", "@nodelib/fs.walk": "^1.2.8", @@ -6781,9 +6781,9 @@ } }, "@eslint/js": { - "version": "8.53.0", - "resolved": "https://registry.npmjs.org/@eslint/js/-/js-8.53.0.tgz", - "integrity": "sha512-Kn7K8dx/5U6+cT1yEhpX1w4PCSg0M+XyRILPgvwcEBjerFWCwQj5sbr3/VmxqV0JGHCBCzyd6LxypEuehypY1w==", + "version": "8.54.0", + "resolved": "https://registry.npmjs.org/@eslint/js/-/js-8.54.0.tgz", + "integrity": "sha512-ut5V+D+fOoWPgGGNj83GGjnntO39xDy6DWxO0wb7Jp3DcMX0TfIqdzHF85VTQkerdyGmuuMD9AKAo5KiNlf/AQ==", "dev": true }, "@humanwhocodes/config-array": { @@ -7205,9 +7205,9 @@ } }, "@types/cookie": { - "version": "0.5.3", - "resolved": "https://registry.npmjs.org/@types/cookie/-/cookie-0.5.3.tgz", - "integrity": "sha512-SLg07AS9z1Ab2LU+QxzU8RCmzsja80ywjf/t5oqw+4NSH20gIGlhLOrBDm1L3PBWzPa4+wkgFQVZAjE6Ioj2ug==", + "version": "0.5.4", + "resolved": "https://registry.npmjs.org/@types/cookie/-/cookie-0.5.4.tgz", + "integrity": "sha512-7z/eR6O859gyWIAjuvBWFzNURmf2oPBmJlfVWkwehU5nzIyjwBsTh7WMmEEV4JFnHuQ3ex4oyTvfKzcyJVDBNA==", "dev": true }, "@types/graceful-fs": { @@ -7269,23 +7269,23 @@ "dev": true }, "@types/jsonwebtoken": { - "version": "9.0.4", - "resolved": "https://registry.npmjs.org/@types/jsonwebtoken/-/jsonwebtoken-9.0.4.tgz", - "integrity": "sha512-8UYapdmR0QlxgvJmyE8lP7guxD0UGVMfknsdtCFZh4ovShdBl3iOI4zdvqBHrB/IS+xUj3PSx73Qkey1fhWz+g==", + "version": "9.0.5", + "resolved": "https://registry.npmjs.org/@types/jsonwebtoken/-/jsonwebtoken-9.0.5.tgz", + "integrity": "sha512-VRLSGzik+Unrup6BsouBeHsf4d1hOEgYWTm/7Nmw1sXoN1+tRly/Gy/po3yeahnP4jfnQWWAhQAqcNfH7ngOkA==", "dev": true, "requires": { "@types/node": "*" } }, "@types/lodash": { - "version": "4.14.200", - "resolved": "https://registry.npmjs.org/@types/lodash/-/lodash-4.14.200.tgz", - "integrity": 
"sha512-YI/M/4HRImtNf3pJgbF+W6FrXovqj+T+/HpENLTooK9PnkacBsDpeP3IpHab40CClUfhNmdM2WTNP2sa2dni5Q==" + "version": "4.14.202", + "resolved": "https://registry.npmjs.org/@types/lodash/-/lodash-4.14.202.tgz", + "integrity": "sha512-OvlIYQK9tNneDlS0VN54LLd5uiPCBOp7gS5Z0f1mjoJYBrtStzgmJBxONW3U6OZqdtNzZPmn9BS/7WI7BFFcFQ==" }, "@types/node": { - "version": "20.8.10", - "resolved": "https://registry.npmjs.org/@types/node/-/node-20.8.10.tgz", - "integrity": "sha512-TlgT8JntpcbmKUFzjhsyhGfP2fsiz1Mv56im6enJ905xG1DAYesxJaeSbGqQmAw8OWPdhyJGhGSQGKRNJ45u9w==", + "version": "20.9.4", + "resolved": "https://registry.npmjs.org/@types/node/-/node-20.9.4.tgz", + "integrity": "sha512-wmyg8HUhcn6ACjsn8oKYjkN/zUzQeNtMy44weTJSM6p4MMzEOuKbA3OjJ267uPCOW7Xex9dyrNTful8XTQYoDA==", "dev": true, "requires": { "undici-types": "~5.26.4" @@ -7310,15 +7310,15 @@ "dev": true }, "@types/uuid": { - "version": "9.0.6", - "resolved": "https://registry.npmjs.org/@types/uuid/-/uuid-9.0.6.tgz", - "integrity": "sha512-BT2Krtx4xaO6iwzwMFUYvWBWkV2pr37zD68Vmp1CDV196MzczBRxuEpD6Pr395HAgebC/co7hOphs53r8V7jew==", + "version": "9.0.7", + "resolved": "https://registry.npmjs.org/@types/uuid/-/uuid-9.0.7.tgz", + "integrity": "sha512-WUtIVRUZ9i5dYXefDEAI7sh9/O7jGvHg7Df/5O/gtH3Yabe5odI3UWopVR1qbPXQtvOxWu3mM4XxlYeZtMWF4g==", "dev": true }, "@types/ws": { - "version": "8.5.7", - "resolved": "https://registry.npmjs.org/@types/ws/-/ws-8.5.7.tgz", - "integrity": "sha512-6UrLjiDUvn40CMrAubXuIVtj2PEfKDffJS7ychvnPU44j+KVeXmdHHTgqcM/dxLUTHxlXHiFM8Skmb8ozGdTnQ==", + "version": "8.5.10", + "resolved": "https://registry.npmjs.org/@types/ws/-/ws-8.5.10.tgz", + "integrity": "sha512-vmQSUcfalpIq0R9q7uTo2lXs6eGIpt9wtnLdMv9LVpIjCA/+ufZRozlVoVelIYixx1ugCBKDhn89vnsEGOCx9A==", "dev": true, "requires": { "@types/node": "*" @@ -7946,9 +7946,9 @@ } }, "cookie": { - "version": "0.5.0", - "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.5.0.tgz", - "integrity": 
"sha512-YZ3GUyn/o8gfKJlnlX7g7xq4gyO6OSuhGPKaaGssGB2qgDUS0gPgtTvoyZLTt9Ab6dC4hfc9dV5arkvc/OCmrw==" + "version": "0.6.0", + "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.6.0.tgz", + "integrity": "sha512-U71cyTamuh1CRNCfpGY6to28lxvNwPG4Guz/EVjgf3Jmzv0vlDp1atT9eS5dDjMYHucpHbWns6Lwf3BKz6svdw==" }, "create-require": { "version": "1.1.1", @@ -8162,15 +8162,15 @@ } }, "eslint": { - "version": "8.53.0", - "resolved": "https://registry.npmjs.org/eslint/-/eslint-8.53.0.tgz", - "integrity": "sha512-N4VuiPjXDUa4xVeV/GC/RV3hQW9Nw+Y463lkWaKKXKYMvmRiRDAtfpuPFLN+E1/6ZhyR8J2ig+eVREnYgUsiag==", + "version": "8.54.0", + "resolved": "https://registry.npmjs.org/eslint/-/eslint-8.54.0.tgz", + "integrity": "sha512-NY0DfAkM8BIZDVl6PgSa1ttZbx3xHgJzSNJKYcQglem6CppHyMhRIQkBVSSMaSRnLhig3jsDbEzOjwCVt4AmmA==", "dev": true, "requires": { "@eslint-community/eslint-utils": "^4.2.0", "@eslint-community/regexpp": "^4.6.1", "@eslint/eslintrc": "^2.1.3", - "@eslint/js": "8.53.0", + "@eslint/js": "8.54.0", "@humanwhocodes/config-array": "^0.11.13", "@humanwhocodes/module-importer": "^1.0.1", "@nodelib/fs.walk": "^1.2.8", diff --git a/superset-websocket/package.json b/superset-websocket/package.json index 9db110dac1750..d324dd4d46d85 100644 --- a/superset-websocket/package.json +++ b/superset-websocket/package.json @@ -16,8 +16,8 @@ }, "license": "Apache-2.0", "dependencies": { - "@types/lodash": "^4.14.200", - "cookie": "^0.5.0", + "@types/lodash": "^4.14.202", + "cookie": "^0.6.0", "hot-shots": "^10.0.0", "ioredis": "^4.28.0", "jsonwebtoken": "^9.0.2", @@ -27,16 +27,16 @@ "ws": "^8.14.2" }, "devDependencies": { - "@types/cookie": "^0.5.3", + "@types/cookie": "^0.5.4", "@types/ioredis": "^4.27.8", "@types/jest": "^27.0.2", - "@types/jsonwebtoken": "^9.0.4", - "@types/node": "^20.8.10", - "@types/uuid": "^9.0.6", - "@types/ws": "^8.5.7", + "@types/jsonwebtoken": "^9.0.5", + "@types/node": "^20.9.4", + "@types/uuid": "^9.0.7", + "@types/ws": "^8.5.10", "@typescript-eslint/eslint-plugin": 
"^5.61.0", "@typescript-eslint/parser": "^5.62.0", - "eslint": "^8.53.0", + "eslint": "^8.54.0", "eslint-config-prettier": "^9.0.0", "jest": "^27.3.1", "prettier": "^3.0.3", diff --git a/superset/annotation_layers/annotations/api.py b/superset/annotation_layers/annotations/api.py index 4c95b3c105a1f..0be6efbfa9fbd 100644 --- a/superset/annotation_layers/annotations/api.py +++ b/superset/annotation_layers/annotations/api.py @@ -24,22 +24,6 @@ from flask_babel import ngettext from marshmallow import ValidationError -from superset.annotation_layers.annotations.commands.create import ( - CreateAnnotationCommand, -) -from superset.annotation_layers.annotations.commands.delete import ( - DeleteAnnotationCommand, -) -from superset.annotation_layers.annotations.commands.exceptions import ( - AnnotationCreateFailedError, - AnnotationDeleteFailedError, - AnnotationInvalidError, - AnnotationNotFoundError, - AnnotationUpdateFailedError, -) -from superset.annotation_layers.annotations.commands.update import ( - UpdateAnnotationCommand, -) from superset.annotation_layers.annotations.filters import AnnotationAllTextFilter from superset.annotation_layers.annotations.schemas import ( AnnotationPostSchema, @@ -47,7 +31,17 @@ get_delete_ids_schema, openapi_spec_methods_override, ) -from superset.annotation_layers.commands.exceptions import AnnotationLayerNotFoundError +from superset.commands.annotation_layer.annotation.create import CreateAnnotationCommand +from superset.commands.annotation_layer.annotation.delete import DeleteAnnotationCommand +from superset.commands.annotation_layer.annotation.exceptions import ( + AnnotationCreateFailedError, + AnnotationDeleteFailedError, + AnnotationInvalidError, + AnnotationNotFoundError, + AnnotationUpdateFailedError, +) +from superset.commands.annotation_layer.annotation.update import UpdateAnnotationCommand +from superset.commands.annotation_layer.exceptions import AnnotationLayerNotFoundError from superset.constants import 
MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod from superset.models.annotations import Annotation from superset.views.base_api import ( diff --git a/superset/annotation_layers/api.py b/superset/annotation_layers/api.py index b7a3b301bc952..5606e944ef2ba 100644 --- a/superset/annotation_layers/api.py +++ b/superset/annotation_layers/api.py @@ -23,17 +23,6 @@ from flask_babel import ngettext from marshmallow import ValidationError -from superset.annotation_layers.commands.create import CreateAnnotationLayerCommand -from superset.annotation_layers.commands.delete import DeleteAnnotationLayerCommand -from superset.annotation_layers.commands.exceptions import ( - AnnotationLayerCreateFailedError, - AnnotationLayerDeleteFailedError, - AnnotationLayerDeleteIntegrityError, - AnnotationLayerInvalidError, - AnnotationLayerNotFoundError, - AnnotationLayerUpdateFailedError, -) -from superset.annotation_layers.commands.update import UpdateAnnotationLayerCommand from superset.annotation_layers.filters import AnnotationLayerAllTextFilter from superset.annotation_layers.schemas import ( AnnotationLayerPostSchema, @@ -41,6 +30,17 @@ get_delete_ids_schema, openapi_spec_methods_override, ) +from superset.commands.annotation_layer.create import CreateAnnotationLayerCommand +from superset.commands.annotation_layer.delete import DeleteAnnotationLayerCommand +from superset.commands.annotation_layer.exceptions import ( + AnnotationLayerCreateFailedError, + AnnotationLayerDeleteFailedError, + AnnotationLayerDeleteIntegrityError, + AnnotationLayerInvalidError, + AnnotationLayerNotFoundError, + AnnotationLayerUpdateFailedError, +) +from superset.commands.annotation_layer.update import UpdateAnnotationLayerCommand from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod from superset.extensions import event_logger from superset.models.annotations import AnnotationLayer @@ -99,7 +99,7 @@ class AnnotationLayerRestApi(BaseSupersetModelRestApi): ] search_filters = {"name": 
[AnnotationLayerAllTextFilter]} - allowed_rel_fields = {"created_by"} + allowed_rel_fields = {"created_by", "changed_by"} apispec_parameter_schemas = { "get_delete_ids_schema": get_delete_ids_schema, diff --git a/superset/charts/api.py b/superset/charts/api.py index 768d3302915c2..191f09c66e7b4 100644 --- a/superset/charts/api.py +++ b/superset/charts/api.py @@ -32,21 +32,6 @@ from werkzeug.wsgi import FileWrapper from superset import app, is_feature_enabled, thumbnail_cache -from superset.charts.commands.create import CreateChartCommand -from superset.charts.commands.delete import DeleteChartCommand -from superset.charts.commands.exceptions import ( - ChartCreateFailedError, - ChartDeleteFailedError, - ChartForbiddenError, - ChartInvalidError, - ChartNotFoundError, - ChartUpdateFailedError, - DashboardsForbiddenError, -) -from superset.charts.commands.export import ExportChartsCommand -from superset.charts.commands.importers.dispatcher import ImportChartsCommand -from superset.charts.commands.update import UpdateChartCommand -from superset.charts.commands.warm_up_cache import ChartWarmUpCacheCommand from superset.charts.filters import ( ChartAllTextFilter, ChartCertifiedFilter, @@ -69,6 +54,21 @@ screenshot_query_schema, thumbnail_query_schema, ) +from superset.commands.chart.create import CreateChartCommand +from superset.commands.chart.delete import DeleteChartCommand +from superset.commands.chart.exceptions import ( + ChartCreateFailedError, + ChartDeleteFailedError, + ChartForbiddenError, + ChartInvalidError, + ChartNotFoundError, + ChartUpdateFailedError, + DashboardsForbiddenError, +) +from superset.commands.chart.export import ExportChartsCommand +from superset.commands.chart.importers.dispatcher import ImportChartsCommand +from superset.commands.chart.update import UpdateChartCommand +from superset.commands.chart.warm_up_cache import ChartWarmUpCacheCommand from superset.commands.exceptions import CommandException from 
superset.commands.importers.exceptions import ( IncorrectFormatError, @@ -273,7 +273,7 @@ def ensure_thumbnails_enabled(self) -> Optional[Response]: "created_by": RelatedFieldFilter("first_name", FilterRelatedOwners), } - allowed_rel_fields = {"owners", "created_by"} + allowed_rel_fields = {"owners", "created_by", "changed_by"} @expose("/", methods=("POST",)) @protect() diff --git a/superset/charts/data/api.py b/superset/charts/data/api.py index c8ed840c7c5c7..a62e6a2407451 100644 --- a/superset/charts/data/api.py +++ b/superset/charts/data/api.py @@ -30,19 +30,19 @@ from superset import is_feature_enabled, security_manager from superset.async_events.async_query_manager import AsyncQueryTokenException from superset.charts.api import ChartRestApi -from superset.charts.commands.exceptions import ( - ChartDataCacheLoadError, - ChartDataQueryFailedError, -) -from superset.charts.data.commands.create_async_job_command import ( - CreateAsyncChartDataJobCommand, -) -from superset.charts.data.commands.get_data_command import ChartDataCommand from superset.charts.data.query_context_cache_loader import QueryContextCacheLoader from superset.charts.post_processing import apply_post_process from superset.charts.schemas import ChartDataQueryContextSchema +from superset.commands.chart.data.create_async_job_command import ( + CreateAsyncChartDataJobCommand, +) +from superset.commands.chart.data.get_data_command import ChartDataCommand +from superset.commands.chart.exceptions import ( + ChartDataCacheLoadError, + ChartDataQueryFailedError, +) from superset.common.chart_data import ChartDataResultFormat, ChartDataResultType -from superset.connectors.base.models import BaseDatasource +from superset.connectors.sqla.models import BaseDatasource from superset.daos.exceptions import DatasourceNotFound from superset.exceptions import QueryObjectValidationError from superset.extensions import event_logger diff --git a/superset/charts/data/query_context_cache_loader.py 
b/superset/charts/data/query_context_cache_loader.py index 97fa733a3e4ad..1bdabd33f485d 100644 --- a/superset/charts/data/query_context_cache_loader.py +++ b/superset/charts/data/query_context_cache_loader.py @@ -17,7 +17,7 @@ from typing import Any from superset import cache -from superset.charts.commands.exceptions import ChartDataCacheLoadError +from superset.commands.chart.exceptions import ChartDataCacheLoadError class QueryContextCacheLoader: # pylint: disable=too-few-public-methods diff --git a/superset/charts/post_processing.py b/superset/charts/post_processing.py index 939714642fc5e..ebcae32f8f486 100644 --- a/superset/charts/post_processing.py +++ b/superset/charts/post_processing.py @@ -40,7 +40,7 @@ ) if TYPE_CHECKING: - from superset.connectors.base.models import BaseDatasource + from superset.connectors.sqla.models import BaseDatasource from superset.models.sql_lab import Query diff --git a/superset/charts/schemas.py b/superset/charts/schemas.py index 0ad68ceb492d4..48e0cbb3180c2 100644 --- a/superset/charts/schemas.py +++ b/superset/charts/schemas.py @@ -27,7 +27,7 @@ from superset import app from superset.common.chart_data import ChartDataResultFormat, ChartDataResultType from superset.db_engine_specs.base import builtin_time_grains -from superset.tags.models import TagTypes +from superset.tags.models import TagType from superset.utils import pandas_postprocessing, schema as utils from superset.utils.core import ( AnnotationType, @@ -146,7 +146,7 @@ class TagSchema(Schema): id = fields.Int() name = fields.String() - type = fields.Enum(TagTypes, by_value=True) + type = fields.Enum(TagType, by_value=True) class ChartEntityResponseSchema(Schema): diff --git a/superset/cli/importexport.py b/superset/cli/importexport.py index 5dde06d01ad91..0d76e535e814b 100755 --- a/superset/cli/importexport.py +++ b/superset/cli/importexport.py @@ -72,7 +72,7 @@ def import_directory(directory: str, overwrite: bool, force: bool) -> None: def 
export_dashboards(dashboard_file: Optional[str] = None) -> None: """Export dashboards to ZIP file""" # pylint: disable=import-outside-toplevel - from superset.dashboards.commands.export import ExportDashboardsCommand + from superset.commands.dashboard.export import ExportDashboardsCommand from superset.models.dashboard import Dashboard g.user = security_manager.find_user(username="admin") @@ -106,8 +106,8 @@ def export_dashboards(dashboard_file: Optional[str] = None) -> None: def export_datasources(datasource_file: Optional[str] = None) -> None: """Export datasources to ZIP file""" # pylint: disable=import-outside-toplevel + from superset.commands.dataset.export import ExportDatasetsCommand from superset.connectors.sqla.models import SqlaTable - from superset.datasets.commands.export import ExportDatasetsCommand g.user = security_manager.find_user(username="admin") @@ -144,10 +144,10 @@ def export_datasources(datasource_file: Optional[str] = None) -> None: def import_dashboards(path: str, username: Optional[str]) -> None: """Import dashboards from ZIP file""" # pylint: disable=import-outside-toplevel - from superset.commands.importers.v1.utils import get_contents_from_bundle - from superset.dashboards.commands.importers.dispatcher import ( + from superset.commands.dashboard.importers.dispatcher import ( ImportDashboardsCommand, ) + from superset.commands.importers.v1.utils import get_contents_from_bundle if username is not None: g.user = security_manager.find_user(username=username) @@ -176,10 +176,8 @@ def import_dashboards(path: str, username: Optional[str]) -> None: def import_datasources(path: str) -> None: """Import datasources from ZIP file""" # pylint: disable=import-outside-toplevel + from superset.commands.dataset.importers.dispatcher import ImportDatasetsCommand from superset.commands.importers.v1.utils import get_contents_from_bundle - from superset.datasets.commands.importers.dispatcher import ( - ImportDatasetsCommand, - ) if is_zipfile(path): with 
ZipFile(path) as bundle: @@ -304,7 +302,7 @@ def export_datasources( def import_dashboards(path: str, recursive: bool, username: str) -> None: """Import dashboards from JSON file""" # pylint: disable=import-outside-toplevel - from superset.dashboards.commands.importers.v0 import ImportDashboardsCommand + from superset.commands.dashboard.importers.v0 import ImportDashboardsCommand path_object = Path(path) files: list[Path] = [] @@ -353,7 +351,7 @@ def import_dashboards(path: str, recursive: bool, username: str) -> None: def import_datasources(path: str, sync: str, recursive: bool) -> None: """Import datasources from YAML""" # pylint: disable=import-outside-toplevel - from superset.datasets.commands.importers.v0 import ImportDatasetsCommand + from superset.commands.dataset.importers.v0 import ImportDatasetsCommand sync_array = sync.split(",") sync_columns = "columns" in sync_array diff --git a/superset/cli/thumbnails.py b/superset/cli/thumbnails.py index 325fab6853d60..0dd8edfb13027 100755 --- a/superset/cli/thumbnails.py +++ b/superset/cli/thumbnails.py @@ -62,7 +62,7 @@ def compute_thumbnails( dashboards_only: bool, charts_only: bool, force: bool, - model_id: int, + model_id: list[int], ) -> None: """Compute thumbnails""" # pylint: disable=import-outside-toplevel @@ -76,12 +76,12 @@ def compute_thumbnails( def compute_generic_thumbnail( friendly_type: str, model_cls: Union[type[Dashboard], type[Slice]], - model_id: int, + model_ids: list[int], compute_func: CallableTask, ) -> None: query = db.session.query(model_cls) - if model_id: - query = query.filter(model_cls.id.in_(model_id)) + if model_ids: + query = query.filter(model_cls.id.in_(model_ids)) dashboards = query.all() count = len(dashboards) for i, model in enumerate(dashboards): diff --git a/superset/cli/viz_migrations.py b/superset/cli/viz_migrations.py index 9e69135aea386..f24dd8f444cbf 100644 --- a/superset/cli/viz_migrations.py +++ b/superset/cli/viz_migrations.py @@ -24,11 +24,13 @@ class VizType(str, 
Enum): - TREEMAP = "treemap" - DUAL_LINE = "dual_line" AREA = "area" + BUBBLE = "bubble" + DUAL_LINE = "dual_line" + LINE = "line" PIVOT_TABLE = "pivot_table" SUNBURST = "sunburst" + TREEMAP = "treemap" @click.group() @@ -75,18 +77,22 @@ def migrate(viz_type: VizType, is_downgrade: bool = False) -> None: # pylint: disable=import-outside-toplevel from superset.migrations.shared.migrate_viz.processors import ( MigrateAreaChart, + MigrateBubbleChart, MigrateDualLine, + MigrateLineChart, MigratePivotTable, MigrateSunburst, MigrateTreeMap, ) migrations = { - VizType.TREEMAP: MigrateTreeMap, - VizType.DUAL_LINE: MigrateDualLine, VizType.AREA: MigrateAreaChart, + VizType.BUBBLE: MigrateBubbleChart, + VizType.DUAL_LINE: MigrateDualLine, + VizType.LINE: MigrateLineChart, VizType.PIVOT_TABLE: MigratePivotTable, VizType.SUNBURST: MigrateSunburst, + VizType.TREEMAP: MigrateTreeMap, } if is_downgrade: migrations[viz_type].downgrade(db.session) diff --git a/superset/annotation_layers/annotations/commands/__init__.py b/superset/commands/annotation_layer/__init__.py similarity index 100% rename from superset/annotation_layers/annotations/commands/__init__.py rename to superset/commands/annotation_layer/__init__.py diff --git a/superset/annotation_layers/commands/__init__.py b/superset/commands/annotation_layer/annotation/__init__.py similarity index 100% rename from superset/annotation_layers/commands/__init__.py rename to superset/commands/annotation_layer/annotation/__init__.py diff --git a/superset/annotation_layers/annotations/commands/create.py b/superset/commands/annotation_layer/annotation/create.py similarity index 92% rename from superset/annotation_layers/annotations/commands/create.py rename to superset/commands/annotation_layer/annotation/create.py index 25317762dabca..feed6162cacbe 100644 --- a/superset/annotation_layers/annotations/commands/create.py +++ b/superset/commands/annotation_layer/annotation/create.py @@ -21,15 +21,15 @@ from flask_appbuilder.models.sqla 
import Model from marshmallow import ValidationError -from superset.annotation_layers.annotations.commands.exceptions import ( +from superset.commands.annotation_layer.annotation.exceptions import ( AnnotationCreateFailedError, AnnotationDatesValidationError, AnnotationInvalidError, AnnotationUniquenessValidationError, ) -from superset.annotation_layers.commands.exceptions import AnnotationLayerNotFoundError +from superset.commands.annotation_layer.exceptions import AnnotationLayerNotFoundError from superset.commands.base import BaseCommand -from superset.daos.annotation import AnnotationDAO, AnnotationLayerDAO +from superset.daos.annotation_layer import AnnotationDAO, AnnotationLayerDAO from superset.daos.exceptions import DAOCreateFailedError logger = logging.getLogger(__name__) diff --git a/superset/annotation_layers/annotations/commands/delete.py b/superset/commands/annotation_layer/annotation/delete.py similarity index 93% rename from superset/annotation_layers/annotations/commands/delete.py rename to superset/commands/annotation_layer/annotation/delete.py index 2850f8cb96302..3f48ae2ceb120 100644 --- a/superset/annotation_layers/annotations/commands/delete.py +++ b/superset/commands/annotation_layer/annotation/delete.py @@ -17,12 +17,12 @@ import logging from typing import Optional -from superset.annotation_layers.annotations.commands.exceptions import ( +from superset.commands.annotation_layer.annotation.exceptions import ( AnnotationDeleteFailedError, AnnotationNotFoundError, ) from superset.commands.base import BaseCommand -from superset.daos.annotation import AnnotationDAO +from superset.daos.annotation_layer import AnnotationDAO from superset.daos.exceptions import DAODeleteFailedError from superset.models.annotations import Annotation diff --git a/superset/annotation_layers/annotations/commands/exceptions.py b/superset/commands/annotation_layer/annotation/exceptions.py similarity index 100% rename from 
superset/annotation_layers/annotations/commands/exceptions.py rename to superset/commands/annotation_layer/annotation/exceptions.py diff --git a/superset/annotation_layers/annotations/commands/update.py b/superset/commands/annotation_layer/annotation/update.py similarity index 93% rename from superset/annotation_layers/annotations/commands/update.py rename to superset/commands/annotation_layer/annotation/update.py index 76287d24a99db..9ba07fdcd68d2 100644 --- a/superset/annotation_layers/annotations/commands/update.py +++ b/superset/commands/annotation_layer/annotation/update.py @@ -21,16 +21,16 @@ from flask_appbuilder.models.sqla import Model from marshmallow import ValidationError -from superset.annotation_layers.annotations.commands.exceptions import ( +from superset.commands.annotation_layer.annotation.exceptions import ( AnnotationDatesValidationError, AnnotationInvalidError, AnnotationNotFoundError, AnnotationUniquenessValidationError, AnnotationUpdateFailedError, ) -from superset.annotation_layers.commands.exceptions import AnnotationLayerNotFoundError +from superset.commands.annotation_layer.exceptions import AnnotationLayerNotFoundError from superset.commands.base import BaseCommand -from superset.daos.annotation import AnnotationDAO, AnnotationLayerDAO +from superset.daos.annotation_layer import AnnotationDAO, AnnotationLayerDAO from superset.daos.exceptions import DAOUpdateFailedError from superset.models.annotations import Annotation diff --git a/superset/annotation_layers/commands/create.py b/superset/commands/annotation_layer/create.py similarity index 94% rename from superset/annotation_layers/commands/create.py rename to superset/commands/annotation_layer/create.py index 39ce752d2a1ad..6b87ad570363a 100644 --- a/superset/annotation_layers/commands/create.py +++ b/superset/commands/annotation_layer/create.py @@ -20,13 +20,13 @@ from flask_appbuilder.models.sqla import Model from marshmallow import ValidationError -from 
superset.annotation_layers.commands.exceptions import ( +from superset.commands.annotation_layer.exceptions import ( AnnotationLayerCreateFailedError, AnnotationLayerInvalidError, AnnotationLayerNameUniquenessValidationError, ) from superset.commands.base import BaseCommand -from superset.daos.annotation import AnnotationLayerDAO +from superset.daos.annotation_layer import AnnotationLayerDAO from superset.daos.exceptions import DAOCreateFailedError logger = logging.getLogger(__name__) diff --git a/superset/annotation_layers/commands/delete.py b/superset/commands/annotation_layer/delete.py similarity index 94% rename from superset/annotation_layers/commands/delete.py rename to superset/commands/annotation_layer/delete.py index 41c727054bd7f..a75ee42b772e0 100644 --- a/superset/annotation_layers/commands/delete.py +++ b/superset/commands/annotation_layer/delete.py @@ -17,13 +17,13 @@ import logging from typing import Optional -from superset.annotation_layers.commands.exceptions import ( +from superset.commands.annotation_layer.exceptions import ( AnnotationLayerDeleteFailedError, AnnotationLayerDeleteIntegrityError, AnnotationLayerNotFoundError, ) from superset.commands.base import BaseCommand -from superset.daos.annotation import AnnotationLayerDAO +from superset.daos.annotation_layer import AnnotationLayerDAO from superset.daos.exceptions import DAODeleteFailedError from superset.models.annotations import AnnotationLayer diff --git a/superset/annotation_layers/commands/exceptions.py b/superset/commands/annotation_layer/exceptions.py similarity index 100% rename from superset/annotation_layers/commands/exceptions.py rename to superset/commands/annotation_layer/exceptions.py diff --git a/superset/annotation_layers/commands/update.py b/superset/commands/annotation_layer/update.py similarity index 95% rename from superset/annotation_layers/commands/update.py rename to superset/commands/annotation_layer/update.py index e7f6963e820c4..d15440882b155 100644 --- 
a/superset/annotation_layers/commands/update.py +++ b/superset/commands/annotation_layer/update.py @@ -20,14 +20,14 @@ from flask_appbuilder.models.sqla import Model from marshmallow import ValidationError -from superset.annotation_layers.commands.exceptions import ( +from superset.commands.annotation_layer.exceptions import ( AnnotationLayerInvalidError, AnnotationLayerNameUniquenessValidationError, AnnotationLayerNotFoundError, AnnotationLayerUpdateFailedError, ) from superset.commands.base import BaseCommand -from superset.daos.annotation import AnnotationLayerDAO +from superset.daos.annotation_layer import AnnotationLayerDAO from superset.daos.exceptions import DAOUpdateFailedError from superset.models.annotations import AnnotationLayer diff --git a/superset/charts/commands/__init__.py b/superset/commands/chart/__init__.py similarity index 100% rename from superset/charts/commands/__init__.py rename to superset/commands/chart/__init__.py diff --git a/superset/charts/commands/create.py b/superset/commands/chart/create.py similarity index 98% rename from superset/charts/commands/create.py rename to superset/commands/chart/create.py index 876073e33543c..2b251029c3f38 100644 --- a/superset/charts/commands/create.py +++ b/superset/commands/chart/create.py @@ -23,13 +23,13 @@ from marshmallow import ValidationError from superset import security_manager -from superset.charts.commands.exceptions import ( +from superset.commands.base import BaseCommand, CreateMixin +from superset.commands.chart.exceptions import ( ChartCreateFailedError, ChartInvalidError, DashboardsForbiddenError, DashboardsNotFoundValidationError, ) -from superset.commands.base import BaseCommand, CreateMixin from superset.commands.utils import get_datasource_by_id from superset.daos.chart import ChartDAO from superset.daos.dashboard import DashboardDAO diff --git a/superset/charts/commands/importers/__init__.py b/superset/commands/chart/data/__init__.py similarity index 100% rename from 
superset/charts/commands/importers/__init__.py rename to superset/commands/chart/data/__init__.py diff --git a/superset/charts/data/commands/create_async_job_command.py b/superset/commands/chart/data/create_async_job_command.py similarity index 100% rename from superset/charts/data/commands/create_async_job_command.py rename to superset/commands/chart/data/create_async_job_command.py diff --git a/superset/charts/data/commands/get_data_command.py b/superset/commands/chart/data/get_data_command.py similarity index 97% rename from superset/charts/data/commands/get_data_command.py rename to superset/commands/chart/data/get_data_command.py index c791ace9de3ee..971c343cba4e8 100644 --- a/superset/charts/data/commands/get_data_command.py +++ b/superset/commands/chart/data/get_data_command.py @@ -19,11 +19,11 @@ from flask_babel import gettext as _ -from superset.charts.commands.exceptions import ( +from superset.commands.base import BaseCommand +from superset.commands.chart.exceptions import ( ChartDataCacheLoadError, ChartDataQueryFailedError, ) -from superset.commands.base import BaseCommand from superset.common.query_context import QueryContext from superset.exceptions import CacheLoadError diff --git a/superset/charts/commands/delete.py b/superset/commands/chart/delete.py similarity index 98% rename from superset/charts/commands/delete.py rename to superset/commands/chart/delete.py index a31d22be3e159..ee635f04af99a 100644 --- a/superset/charts/commands/delete.py +++ b/superset/commands/chart/delete.py @@ -20,13 +20,13 @@ from flask_babel import lazy_gettext as _ from superset import security_manager -from superset.charts.commands.exceptions import ( +from superset.commands.base import BaseCommand +from superset.commands.chart.exceptions import ( ChartDeleteFailedError, ChartDeleteFailedReportsExistError, ChartForbiddenError, ChartNotFoundError, ) -from superset.commands.base import BaseCommand from superset.daos.chart import ChartDAO from superset.daos.exceptions 
import DAODeleteFailedError from superset.daos.report import ReportScheduleDAO diff --git a/superset/charts/commands/exceptions.py b/superset/commands/chart/exceptions.py similarity index 100% rename from superset/charts/commands/exceptions.py rename to superset/commands/chart/exceptions.py diff --git a/superset/charts/commands/export.py b/superset/commands/chart/export.py similarity index 95% rename from superset/charts/commands/export.py rename to superset/commands/chart/export.py index c942aa96c9a69..fcb721c7032db 100644 --- a/superset/charts/commands/export.py +++ b/superset/commands/chart/export.py @@ -22,9 +22,9 @@ import yaml -from superset.charts.commands.exceptions import ChartNotFoundError +from superset.commands.chart.exceptions import ChartNotFoundError from superset.daos.chart import ChartDAO -from superset.datasets.commands.export import ExportDatasetsCommand +from superset.commands.dataset.export import ExportDatasetsCommand from superset.commands.export.models import ExportModelsCommand from superset.models.slice import Slice from superset.utils.dict_import_export import EXPORT_VERSION diff --git a/superset/charts/data/commands/__init__.py b/superset/commands/chart/importers/__init__.py similarity index 100% rename from superset/charts/data/commands/__init__.py rename to superset/commands/chart/importers/__init__.py diff --git a/superset/charts/commands/importers/dispatcher.py b/superset/commands/chart/importers/dispatcher.py similarity index 98% rename from superset/charts/commands/importers/dispatcher.py rename to superset/commands/chart/importers/dispatcher.py index fb5007a50ca29..6d2d31ccf4d77 100644 --- a/superset/charts/commands/importers/dispatcher.py +++ b/superset/commands/chart/importers/dispatcher.py @@ -20,8 +20,8 @@ from marshmallow.exceptions import ValidationError -from superset.charts.commands.importers import v1 from superset.commands.base import BaseCommand +from superset.commands.chart.importers import v1 from 
superset.commands.exceptions import CommandInvalidError from superset.commands.importers.exceptions import IncorrectVersionError diff --git a/superset/charts/commands/importers/v1/__init__.py b/superset/commands/chart/importers/v1/__init__.py similarity index 93% rename from superset/charts/commands/importers/v1/__init__.py rename to superset/commands/chart/importers/v1/__init__.py index 043018fa3b18a..783f300c074f7 100644 --- a/superset/charts/commands/importers/v1/__init__.py +++ b/superset/commands/chart/importers/v1/__init__.py @@ -20,15 +20,15 @@ from marshmallow import Schema from sqlalchemy.orm import Session -from superset.charts.commands.exceptions import ChartImportError -from superset.charts.commands.importers.v1.utils import import_chart from superset.charts.schemas import ImportV1ChartSchema +from superset.commands.chart.exceptions import ChartImportError +from superset.commands.chart.importers.v1.utils import import_chart +from superset.commands.database.importers.v1.utils import import_database +from superset.commands.dataset.importers.v1.utils import import_dataset from superset.commands.importers.v1 import ImportModelsCommand from superset.connectors.sqla.models import SqlaTable from superset.daos.chart import ChartDAO -from superset.databases.commands.importers.v1.utils import import_database from superset.databases.schemas import ImportV1DatabaseSchema -from superset.datasets.commands.importers.v1.utils import import_dataset from superset.datasets.schemas import ImportV1DatasetSchema diff --git a/superset/charts/commands/importers/v1/utils.py b/superset/commands/chart/importers/v1/utils.py similarity index 98% rename from superset/charts/commands/importers/v1/utils.py rename to superset/commands/chart/importers/v1/utils.py index 3ef0a2ed78b49..d27b631f97fde 100644 --- a/superset/charts/commands/importers/v1/utils.py +++ b/superset/commands/chart/importers/v1/utils.py @@ -75,7 +75,6 @@ def migrate_chart(config: dict[str, Any]) -> dict[str, Any]: 
if isclass(class_) and issubclass(class_, MigrateViz) and hasattr(class_, "source_viz_type") - and class_ != processors.MigrateAreaChart # incomplete } output = copy.deepcopy(config) diff --git a/superset/charts/commands/update.py b/superset/commands/chart/update.py similarity index 98% rename from superset/charts/commands/update.py rename to superset/commands/chart/update.py index 32fd49e7cd4a1..40b36ebcc521e 100644 --- a/superset/charts/commands/update.py +++ b/superset/commands/chart/update.py @@ -23,7 +23,8 @@ from marshmallow import ValidationError from superset import security_manager -from superset.charts.commands.exceptions import ( +from superset.commands.base import BaseCommand, UpdateMixin +from superset.commands.chart.exceptions import ( ChartForbiddenError, ChartInvalidError, ChartNotFoundError, @@ -31,7 +32,6 @@ DashboardsNotFoundValidationError, DatasourceTypeUpdateRequiredValidationError, ) -from superset.commands.base import BaseCommand, UpdateMixin from superset.commands.utils import get_datasource_by_id from superset.daos.chart import ChartDAO from superset.daos.dashboard import DashboardDAO diff --git a/superset/charts/commands/warm_up_cache.py b/superset/commands/chart/warm_up_cache.py similarity index 96% rename from superset/charts/commands/warm_up_cache.py rename to superset/commands/chart/warm_up_cache.py index a684ee5e77778..2e5c0ac3a3aed 100644 --- a/superset/charts/commands/warm_up_cache.py +++ b/superset/commands/chart/warm_up_cache.py @@ -21,12 +21,12 @@ import simplejson as json from flask import g -from superset.charts.commands.exceptions import ( +from superset.commands.base import BaseCommand +from superset.commands.chart.data.get_data_command import ChartDataCommand +from superset.commands.chart.exceptions import ( ChartInvalidError, WarmUpCacheChartNotFoundError, ) -from superset.charts.data.commands.get_data_command import ChartDataCommand -from superset.commands.base import BaseCommand from superset.extensions import db from 
superset.models.slice import Slice from superset.utils.core import error_msg_from_exception diff --git a/superset/connectors/base/__init__.py b/superset/commands/css/__init__.py similarity index 100% rename from superset/connectors/base/__init__.py rename to superset/commands/css/__init__.py diff --git a/superset/css_templates/commands/delete.py b/superset/commands/css/delete.py similarity index 97% rename from superset/css_templates/commands/delete.py rename to superset/commands/css/delete.py index 123658cb45acc..b8362f6b464dd 100644 --- a/superset/css_templates/commands/delete.py +++ b/superset/commands/css/delete.py @@ -18,7 +18,7 @@ from typing import Optional from superset.commands.base import BaseCommand -from superset.css_templates.commands.exceptions import ( +from superset.commands.css.exceptions import ( CssTemplateDeleteFailedError, CssTemplateNotFoundError, ) diff --git a/superset/css_templates/commands/exceptions.py b/superset/commands/css/exceptions.py similarity index 100% rename from superset/css_templates/commands/exceptions.py rename to superset/commands/css/exceptions.py diff --git a/superset/css_templates/commands/__init__.py b/superset/commands/dashboard/__init__.py similarity index 100% rename from superset/css_templates/commands/__init__.py rename to superset/commands/dashboard/__init__.py diff --git a/superset/dashboards/commands/create.py b/superset/commands/dashboard/create.py similarity index 98% rename from superset/dashboards/commands/create.py rename to superset/commands/dashboard/create.py index 4b5cd5fb04b55..1745391238d75 100644 --- a/superset/dashboards/commands/create.py +++ b/superset/commands/dashboard/create.py @@ -21,14 +21,14 @@ from marshmallow import ValidationError from superset.commands.base import BaseCommand, CreateMixin -from superset.commands.utils import populate_roles -from superset.daos.dashboard import DashboardDAO -from superset.daos.exceptions import DAOCreateFailedError -from 
superset.dashboards.commands.exceptions import ( +from superset.commands.dashboard.exceptions import ( DashboardCreateFailedError, DashboardInvalidError, DashboardSlugExistsValidationError, ) +from superset.commands.utils import populate_roles +from superset.daos.dashboard import DashboardDAO +from superset.daos.exceptions import DAOCreateFailedError logger = logging.getLogger(__name__) diff --git a/superset/dashboards/commands/delete.py b/superset/commands/dashboard/delete.py similarity index 98% rename from superset/dashboards/commands/delete.py rename to superset/commands/dashboard/delete.py index 7111758bb829b..13ffcb443c675 100644 --- a/superset/dashboards/commands/delete.py +++ b/superset/commands/dashboard/delete.py @@ -21,15 +21,15 @@ from superset import security_manager from superset.commands.base import BaseCommand -from superset.daos.dashboard import DashboardDAO -from superset.daos.exceptions import DAODeleteFailedError -from superset.daos.report import ReportScheduleDAO -from superset.dashboards.commands.exceptions import ( +from superset.commands.dashboard.exceptions import ( DashboardDeleteFailedError, DashboardDeleteFailedReportsExistError, DashboardForbiddenError, DashboardNotFoundError, ) +from superset.daos.dashboard import DashboardDAO +from superset.daos.exceptions import DAODeleteFailedError +from superset.daos.report import ReportScheduleDAO from superset.exceptions import SupersetSecurityException from superset.models.dashboard import Dashboard diff --git a/superset/dashboards/commands/__init__.py b/superset/commands/dashboard/embedded/__init__.py similarity index 100% rename from superset/dashboards/commands/__init__.py rename to superset/commands/dashboard/embedded/__init__.py diff --git a/superset/embedded_dashboard/commands/exceptions.py b/superset/commands/dashboard/embedded/exceptions.py similarity index 100% rename from superset/embedded_dashboard/commands/exceptions.py rename to superset/commands/dashboard/embedded/exceptions.py 
diff --git a/superset/dashboards/commands/exceptions.py b/superset/commands/dashboard/exceptions.py similarity index 100% rename from superset/dashboards/commands/exceptions.py rename to superset/commands/dashboard/exceptions.py diff --git a/superset/dashboards/commands/export.py b/superset/commands/dashboard/export.py similarity index 95% rename from superset/dashboards/commands/export.py rename to superset/commands/dashboard/export.py index 4e25e5c1fc1d7..fd06c60fa06c0 100644 --- a/superset/dashboards/commands/export.py +++ b/superset/commands/dashboard/export.py @@ -25,12 +25,12 @@ import yaml -from superset.charts.commands.export import ExportChartsCommand -from superset.dashboards.commands.exceptions import DashboardNotFoundError -from superset.dashboards.commands.importers.v1.utils import find_chart_uuids +from superset.commands.chart.export import ExportChartsCommand +from superset.commands.dashboard.exceptions import DashboardNotFoundError +from superset.commands.dashboard.importers.v1.utils import find_chart_uuids from superset.daos.dashboard import DashboardDAO from superset.commands.export.models import ExportModelsCommand -from superset.datasets.commands.export import ExportDatasetsCommand +from superset.commands.dataset.export import ExportDatasetsCommand from superset.daos.dataset import DatasetDAO from superset.models.dashboard import Dashboard from superset.models.slice import Slice diff --git a/superset/dashboards/commands/importers/__init__.py b/superset/commands/dashboard/filter_set/__init__.py similarity index 100% rename from superset/dashboards/commands/importers/__init__.py rename to superset/commands/dashboard/filter_set/__init__.py diff --git a/superset/dashboards/filter_sets/commands/base.py b/superset/commands/dashboard/filter_set/base.py similarity index 96% rename from superset/dashboards/filter_sets/commands/base.py rename to superset/commands/dashboard/filter_set/base.py index 8c53e8a818a52..24abe2509acba 100644 --- 
a/superset/dashboards/filter_sets/commands/base.py +++ b/superset/commands/dashboard/filter_set/base.py @@ -20,13 +20,13 @@ from flask_appbuilder.models.sqla import Model from superset import security_manager -from superset.common.not_authorized_object import NotAuthorizedException -from superset.daos.dashboard import DashboardDAO -from superset.dashboards.commands.exceptions import DashboardNotFoundError -from superset.dashboards.filter_sets.commands.exceptions import ( +from superset.commands.dashboard.exceptions import DashboardNotFoundError +from superset.commands.dashboard.filter_set.exceptions import ( FilterSetForbiddenError, FilterSetNotFoundError, ) +from superset.common.not_authorized_object import NotAuthorizedException +from superset.daos.dashboard import DashboardDAO from superset.dashboards.filter_sets.consts import USER_OWNER_TYPE from superset.models.dashboard import Dashboard from superset.models.filter_set import FilterSet diff --git a/superset/dashboards/filter_sets/commands/create.py b/superset/commands/dashboard/filter_set/create.py similarity index 95% rename from superset/dashboards/filter_sets/commands/create.py rename to superset/commands/dashboard/filter_set/create.py index d254e86d3cbbd..49edb3172e28a 100644 --- a/superset/dashboards/filter_sets/commands/create.py +++ b/superset/commands/dashboard/filter_set/create.py @@ -20,13 +20,13 @@ from flask_appbuilder.models.sqla import Model from superset import security_manager -from superset.daos.dashboard import FilterSetDAO -from superset.dashboards.filter_sets.commands.base import BaseFilterSetCommand -from superset.dashboards.filter_sets.commands.exceptions import ( +from superset.commands.dashboard.filter_set.base import BaseFilterSetCommand +from superset.commands.dashboard.filter_set.exceptions import ( DashboardIdInconsistencyError, FilterSetCreateFailedError, UserIsNotDashboardOwnerError, ) +from superset.daos.dashboard import FilterSetDAO from superset.dashboards.filter_sets.consts 
import ( DASHBOARD_ID_FIELD, DASHBOARD_OWNER_TYPE, diff --git a/superset/dashboards/filter_sets/commands/delete.py b/superset/commands/dashboard/filter_set/delete.py similarity index 90% rename from superset/dashboards/filter_sets/commands/delete.py rename to superset/commands/dashboard/filter_set/delete.py index edde4b9b459c7..ce2bf6fce49cf 100644 --- a/superset/dashboards/filter_sets/commands/delete.py +++ b/superset/commands/dashboard/filter_set/delete.py @@ -16,14 +16,14 @@ # under the License. import logging -from superset.daos.dashboard import FilterSetDAO -from superset.daos.exceptions import DAODeleteFailedError -from superset.dashboards.filter_sets.commands.base import BaseFilterSetCommand -from superset.dashboards.filter_sets.commands.exceptions import ( +from superset.commands.dashboard.filter_set.base import BaseFilterSetCommand +from superset.commands.dashboard.filter_set.exceptions import ( FilterSetDeleteFailedError, FilterSetForbiddenError, FilterSetNotFoundError, ) +from superset.daos.dashboard import FilterSetDAO +from superset.daos.exceptions import DAODeleteFailedError logger = logging.getLogger(__name__) @@ -38,7 +38,7 @@ def run(self) -> None: assert self._filter_set try: - FilterSetDAO.delete(self._filter_set) + FilterSetDAO.delete([self._filter_set]) except DAODeleteFailedError as err: raise FilterSetDeleteFailedError(str(self._filter_set_id), "") from err diff --git a/superset/dashboards/filter_sets/commands/exceptions.py b/superset/commands/dashboard/filter_set/exceptions.py similarity index 100% rename from superset/dashboards/filter_sets/commands/exceptions.py rename to superset/commands/dashboard/filter_set/exceptions.py diff --git a/superset/dashboards/filter_sets/commands/update.py b/superset/commands/dashboard/filter_set/update.py similarity index 91% rename from superset/dashboards/filter_sets/commands/update.py rename to superset/commands/dashboard/filter_set/update.py index a63c8d46f2b6b..5ce9f1fea63ac 100644 --- 
a/superset/dashboards/filter_sets/commands/update.py +++ b/superset/commands/dashboard/filter_set/update.py @@ -19,12 +19,10 @@ from flask_appbuilder.models.sqla import Model +from superset.commands.dashboard.filter_set.base import BaseFilterSetCommand +from superset.commands.dashboard.filter_set.exceptions import FilterSetUpdateFailedError from superset.daos.dashboard import FilterSetDAO from superset.daos.exceptions import DAOUpdateFailedError -from superset.dashboards.filter_sets.commands.base import BaseFilterSetCommand -from superset.dashboards.filter_sets.commands.exceptions import ( - FilterSetUpdateFailedError, -) from superset.dashboards.filter_sets.consts import OWNER_ID_FIELD, OWNER_TYPE_FIELD logger = logging.getLogger(__name__) diff --git a/superset/dashboards/filter_sets/commands/__init__.py b/superset/commands/dashboard/filter_state/__init__.py similarity index 100% rename from superset/dashboards/filter_sets/commands/__init__.py rename to superset/commands/dashboard/filter_state/__init__.py diff --git a/superset/dashboards/filter_state/commands/create.py b/superset/commands/dashboard/filter_state/create.py similarity index 87% rename from superset/dashboards/filter_state/commands/create.py rename to superset/commands/dashboard/filter_state/create.py index 48b5e4f5c2d2e..1f105ac5c27f4 100644 --- a/superset/dashboards/filter_state/commands/create.py +++ b/superset/commands/dashboard/filter_state/create.py @@ -18,12 +18,12 @@ from flask import session -from superset.dashboards.filter_state.commands.utils import check_access +from superset.commands.dashboard.filter_state.utils import check_access +from superset.commands.temporary_cache.create import CreateTemporaryCacheCommand +from superset.commands.temporary_cache.entry import Entry +from superset.commands.temporary_cache.parameters import CommandParameters from superset.extensions import cache_manager from superset.key_value.utils import random_key -from superset.temporary_cache.commands.create 
import CreateTemporaryCacheCommand -from superset.temporary_cache.commands.entry import Entry -from superset.temporary_cache.commands.parameters import CommandParameters from superset.temporary_cache.utils import cache_key from superset.utils.core import get_user_id diff --git a/superset/dashboards/filter_state/commands/delete.py b/superset/commands/dashboard/filter_state/delete.py similarity index 84% rename from superset/dashboards/filter_state/commands/delete.py rename to superset/commands/dashboard/filter_state/delete.py index 6086388a8ce44..8be7f44d9876f 100644 --- a/superset/dashboards/filter_state/commands/delete.py +++ b/superset/commands/dashboard/filter_state/delete.py @@ -16,12 +16,12 @@ # under the License. from flask import session -from superset.dashboards.filter_state.commands.utils import check_access +from superset.commands.dashboard.filter_state.utils import check_access +from superset.commands.temporary_cache.delete import DeleteTemporaryCacheCommand +from superset.commands.temporary_cache.entry import Entry +from superset.commands.temporary_cache.exceptions import TemporaryCacheAccessDeniedError +from superset.commands.temporary_cache.parameters import CommandParameters from superset.extensions import cache_manager -from superset.temporary_cache.commands.delete import DeleteTemporaryCacheCommand -from superset.temporary_cache.commands.entry import Entry -from superset.temporary_cache.commands.exceptions import TemporaryCacheAccessDeniedError -from superset.temporary_cache.commands.parameters import CommandParameters from superset.temporary_cache.utils import cache_key from superset.utils.core import get_user_id diff --git a/superset/dashboards/filter_state/commands/get.py b/superset/commands/dashboard/filter_state/get.py similarity index 89% rename from superset/dashboards/filter_state/commands/get.py rename to superset/commands/dashboard/filter_state/get.py index ca7ffa9879a9f..29104b5ee2d4b 100644 --- 
a/superset/dashboards/filter_state/commands/get.py +++ b/superset/commands/dashboard/filter_state/get.py @@ -18,10 +18,10 @@ from flask import current_app as app -from superset.dashboards.filter_state.commands.utils import check_access +from superset.commands.dashboard.filter_state.utils import check_access +from superset.commands.temporary_cache.get import GetTemporaryCacheCommand +from superset.commands.temporary_cache.parameters import CommandParameters from superset.extensions import cache_manager -from superset.temporary_cache.commands.get import GetTemporaryCacheCommand -from superset.temporary_cache.commands.parameters import CommandParameters from superset.temporary_cache.utils import cache_key diff --git a/superset/dashboards/filter_state/commands/update.py b/superset/commands/dashboard/filter_state/update.py similarity index 87% rename from superset/dashboards/filter_state/commands/update.py rename to superset/commands/dashboard/filter_state/update.py index c1dc529ccff58..80b8c26ede0e2 100644 --- a/superset/dashboards/filter_state/commands/update.py +++ b/superset/commands/dashboard/filter_state/update.py @@ -18,13 +18,13 @@ from flask import session -from superset.dashboards.filter_state.commands.utils import check_access +from superset.commands.dashboard.filter_state.utils import check_access +from superset.commands.temporary_cache.entry import Entry +from superset.commands.temporary_cache.exceptions import TemporaryCacheAccessDeniedError +from superset.commands.temporary_cache.parameters import CommandParameters +from superset.commands.temporary_cache.update import UpdateTemporaryCacheCommand from superset.extensions import cache_manager from superset.key_value.utils import random_key -from superset.temporary_cache.commands.entry import Entry -from superset.temporary_cache.commands.exceptions import TemporaryCacheAccessDeniedError -from superset.temporary_cache.commands.parameters import CommandParameters -from superset.temporary_cache.commands.update 
import UpdateTemporaryCacheCommand from superset.temporary_cache.utils import cache_key from superset.utils.core import get_user_id diff --git a/superset/dashboards/filter_state/commands/utils.py b/superset/commands/dashboard/filter_state/utils.py similarity index 91% rename from superset/dashboards/filter_state/commands/utils.py rename to superset/commands/dashboard/filter_state/utils.py index 7e52518249fcb..14f7eb789373a 100644 --- a/superset/dashboards/filter_state/commands/utils.py +++ b/superset/commands/dashboard/filter_state/utils.py @@ -15,15 +15,15 @@ # specific language governing permissions and limitations # under the License. -from superset.daos.dashboard import DashboardDAO -from superset.dashboards.commands.exceptions import ( +from superset.commands.dashboard.exceptions import ( DashboardAccessDeniedError, DashboardNotFoundError, ) -from superset.temporary_cache.commands.exceptions import ( +from superset.commands.temporary_cache.exceptions import ( TemporaryCacheAccessDeniedError, TemporaryCacheResourceNotFoundError, ) +from superset.daos.dashboard import DashboardDAO def check_access(resource_id: int) -> None: diff --git a/superset/dashboards/filter_state/commands/__init__.py b/superset/commands/dashboard/importers/__init__.py similarity index 100% rename from superset/dashboards/filter_state/commands/__init__.py rename to superset/commands/dashboard/importers/__init__.py diff --git a/superset/dashboards/commands/importers/dispatcher.py b/superset/commands/dashboard/importers/dispatcher.py similarity index 97% rename from superset/dashboards/commands/importers/dispatcher.py rename to superset/commands/dashboard/importers/dispatcher.py index d5323b4fe4dd1..061558cce95fe 100644 --- a/superset/dashboards/commands/importers/dispatcher.py +++ b/superset/commands/dashboard/importers/dispatcher.py @@ -21,9 +21,9 @@ from marshmallow.exceptions import ValidationError from superset.commands.base import BaseCommand +from superset.commands.dashboard.importers 
import v0, v1 from superset.commands.exceptions import CommandInvalidError from superset.commands.importers.exceptions import IncorrectVersionError -from superset.dashboards.commands.importers import v0, v1 logger = logging.getLogger(__name__) diff --git a/superset/dashboards/commands/importers/v0.py b/superset/commands/dashboard/importers/v0.py similarity index 99% rename from superset/dashboards/commands/importers/v0.py rename to superset/commands/dashboard/importers/v0.py index 012dbbc5c9663..4c2a18e5cc694 100644 --- a/superset/dashboards/commands/importers/v0.py +++ b/superset/commands/dashboard/importers/v0.py @@ -26,8 +26,8 @@ from superset import db from superset.commands.base import BaseCommand +from superset.commands.dataset.importers.v0 import import_dataset from superset.connectors.sqla.models import SqlaTable, SqlMetric, TableColumn -from superset.datasets.commands.importers.v0 import import_dataset from superset.exceptions import DashboardImportException from superset.models.dashboard import Dashboard from superset.models.slice import Slice diff --git a/superset/dashboards/commands/importers/v1/__init__.py b/superset/commands/dashboard/importers/v1/__init__.py similarity index 94% rename from superset/dashboards/commands/importers/v1/__init__.py rename to superset/commands/dashboard/importers/v1/__init__.py index 30e63da4e4a95..2717650e9e31b 100644 --- a/superset/dashboards/commands/importers/v1/__init__.py +++ b/superset/commands/dashboard/importers/v1/__init__.py @@ -21,21 +21,21 @@ from sqlalchemy.orm import Session from sqlalchemy.sql import select -from superset.charts.commands.importers.v1.utils import import_chart from superset.charts.schemas import ImportV1ChartSchema -from superset.commands.importers.v1 import ImportModelsCommand -from superset.daos.dashboard import DashboardDAO -from superset.dashboards.commands.exceptions import DashboardImportError -from superset.dashboards.commands.importers.v1.utils import ( +from 
superset.commands.chart.importers.v1.utils import import_chart +from superset.commands.dashboard.exceptions import DashboardImportError +from superset.commands.dashboard.importers.v1.utils import ( find_chart_uuids, find_native_filter_datasets, import_dashboard, update_id_refs, ) +from superset.commands.database.importers.v1.utils import import_database +from superset.commands.dataset.importers.v1.utils import import_dataset +from superset.commands.importers.v1 import ImportModelsCommand +from superset.daos.dashboard import DashboardDAO from superset.dashboards.schemas import ImportV1DashboardSchema -from superset.databases.commands.importers.v1.utils import import_database from superset.databases.schemas import ImportV1DatabaseSchema -from superset.datasets.commands.importers.v1.utils import import_dataset from superset.datasets.schemas import ImportV1DatasetSchema from superset.models.dashboard import dashboard_slices diff --git a/superset/dashboards/commands/importers/v1/utils.py b/superset/commands/dashboard/importers/v1/utils.py similarity index 100% rename from superset/dashboards/commands/importers/v1/utils.py rename to superset/commands/dashboard/importers/v1/utils.py diff --git a/superset/dashboards/permalink/commands/__init__.py b/superset/commands/dashboard/permalink/__init__.py similarity index 100% rename from superset/dashboards/permalink/commands/__init__.py rename to superset/commands/dashboard/permalink/__init__.py diff --git a/superset/dashboards/permalink/commands/base.py b/superset/commands/dashboard/permalink/base.py similarity index 100% rename from superset/dashboards/permalink/commands/base.py rename to superset/commands/dashboard/permalink/base.py diff --git a/superset/dashboards/permalink/commands/create.py b/superset/commands/dashboard/permalink/create.py similarity index 94% rename from superset/dashboards/permalink/commands/create.py rename to superset/commands/dashboard/permalink/create.py index 320003ff3da3b..3387d432d5e02 100644 --- 
a/superset/dashboards/permalink/commands/create.py +++ b/superset/commands/dashboard/permalink/create.py @@ -18,11 +18,11 @@ from sqlalchemy.exc import SQLAlchemyError +from superset.commands.dashboard.permalink.base import BaseDashboardPermalinkCommand +from superset.commands.key_value.upsert import UpsertKeyValueCommand from superset.daos.dashboard import DashboardDAO -from superset.dashboards.permalink.commands.base import BaseDashboardPermalinkCommand from superset.dashboards.permalink.exceptions import DashboardPermalinkCreateFailedError from superset.dashboards.permalink.types import DashboardPermalinkState -from superset.key_value.commands.upsert import UpsertKeyValueCommand from superset.key_value.exceptions import KeyValueCodecEncodeException from superset.key_value.utils import encode_permalink_key, get_deterministic_uuid from superset.utils.core import get_user_id diff --git a/superset/dashboards/permalink/commands/get.py b/superset/commands/dashboard/permalink/get.py similarity index 91% rename from superset/dashboards/permalink/commands/get.py rename to superset/commands/dashboard/permalink/get.py index 6b32a459a594b..32efa688813ce 100644 --- a/superset/dashboards/permalink/commands/get.py +++ b/superset/commands/dashboard/permalink/get.py @@ -19,12 +19,12 @@ from sqlalchemy.exc import SQLAlchemyError +from superset.commands.dashboard.exceptions import DashboardNotFoundError +from superset.commands.dashboard.permalink.base import BaseDashboardPermalinkCommand +from superset.commands.key_value.get import GetKeyValueCommand from superset.daos.dashboard import DashboardDAO -from superset.dashboards.commands.exceptions import DashboardNotFoundError -from superset.dashboards.permalink.commands.base import BaseDashboardPermalinkCommand from superset.dashboards.permalink.exceptions import DashboardPermalinkGetFailedError from superset.dashboards.permalink.types import DashboardPermalinkValue -from superset.key_value.commands.get import GetKeyValueCommand from 
superset.key_value.exceptions import ( KeyValueCodecDecodeException, KeyValueGetFailedError, diff --git a/superset/dashboards/commands/update.py b/superset/commands/dashboard/update.py similarity index 98% rename from superset/dashboards/commands/update.py rename to superset/commands/dashboard/update.py index f9975c0dd2f47..22dcad4b2c86b 100644 --- a/superset/dashboards/commands/update.py +++ b/superset/commands/dashboard/update.py @@ -23,16 +23,16 @@ from superset import security_manager from superset.commands.base import BaseCommand, UpdateMixin -from superset.commands.utils import populate_roles -from superset.daos.dashboard import DashboardDAO -from superset.daos.exceptions import DAOUpdateFailedError -from superset.dashboards.commands.exceptions import ( +from superset.commands.dashboard.exceptions import ( DashboardForbiddenError, DashboardInvalidError, DashboardNotFoundError, DashboardSlugExistsValidationError, DashboardUpdateFailedError, ) +from superset.commands.utils import populate_roles +from superset.daos.dashboard import DashboardDAO +from superset.daos.exceptions import DAOUpdateFailedError from superset.exceptions import SupersetSecurityException from superset.extensions import db from superset.models.dashboard import Dashboard diff --git a/superset/databases/commands/__init__.py b/superset/commands/database/__init__.py similarity index 100% rename from superset/databases/commands/__init__.py rename to superset/commands/database/__init__.py diff --git a/superset/databases/commands/create.py b/superset/commands/database/create.py similarity index 95% rename from superset/databases/commands/create.py rename to superset/commands/database/create.py index d3dfe59e5e7d5..a012e9b2a5768 100644 --- a/superset/databases/commands/create.py +++ b/superset/commands/database/create.py @@ -23,22 +23,22 @@ from superset import is_feature_enabled from superset.commands.base import BaseCommand -from superset.daos.database import DatabaseDAO -from 
superset.daos.exceptions import DAOCreateFailedError -from superset.databases.commands.exceptions import ( +from superset.commands.database.exceptions import ( DatabaseConnectionFailedError, DatabaseCreateFailedError, DatabaseExistsValidationError, DatabaseInvalidError, DatabaseRequiredFieldValidationError, ) -from superset.databases.commands.test_connection import TestConnectionDatabaseCommand -from superset.databases.ssh_tunnel.commands.create import CreateSSHTunnelCommand -from superset.databases.ssh_tunnel.commands.exceptions import ( +from superset.commands.database.ssh_tunnel.create import CreateSSHTunnelCommand +from superset.commands.database.ssh_tunnel.exceptions import ( SSHTunnelCreateFailedError, SSHTunnelingNotEnabledError, SSHTunnelInvalidError, ) +from superset.commands.database.test_connection import TestConnectionDatabaseCommand +from superset.daos.database import DatabaseDAO +from superset.daos.exceptions import DAOCreateFailedError from superset.exceptions import SupersetErrorsException from superset.extensions import db, event_logger, security_manager diff --git a/superset/databases/commands/delete.py b/superset/commands/database/delete.py similarity index 96% rename from superset/databases/commands/delete.py rename to superset/commands/database/delete.py index 254380a906040..2db408c76e661 100644 --- a/superset/databases/commands/delete.py +++ b/superset/commands/database/delete.py @@ -20,15 +20,15 @@ from flask_babel import lazy_gettext as _ from superset.commands.base import BaseCommand -from superset.daos.database import DatabaseDAO -from superset.daos.exceptions import DAODeleteFailedError -from superset.daos.report import ReportScheduleDAO -from superset.databases.commands.exceptions import ( +from superset.commands.database.exceptions import ( DatabaseDeleteDatasetsExistFailedError, DatabaseDeleteFailedError, DatabaseDeleteFailedReportsExistError, DatabaseNotFoundError, ) +from superset.daos.database import DatabaseDAO +from 
superset.daos.exceptions import DAODeleteFailedError +from superset.daos.report import ReportScheduleDAO from superset.models.core import Database logger = logging.getLogger(__name__) @@ -44,7 +44,7 @@ def run(self) -> None: assert self._model try: - DatabaseDAO.delete(self._model) + DatabaseDAO.delete([self._model]) except DAODeleteFailedError as ex: logger.exception(ex.exception) raise DatabaseDeleteFailedError() from ex diff --git a/superset/databases/commands/exceptions.py b/superset/commands/database/exceptions.py similarity index 100% rename from superset/databases/commands/exceptions.py rename to superset/commands/database/exceptions.py diff --git a/superset/databases/commands/export.py b/superset/commands/database/export.py similarity index 98% rename from superset/databases/commands/export.py rename to superset/commands/database/export.py index 71dc55a0268fb..82c22ea801948 100644 --- a/superset/databases/commands/export.py +++ b/superset/commands/database/export.py @@ -23,7 +23,7 @@ import yaml -from superset.databases.commands.exceptions import DatabaseNotFoundError +from superset.commands.database.exceptions import DatabaseNotFoundError from superset.daos.database import DatabaseDAO from superset.commands.export.models import ExportModelsCommand from superset.models.core import Database diff --git a/superset/databases/commands/importers/__init__.py b/superset/commands/database/importers/__init__.py similarity index 100% rename from superset/databases/commands/importers/__init__.py rename to superset/commands/database/importers/__init__.py diff --git a/superset/databases/commands/importers/dispatcher.py b/superset/commands/database/importers/dispatcher.py similarity index 97% rename from superset/databases/commands/importers/dispatcher.py rename to superset/commands/database/importers/dispatcher.py index 70031b09e4fe6..bdf487a75893f 100644 --- a/superset/databases/commands/importers/dispatcher.py +++ b/superset/commands/database/importers/dispatcher.py @@ 
-21,9 +21,9 @@ from marshmallow.exceptions import ValidationError from superset.commands.base import BaseCommand +from superset.commands.database.importers import v1 from superset.commands.exceptions import CommandInvalidError from superset.commands.importers.exceptions import IncorrectVersionError -from superset.databases.commands.importers import v1 logger = logging.getLogger(__name__) diff --git a/superset/databases/commands/importers/v1/__init__.py b/superset/commands/database/importers/v1/__init__.py similarity index 91% rename from superset/databases/commands/importers/v1/__init__.py rename to superset/commands/database/importers/v1/__init__.py index 585c2d54ca160..73b1bca5311fc 100644 --- a/superset/databases/commands/importers/v1/__init__.py +++ b/superset/commands/database/importers/v1/__init__.py @@ -20,12 +20,12 @@ from marshmallow import Schema from sqlalchemy.orm import Session +from superset.commands.database.exceptions import DatabaseImportError +from superset.commands.database.importers.v1.utils import import_database +from superset.commands.dataset.importers.v1.utils import import_dataset from superset.commands.importers.v1 import ImportModelsCommand from superset.daos.database import DatabaseDAO -from superset.databases.commands.exceptions import DatabaseImportError -from superset.databases.commands.importers.v1.utils import import_database from superset.databases.schemas import ImportV1DatabaseSchema -from superset.datasets.commands.importers.v1.utils import import_dataset from superset.datasets.schemas import ImportV1DatasetSchema diff --git a/superset/databases/commands/importers/v1/utils.py b/superset/commands/database/importers/v1/utils.py similarity index 100% rename from superset/databases/commands/importers/v1/utils.py rename to superset/commands/database/importers/v1/utils.py diff --git a/superset/databases/ssh_tunnel/commands/__init__.py b/superset/commands/database/ssh_tunnel/__init__.py similarity index 100% rename from 
superset/databases/ssh_tunnel/commands/__init__.py rename to superset/commands/database/ssh_tunnel/__init__.py diff --git a/superset/databases/ssh_tunnel/commands/create.py b/superset/commands/database/ssh_tunnel/create.py similarity index 98% rename from superset/databases/ssh_tunnel/commands/create.py rename to superset/commands/database/ssh_tunnel/create.py index 36f33e46f9fab..07209f010ba1d 100644 --- a/superset/databases/ssh_tunnel/commands/create.py +++ b/superset/commands/database/ssh_tunnel/create.py @@ -21,13 +21,13 @@ from marshmallow import ValidationError from superset.commands.base import BaseCommand -from superset.daos.database import SSHTunnelDAO -from superset.daos.exceptions import DAOCreateFailedError -from superset.databases.ssh_tunnel.commands.exceptions import ( +from superset.commands.database.ssh_tunnel.exceptions import ( SSHTunnelCreateFailedError, SSHTunnelInvalidError, SSHTunnelRequiredFieldValidationError, ) +from superset.daos.database import SSHTunnelDAO +from superset.daos.exceptions import DAOCreateFailedError from superset.extensions import db, event_logger logger = logging.getLogger(__name__) diff --git a/superset/databases/ssh_tunnel/commands/delete.py b/superset/commands/database/ssh_tunnel/delete.py similarity index 94% rename from superset/databases/ssh_tunnel/commands/delete.py rename to superset/commands/database/ssh_tunnel/delete.py index 04d6e68338901..b8919e6d7bae6 100644 --- a/superset/databases/ssh_tunnel/commands/delete.py +++ b/superset/commands/database/ssh_tunnel/delete.py @@ -19,13 +19,13 @@ from superset import is_feature_enabled from superset.commands.base import BaseCommand -from superset.daos.database import SSHTunnelDAO -from superset.daos.exceptions import DAODeleteFailedError -from superset.databases.ssh_tunnel.commands.exceptions import ( +from superset.commands.database.ssh_tunnel.exceptions import ( SSHTunnelDeleteFailedError, SSHTunnelingNotEnabledError, SSHTunnelNotFoundError, ) +from 
superset.daos.database import SSHTunnelDAO
+from superset.daos.exceptions import DAODeleteFailedError
 from superset.databases.ssh_tunnel.models import SSHTunnel
 
 logger = logging.getLogger(__name__)
@@ -43,7 +43,7 @@ def run(self) -> None:
         assert self._model
 
         try:
-            SSHTunnelDAO.delete(self._model)
+            SSHTunnelDAO.delete([self._model])
         except DAODeleteFailedError as ex:
             raise SSHTunnelDeleteFailedError() from ex
diff --git a/superset/databases/ssh_tunnel/commands/exceptions.py b/superset/commands/database/ssh_tunnel/exceptions.py
similarity index 100%
rename from superset/databases/ssh_tunnel/commands/exceptions.py
rename to superset/commands/database/ssh_tunnel/exceptions.py
diff --git a/superset/databases/ssh_tunnel/commands/update.py b/superset/commands/database/ssh_tunnel/update.py
similarity index 97%
rename from superset/databases/ssh_tunnel/commands/update.py
rename to superset/commands/database/ssh_tunnel/update.py
index 4e4edcb664b8a..ae7ee78afe799 100644
--- a/superset/databases/ssh_tunnel/commands/update.py
+++ b/superset/commands/database/ssh_tunnel/update.py
@@ -20,14 +20,14 @@
 from flask_appbuilder.models.sqla import Model
 
 from superset.commands.base import BaseCommand
-from superset.daos.database import SSHTunnelDAO
-from superset.daos.exceptions import DAOUpdateFailedError
-from superset.databases.ssh_tunnel.commands.exceptions import (
+from superset.commands.database.ssh_tunnel.exceptions import (
     SSHTunnelInvalidError,
     SSHTunnelNotFoundError,
     SSHTunnelRequiredFieldValidationError,
     SSHTunnelUpdateFailedError,
 )
+from superset.daos.database import SSHTunnelDAO
+from superset.daos.exceptions import DAOUpdateFailedError
 from superset.databases.ssh_tunnel.models import SSHTunnel
 
 logger = logging.getLogger(__name__)
diff --git a/superset/databases/commands/tables.py b/superset/commands/database/tables.py
similarity index 98%
rename from superset/databases/commands/tables.py
rename to superset/commands/database/tables.py
index 6232470ece569..fa98bcbc7ec5a 100644
--- a/superset/databases/commands/tables.py
+++ b/superset/commands/database/tables.py
@@ -20,12 +20,12 @@
 from sqlalchemy.orm import lazyload, load_only
 
 from superset.commands.base import BaseCommand
-from superset.connectors.sqla.models import SqlaTable
-from superset.daos.database import DatabaseDAO
-from superset.databases.commands.exceptions import (
+from superset.commands.database.exceptions import (
     DatabaseNotFoundError,
     DatabaseTablesUnexpectedError,
 )
+from superset.connectors.sqla.models import SqlaTable
+from superset.daos.database import DatabaseDAO
 from superset.exceptions import SupersetException
 from superset.extensions import db, security_manager
 from superset.models.core import Database
diff --git a/superset/databases/commands/test_connection.py b/superset/commands/database/test_connection.py
similarity index 98%
rename from superset/databases/commands/test_connection.py
rename to superset/commands/database/test_connection.py
index 49c5340dd25f9..0ffdf3ddd957c 100644
--- a/superset/databases/commands/test_connection.py
+++ b/superset/commands/database/test_connection.py
@@ -27,15 +27,13 @@
 from superset import is_feature_enabled
 from superset.commands.base import BaseCommand
-from superset.daos.database import DatabaseDAO, SSHTunnelDAO
-from superset.databases.commands.exceptions import (
+from superset.commands.database.exceptions import (
     DatabaseSecurityUnsafeError,
     DatabaseTestConnectionDriverError,
     DatabaseTestConnectionUnexpectedError,
 )
-from superset.databases.ssh_tunnel.commands.exceptions import (
-    SSHTunnelingNotEnabledError,
-)
+from superset.commands.database.ssh_tunnel.exceptions import SSHTunnelingNotEnabledError
+from superset.daos.database import DatabaseDAO, SSHTunnelDAO
 from superset.databases.ssh_tunnel.models import SSHTunnel
 from superset.databases.utils import make_url_safe
 from superset.errors import ErrorLevel, SupersetErrorType
diff --git a/superset/databases/commands/update.py b/superset/commands/database/update.py
similarity index 96%
rename from superset/databases/commands/update.py
rename to superset/commands/database/update.py
index d8d86c6d2d62c..039d731d72d04 100644
--- a/superset/databases/commands/update.py
+++ b/superset/commands/database/update.py
@@ -22,23 +22,23 @@
 from superset import is_feature_enabled
 from superset.commands.base import BaseCommand
-from superset.daos.database import DatabaseDAO
-from superset.daos.exceptions import DAOCreateFailedError, DAOUpdateFailedError
-from superset.databases.commands.exceptions import (
+from superset.commands.database.exceptions import (
     DatabaseConnectionFailedError,
     DatabaseExistsValidationError,
     DatabaseInvalidError,
     DatabaseNotFoundError,
     DatabaseUpdateFailedError,
 )
-from superset.databases.ssh_tunnel.commands.create import CreateSSHTunnelCommand
-from superset.databases.ssh_tunnel.commands.exceptions import (
+from superset.commands.database.ssh_tunnel.create import CreateSSHTunnelCommand
+from superset.commands.database.ssh_tunnel.exceptions import (
     SSHTunnelCreateFailedError,
     SSHTunnelingNotEnabledError,
     SSHTunnelInvalidError,
     SSHTunnelUpdateFailedError,
 )
-from superset.databases.ssh_tunnel.commands.update import UpdateSSHTunnelCommand
+from superset.commands.database.ssh_tunnel.update import UpdateSSHTunnelCommand
+from superset.daos.database import DatabaseDAO
+from superset.daos.exceptions import DAOCreateFailedError, DAOUpdateFailedError
 from superset.extensions import db, security_manager
 from superset.models.core import Database
 from superset.utils.core import DatasourceType
diff --git a/superset/databases/commands/validate.py b/superset/commands/database/validate.py
similarity index 98%
rename from superset/databases/commands/validate.py
rename to superset/commands/database/validate.py
index 6ea412b490969..83bbc4e90a6f7 100644
--- a/superset/databases/commands/validate.py
+++ b/superset/commands/database/validate.py
@@ -21,13 +21,13 @@
 from flask_babel import gettext as __
 
 from superset.commands.base import BaseCommand
-from superset.daos.database import DatabaseDAO
-from superset.databases.commands.exceptions import (
+from superset.commands.database.exceptions import (
     DatabaseOfflineError,
     DatabaseTestConnectionFailedError,
     InvalidEngineError,
     InvalidParametersError,
 )
+from superset.daos.database import DatabaseDAO
 from superset.databases.utils import make_url_safe
 from superset.db_engine_specs import get_engine_spec
 from superset.errors import ErrorLevel, SupersetError, SupersetErrorType
diff --git a/superset/databases/commands/validate_sql.py b/superset/commands/database/validate_sql.py
similarity index 98%
rename from superset/databases/commands/validate_sql.py
rename to superset/commands/database/validate_sql.py
index 6fc0c3a39842d..9a00526bfaff3 100644
--- a/superset/databases/commands/validate_sql.py
+++ b/superset/commands/database/validate_sql.py
@@ -22,8 +22,7 @@
 from flask_babel import gettext as __
 
 from superset.commands.base import BaseCommand
-from superset.daos.database import DatabaseDAO
-from superset.databases.commands.exceptions import (
+from superset.commands.database.exceptions import (
     DatabaseNotFoundError,
     NoValidatorConfigFoundError,
     NoValidatorFoundError,
@@ -31,6 +30,7 @@
     ValidatorSQLError,
     ValidatorSQLUnexpectedError,
 )
+from superset.daos.database import DatabaseDAO
 from superset.errors import ErrorLevel, SupersetError, SupersetErrorType
 from superset.models.core import Database
 from superset.sql_validators import get_validator_by_name
diff --git a/superset/datasets/columns/commands/__init__.py b/superset/commands/dataset/__init__.py
similarity index 100%
rename from superset/datasets/columns/commands/__init__.py
rename to superset/commands/dataset/__init__.py
diff --git a/superset/datasets/commands/__init__.py b/superset/commands/dataset/columns/__init__.py
similarity index 100%
rename from superset/datasets/commands/__init__.py
rename to superset/commands/dataset/columns/__init__.py
diff --git a/superset/datasets/columns/commands/delete.py b/superset/commands/dataset/columns/delete.py
similarity index 95%
rename from superset/datasets/columns/commands/delete.py
rename to superset/commands/dataset/columns/delete.py
index 23b0d93b6a238..4739c2520f880 100644
--- a/superset/datasets/columns/commands/delete.py
+++ b/superset/commands/dataset/columns/delete.py
@@ -19,14 +19,14 @@
 from superset import security_manager
 from superset.commands.base import BaseCommand
-from superset.connectors.sqla.models import TableColumn
-from superset.daos.dataset import DatasetColumnDAO, DatasetDAO
-from superset.daos.exceptions import DAODeleteFailedError
-from superset.datasets.columns.commands.exceptions import (
+from superset.commands.dataset.columns.exceptions import (
     DatasetColumnDeleteFailedError,
     DatasetColumnForbiddenError,
     DatasetColumnNotFoundError,
 )
+from superset.connectors.sqla.models import TableColumn
+from superset.daos.dataset import DatasetColumnDAO, DatasetDAO
+from superset.daos.exceptions import DAODeleteFailedError
 from superset.exceptions import SupersetSecurityException
 
 logger = logging.getLogger(__name__)
@@ -43,7 +43,7 @@ def run(self) -> None:
         assert self._model
 
         try:
-            DatasetColumnDAO.delete(self._model)
+            DatasetColumnDAO.delete([self._model])
         except DAODeleteFailedError as ex:
             logger.exception(ex.exception)
             raise DatasetColumnDeleteFailedError() from ex
diff --git a/superset/datasets/columns/commands/exceptions.py b/superset/commands/dataset/columns/exceptions.py
similarity index 100%
rename from superset/datasets/columns/commands/exceptions.py
rename to superset/commands/dataset/columns/exceptions.py
diff --git a/superset/datasets/commands/create.py b/superset/commands/dataset/create.py
similarity index 98%
rename from superset/datasets/commands/create.py
rename to superset/commands/dataset/create.py
index 8f486b0c9ab3d..1c354e835f8a2 100644
--- a/superset/datasets/commands/create.py
+++ b/superset/commands/dataset/create.py
@@ -22,15 +22,15 @@
 from sqlalchemy.exc import SQLAlchemyError
 
 from superset.commands.base import BaseCommand, CreateMixin
-from superset.daos.dataset import DatasetDAO
-from superset.daos.exceptions import DAOCreateFailedError
-from superset.datasets.commands.exceptions import (
+from superset.commands.dataset.exceptions import (
     DatabaseNotFoundValidationError,
     DatasetCreateFailedError,
     DatasetExistsValidationError,
     DatasetInvalidError,
     TableNotFoundValidationError,
 )
+from superset.daos.dataset import DatasetDAO
+from superset.daos.exceptions import DAOCreateFailedError
 from superset.extensions import db
 
 logger = logging.getLogger(__name__)
diff --git a/superset/datasets/commands/delete.py b/superset/commands/dataset/delete.py
similarity index 97%
rename from superset/datasets/commands/delete.py
rename to superset/commands/dataset/delete.py
index 478267d01dd4d..4b7e61ab4c113 100644
--- a/superset/datasets/commands/delete.py
+++ b/superset/commands/dataset/delete.py
@@ -19,14 +19,14 @@
 from superset import security_manager
 from superset.commands.base import BaseCommand
-from superset.connectors.sqla.models import SqlaTable
-from superset.daos.dataset import DatasetDAO
-from superset.daos.exceptions import DAODeleteFailedError
-from superset.datasets.commands.exceptions import (
+from superset.commands.dataset.exceptions import (
     DatasetDeleteFailedError,
     DatasetForbiddenError,
     DatasetNotFoundError,
 )
+from superset.connectors.sqla.models import SqlaTable
+from superset.daos.dataset import DatasetDAO
+from superset.daos.exceptions import DAODeleteFailedError
 from superset.exceptions import SupersetSecurityException
 
 logger = logging.getLogger(__name__)
diff --git a/superset/datasets/commands/duplicate.py b/superset/commands/dataset/duplicate.py
similarity index 99%
rename from superset/datasets/commands/duplicate.py
rename to superset/commands/dataset/duplicate.py
index 12ae96e0aee47..0ae47c35bca4d 100644
--- a/superset/datasets/commands/duplicate.py
+++ b/superset/commands/dataset/duplicate.py
@@ -23,16 +23,16 @@
 from sqlalchemy.exc import SQLAlchemyError
 
 from superset.commands.base import BaseCommand, CreateMixin
-from superset.commands.exceptions import DatasourceTypeInvalidError
-from superset.connectors.sqla.models import SqlaTable, SqlMetric, TableColumn
-from superset.daos.dataset import DatasetDAO
-from superset.daos.exceptions import DAOCreateFailedError
-from superset.datasets.commands.exceptions import (
+from superset.commands.dataset.exceptions import (
     DatasetDuplicateFailedError,
     DatasetExistsValidationError,
     DatasetInvalidError,
     DatasetNotFoundError,
 )
+from superset.commands.exceptions import DatasourceTypeInvalidError
+from superset.connectors.sqla.models import SqlaTable, SqlMetric, TableColumn
+from superset.daos.dataset import DatasetDAO
+from superset.daos.exceptions import DAOCreateFailedError
 from superset.errors import ErrorLevel, SupersetError, SupersetErrorType
 from superset.exceptions import SupersetErrorException
 from superset.extensions import db
diff --git a/superset/datasets/commands/exceptions.py b/superset/commands/dataset/exceptions.py
similarity index 100%
rename from superset/datasets/commands/exceptions.py
rename to superset/commands/dataset/exceptions.py
diff --git a/superset/datasets/commands/export.py b/superset/commands/dataset/export.py
similarity index 98%
rename from superset/datasets/commands/export.py
rename to superset/commands/dataset/export.py
index 392265232204e..afecdd2fea261 100644
--- a/superset/datasets/commands/export.py
+++ b/superset/commands/dataset/export.py
@@ -25,7 +25,7 @@
 from superset.commands.export.models import ExportModelsCommand
 from superset.connectors.sqla.models import SqlaTable
 from superset.daos.database import DatabaseDAO
-from superset.datasets.commands.exceptions import DatasetNotFoundError
+from superset.commands.dataset.exceptions import DatasetNotFoundError
 from superset.daos.dataset import DatasetDAO
 from superset.utils.dict_import_export import EXPORT_VERSION
 from superset.utils.file import get_filename
diff --git a/superset/datasets/commands/importers/__init__.py b/superset/commands/dataset/importers/__init__.py
similarity index 100%
rename from superset/datasets/commands/importers/__init__.py
rename to superset/commands/dataset/importers/__init__.py
diff --git a/superset/datasets/commands/importers/dispatcher.py b/superset/commands/dataset/importers/dispatcher.py
similarity index 97%
rename from superset/datasets/commands/importers/dispatcher.py
rename to superset/commands/dataset/importers/dispatcher.py
index 6be8635da20a7..9138d4f971cb8 100644
--- a/superset/datasets/commands/importers/dispatcher.py
+++ b/superset/commands/dataset/importers/dispatcher.py
@@ -21,9 +21,9 @@
 from marshmallow.exceptions import ValidationError
 
 from superset.commands.base import BaseCommand
+from superset.commands.dataset.importers import v0, v1
 from superset.commands.exceptions import CommandInvalidError
 from superset.commands.importers.exceptions import IncorrectVersionError
-from superset.datasets.commands.importers import v0, v1
 
 logger = logging.getLogger(__name__)
diff --git a/superset/datasets/commands/importers/v0.py b/superset/commands/dataset/importers/v0.py
similarity index 90%
rename from superset/datasets/commands/importers/v0.py
rename to superset/commands/dataset/importers/v0.py
index a34d9be1acafb..d389a17651d44 100644
--- a/superset/datasets/commands/importers/v0.py
+++ b/superset/commands/dataset/importers/v0.py
@@ -25,11 +25,15 @@
 from superset import db
 from superset.commands.base import BaseCommand
+from superset.commands.database.exceptions import DatabaseNotFoundError
+from superset.commands.dataset.exceptions import DatasetInvalidError
 from superset.commands.importers.exceptions import IncorrectVersionError
-from superset.connectors.base.models import BaseColumn, BaseDatasource, BaseMetric
-from superset.connectors.sqla.models import SqlaTable, SqlMetric, TableColumn
-from superset.databases.commands.exceptions import DatabaseNotFoundError
-from superset.datasets.commands.exceptions import DatasetInvalidError
+from superset.connectors.sqla.models import (
+    BaseDatasource,
+    SqlaTable,
+    SqlMetric,
+    TableColumn,
+)
 from superset.models.core import Database
 from superset.utils.dict_import_export import DATABASES_KEY
@@ -102,14 +106,8 @@ def lookup_sqla_metric(session: Session, metric: SqlMetric) -> SqlMetric:
     )
 
 
-def import_metric(session: Session, metric: BaseMetric) -> BaseMetric:
-    if isinstance(metric, SqlMetric):
-        lookup_metric = lookup_sqla_metric
-    else:
-        raise Exception(  # pylint: disable=broad-exception-raised
-            f"Invalid metric type: {metric}"
-        )
-    return import_simple_obj(session, metric, lookup_metric)
+def import_metric(session: Session, metric: SqlMetric) -> SqlMetric:
+    return import_simple_obj(session, metric, lookup_sqla_metric)
 
 
 def lookup_sqla_column(session: Session, column: TableColumn) -> TableColumn:
@@ -123,14 +121,8 @@
     )
 
 
-def import_column(session: Session, column: BaseColumn) -> BaseColumn:
-    if isinstance(column, TableColumn):
-        lookup_column = lookup_sqla_column
-    else:
-        raise Exception(  # pylint: disable=broad-exception-raised
-            f"Invalid column type: {column}"
-        )
-    return import_simple_obj(session, column, lookup_column)
+def import_column(session: Session, column: TableColumn) -> TableColumn:
+    return import_simple_obj(session, column, lookup_sqla_column)
 
 
 def import_datasource(  # pylint: disable=too-many-arguments
diff --git a/superset/datasets/commands/importers/v1/__init__.py b/superset/commands/dataset/importers/v1/__init__.py
similarity index 92%
rename from superset/datasets/commands/importers/v1/__init__.py
rename to superset/commands/dataset/importers/v1/__init__.py
index f46c137b7e8f7..600a39bf48d5b 100644
--- a/superset/datasets/commands/importers/v1/__init__.py
+++ b/superset/commands/dataset/importers/v1/__init__.py
@@ -20,12 +20,12 @@
 from marshmallow import Schema
 from sqlalchemy.orm import Session
 
+from superset.commands.database.importers.v1.utils import import_database
+from superset.commands.dataset.exceptions import DatasetImportError
+from superset.commands.dataset.importers.v1.utils import import_dataset
 from superset.commands.importers.v1 import ImportModelsCommand
 from superset.daos.dataset import DatasetDAO
-from superset.databases.commands.importers.v1.utils import import_database
 from superset.databases.schemas import ImportV1DatabaseSchema
-from superset.datasets.commands.exceptions import DatasetImportError
-from superset.datasets.commands.importers.v1.utils import import_dataset
 from superset.datasets.schemas import ImportV1DatasetSchema
diff --git a/superset/datasets/commands/importers/v1/utils.py b/superset/commands/dataset/importers/v1/utils.py
similarity index 99%
rename from superset/datasets/commands/importers/v1/utils.py
rename to superset/commands/dataset/importers/v1/utils.py
index c45f7a5655be0..c145cc50f91f9 100644
--- a/superset/datasets/commands/importers/v1/utils.py
+++ b/superset/commands/dataset/importers/v1/utils.py
@@ -29,9 +29,9 @@
 from sqlalchemy.sql.visitors import VisitableType
 
 from superset import security_manager
+from superset.commands.dataset.exceptions import DatasetForbiddenDataURI
 from superset.commands.exceptions import ImportFailedError
 from superset.connectors.sqla.models import SqlaTable
-from superset.datasets.commands.exceptions import DatasetForbiddenDataURI
 from superset.models.core import Database
 
 logger = logging.getLogger(__name__)
diff --git a/superset/datasets/metrics/commands/__init__.py b/superset/commands/dataset/metrics/__init__.py
similarity index 100%
rename from superset/datasets/metrics/commands/__init__.py
rename to superset/commands/dataset/metrics/__init__.py
diff --git a/superset/datasets/metrics/commands/delete.py b/superset/commands/dataset/metrics/delete.py
similarity index 95%
rename from superset/datasets/metrics/commands/delete.py
rename to superset/commands/dataset/metrics/delete.py
index 8f27e98a3dbd1..b48668852cafd 100644
--- a/superset/datasets/metrics/commands/delete.py
+++ b/superset/commands/dataset/metrics/delete.py
@@ -19,14 +19,14 @@
 from superset import security_manager
 from superset.commands.base import BaseCommand
-from superset.connectors.sqla.models import SqlMetric
-from superset.daos.dataset import DatasetDAO, DatasetMetricDAO
-from superset.daos.exceptions import DAODeleteFailedError
-from superset.datasets.metrics.commands.exceptions import (
+from superset.commands.dataset.metrics.exceptions import (
     DatasetMetricDeleteFailedError,
     DatasetMetricForbiddenError,
     DatasetMetricNotFoundError,
 )
+from superset.connectors.sqla.models import SqlMetric
+from superset.daos.dataset import DatasetDAO, DatasetMetricDAO
+from superset.daos.exceptions import DAODeleteFailedError
 from superset.exceptions import SupersetSecurityException
 
 logger = logging.getLogger(__name__)
@@ -43,7 +43,7 @@ def run(self) -> None:
         assert self._model
 
         try:
-            DatasetMetricDAO.delete(self._model)
+            DatasetMetricDAO.delete([self._model])
         except DAODeleteFailedError as ex:
             logger.exception(ex.exception)
             raise DatasetMetricDeleteFailedError() from ex
diff --git a/superset/datasets/metrics/commands/exceptions.py b/superset/commands/dataset/metrics/exceptions.py
similarity index 100%
rename from superset/datasets/metrics/commands/exceptions.py
rename to superset/commands/dataset/metrics/exceptions.py
diff --git a/superset/datasets/commands/refresh.py b/superset/commands/dataset/refresh.py
similarity index 97%
rename from superset/datasets/commands/refresh.py
rename to superset/commands/dataset/refresh.py
index a25609636db0e..5976956d7cedf 100644
--- a/superset/datasets/commands/refresh.py
+++ b/superset/commands/dataset/refresh.py
@@ -21,13 +21,13 @@
 from superset import security_manager
 from superset.commands.base import BaseCommand
-from superset.connectors.sqla.models import SqlaTable
-from superset.daos.dataset import DatasetDAO
-from superset.datasets.commands.exceptions import (
+from superset.commands.dataset.exceptions import (
     DatasetForbiddenError,
     DatasetNotFoundError,
     DatasetRefreshFailedError,
 )
+from superset.connectors.sqla.models import SqlaTable
+from superset.daos.dataset import DatasetDAO
 from superset.exceptions import SupersetSecurityException
 
 logger = logging.getLogger(__name__)
diff --git a/superset/datasets/commands/update.py b/superset/commands/dataset/update.py
similarity index 99%
rename from superset/datasets/commands/update.py
rename to superset/commands/dataset/update.py
index 8dcc4dfd5f606..8a72c24fd5f28 100644
--- a/superset/datasets/commands/update.py
+++ b/superset/commands/dataset/update.py
@@ -23,10 +23,7 @@
 from superset import security_manager
 from superset.commands.base import BaseCommand, UpdateMixin
-from superset.connectors.sqla.models import SqlaTable
-from superset.daos.dataset import DatasetDAO
-from superset.daos.exceptions import DAOUpdateFailedError
-from superset.datasets.commands.exceptions import (
+from superset.commands.dataset.exceptions import (
     DatabaseChangeValidationError,
     DatasetColumnNotFoundValidationError,
     DatasetColumnsDuplicateValidationError,
@@ -40,6 +37,9 @@
     DatasetNotFoundError,
     DatasetUpdateFailedError,
 )
+from superset.connectors.sqla.models import SqlaTable
+from superset.daos.dataset import DatasetDAO
+from superset.daos.exceptions import DAOUpdateFailedError
 from superset.exceptions import SupersetSecurityException
 
 logger = logging.getLogger(__name__)
diff --git a/superset/datasets/commands/warm_up_cache.py b/superset/commands/dataset/warm_up_cache.py
similarity index 89%
rename from superset/datasets/commands/warm_up_cache.py
rename to superset/commands/dataset/warm_up_cache.py
index 64becc9cd63fd..97b00c4772ad5 100644
--- a/superset/datasets/commands/warm_up_cache.py
+++ b/superset/commands/dataset/warm_up_cache.py
@@ -18,10 +18,10 @@
 from typing import Any, Optional
 
-from superset.charts.commands.warm_up_cache import ChartWarmUpCacheCommand
 from superset.commands.base import BaseCommand
+from superset.commands.chart.warm_up_cache import ChartWarmUpCacheCommand
+from superset.commands.dataset.exceptions import WarmUpCacheTableNotFoundError
 from superset.connectors.sqla.models import SqlaTable
-from superset.datasets.commands.exceptions import WarmUpCacheTableNotFoundError
 from superset.extensions import db
 from superset.models.core import Database
 from superset.models.slice import Slice
@@ -45,7 +45,9 @@ def run(self) -> list[dict[str, Any]]:
         self.validate()
         return [
             ChartWarmUpCacheCommand(
-                chart, self._dashboard_id, self._extra_filters
+                chart,
+                self._dashboard_id,
+                self._extra_filters,
             ).run()
             for chart in self._charts
         ]
diff --git a/superset/embedded_dashboard/commands/__init__.py b/superset/commands/explore/__init__.py
similarity index 100%
rename from superset/embedded_dashboard/commands/__init__.py
rename to superset/commands/explore/__init__.py
diff --git a/superset/explore/commands/__init__.py b/superset/commands/explore/form_data/__init__.py
similarity index 100%
rename from superset/explore/commands/__init__.py
rename to superset/commands/explore/form_data/__init__.py
diff --git a/superset/explore/form_data/commands/create.py b/superset/commands/explore/form_data/create.py
similarity index 91%
rename from superset/explore/form_data/commands/create.py
rename to superset/commands/explore/form_data/create.py
index df0250f2fffb6..e85f840133cf9 100644
--- a/superset/explore/form_data/commands/create.py
+++ b/superset/commands/explore/form_data/create.py
@@ -20,12 +20,12 @@
 from sqlalchemy.exc import SQLAlchemyError
 
 from superset.commands.base import BaseCommand
-from superset.explore.form_data.commands.parameters import CommandParameters
-from superset.explore.form_data.commands.state import TemporaryExploreState
-from superset.explore.form_data.commands.utils import check_access
+from superset.commands.explore.form_data.parameters import CommandParameters
+from superset.commands.explore.form_data.state import TemporaryExploreState
+from superset.commands.explore.form_data.utils import check_access
+from superset.commands.temporary_cache.exceptions import TemporaryCacheCreateFailedError
 from superset.extensions import cache_manager
 from superset.key_value.utils import random_key
-from superset.temporary_cache.commands.exceptions import TemporaryCacheCreateFailedError
 from superset.temporary_cache.utils import cache_key
 from superset.utils.core import DatasourceType, get_user_id
 from superset.utils.schema import validate_json
diff --git a/superset/explore/form_data/commands/delete.py b/superset/commands/explore/form_data/delete.py
similarity index 91%
rename from superset/explore/form_data/commands/delete.py
rename to superset/commands/explore/form_data/delete.py
index bce13b719a7d4..d998f132d6c1e 100644
--- a/superset/explore/form_data/commands/delete.py
+++ b/superset/commands/explore/form_data/delete.py
@@ -22,14 +22,14 @@
 from sqlalchemy.exc import SQLAlchemyError
 
 from superset.commands.base import BaseCommand
-from superset.explore.form_data.commands.parameters import CommandParameters
-from superset.explore.form_data.commands.state import TemporaryExploreState
-from superset.explore.form_data.commands.utils import check_access
-from superset.extensions import cache_manager
-from superset.temporary_cache.commands.exceptions import (
+from superset.commands.explore.form_data.parameters import CommandParameters
+from superset.commands.explore.form_data.state import TemporaryExploreState
+from superset.commands.explore.form_data.utils import check_access
+from superset.commands.temporary_cache.exceptions import (
     TemporaryCacheAccessDeniedError,
     TemporaryCacheDeleteFailedError,
 )
+from superset.extensions import cache_manager
 from superset.temporary_cache.utils import cache_key
 from superset.utils.core import DatasourceType, get_user_id
diff --git a/superset/explore/form_data/commands/get.py b/superset/commands/explore/form_data/get.py
similarity index 89%
rename from superset/explore/form_data/commands/get.py
rename to superset/commands/explore/form_data/get.py
index 53fd6ea6a9359..0153888d4e36b 100644
--- a/superset/explore/form_data/commands/get.py
+++ b/superset/commands/explore/form_data/get.py
@@ -22,11 +22,11 @@
 from sqlalchemy.exc import SQLAlchemyError
 
 from superset.commands.base import BaseCommand
-from superset.explore.form_data.commands.parameters import CommandParameters
-from superset.explore.form_data.commands.state import TemporaryExploreState
-from superset.explore.form_data.commands.utils import check_access
+from superset.commands.explore.form_data.parameters import CommandParameters
+from superset.commands.explore.form_data.state import TemporaryExploreState
+from superset.commands.explore.form_data.utils import check_access
+from superset.commands.temporary_cache.exceptions import TemporaryCacheGetFailedError
 from superset.extensions import cache_manager
-from superset.temporary_cache.commands.exceptions import TemporaryCacheGetFailedError
 from superset.utils.core import DatasourceType
 
 logger = logging.getLogger(__name__)
diff --git a/superset/explore/form_data/commands/parameters.py b/superset/commands/explore/form_data/parameters.py
similarity index 100%
rename from superset/explore/form_data/commands/parameters.py
rename to superset/commands/explore/form_data/parameters.py
diff --git a/superset/explore/form_data/commands/state.py b/superset/commands/explore/form_data/state.py
similarity index 100%
rename from superset/explore/form_data/commands/state.py
rename to superset/commands/explore/form_data/state.py
diff --git a/superset/explore/form_data/commands/update.py b/superset/commands/explore/form_data/update.py
similarity index 93%
rename from superset/explore/form_data/commands/update.py
rename to superset/commands/explore/form_data/update.py
index ace57350c450f..fbb6ee07199cc 100644
--- a/superset/explore/form_data/commands/update.py
+++ b/superset/commands/explore/form_data/update.py
@@ -22,15 +22,15 @@
 from sqlalchemy.exc import SQLAlchemyError
 
 from superset.commands.base import BaseCommand
-from superset.explore.form_data.commands.parameters import CommandParameters
-from superset.explore.form_data.commands.state import TemporaryExploreState
-from superset.explore.form_data.commands.utils import check_access
-from superset.extensions import cache_manager
-from superset.key_value.utils import random_key
-from superset.temporary_cache.commands.exceptions import (
+from superset.commands.explore.form_data.parameters import CommandParameters
+from superset.commands.explore.form_data.state import TemporaryExploreState
+from superset.commands.explore.form_data.utils import check_access
+from superset.commands.temporary_cache.exceptions import (
     TemporaryCacheAccessDeniedError,
     TemporaryCacheUpdateFailedError,
 )
+from superset.extensions import cache_manager
+from superset.key_value.utils import random_key
 from superset.temporary_cache.utils import cache_key
 from superset.utils.core import DatasourceType, get_user_id
 from superset.utils.schema import validate_json
diff --git a/superset/explore/form_data/commands/utils.py b/superset/commands/explore/form_data/utils.py
similarity index 90%
rename from superset/explore/form_data/commands/utils.py
rename to superset/commands/explore/form_data/utils.py
index e4a843dc6284f..45b46fb8b3014 100644
--- a/superset/explore/form_data/commands/utils.py
+++ b/superset/commands/explore/form_data/utils.py
@@ -16,19 +16,19 @@
 # under the License.
 from typing import Optional
 
-from superset.charts.commands.exceptions import (
+from superset.commands.chart.exceptions import (
     ChartAccessDeniedError,
     ChartNotFoundError,
 )
-from superset.datasets.commands.exceptions import (
+from superset.commands.dataset.exceptions import (
     DatasetAccessDeniedError,
     DatasetNotFoundError,
 )
-from superset.explore.utils import check_access as explore_check_access
-from superset.temporary_cache.commands.exceptions import (
+from superset.commands.temporary_cache.exceptions import (
     TemporaryCacheAccessDeniedError,
     TemporaryCacheResourceNotFoundError,
 )
+from superset.explore.utils import check_access as explore_check_access
 from superset.utils.core import DatasourceType
diff --git a/superset/explore/commands/get.py b/superset/commands/explore/get.py
similarity index 94%
rename from superset/explore/commands/get.py
rename to superset/commands/explore/get.py
index d348b16251b97..bb8f5a85e9e8a 100644
--- a/superset/explore/commands/get.py
+++ b/superset/commands/explore/get.py
@@ -26,18 +26,17 @@
 from superset import db
 from superset.commands.base import BaseCommand
-from superset.connectors.base.models import BaseDatasource
-from superset.connectors.sqla.models import SqlaTable
+from superset.commands.explore.form_data.get import GetFormDataCommand
+from superset.commands.explore.form_data.parameters import (
+    CommandParameters as FormDataCommandParameters,
+)
+from superset.commands.explore.parameters import CommandParameters
+from superset.commands.explore.permalink.get import GetExplorePermalinkCommand
+from superset.connectors.sqla.models import BaseDatasource, SqlaTable
 from superset.daos.datasource import DatasourceDAO
 from superset.daos.exceptions import DatasourceNotFound
 from superset.exceptions import SupersetException
-from superset.explore.commands.parameters import CommandParameters
 from superset.explore.exceptions import WrongEndpointError
-from superset.explore.form_data.commands.get import GetFormDataCommand
-from superset.explore.form_data.commands.parameters import (
-    CommandParameters as FormDataCommandParameters,
-)
-from superset.explore.permalink.commands.get import GetExplorePermalinkCommand
 from superset.explore.permalink.exceptions import ExplorePermalinkGetFailedError
 from superset.utils import core as utils
 from superset.views.utils import (
diff --git a/superset/explore/commands/parameters.py b/superset/commands/explore/parameters.py
similarity index 100%
rename from superset/explore/commands/parameters.py
rename to superset/commands/explore/parameters.py
diff --git a/superset/explore/form_data/commands/__init__.py b/superset/commands/explore/permalink/__init__.py
similarity index 100%
rename from superset/explore/form_data/commands/__init__.py
rename to superset/commands/explore/permalink/__init__.py
diff --git a/superset/explore/permalink/commands/base.py b/superset/commands/explore/permalink/base.py
similarity index 100%
rename from superset/explore/permalink/commands/base.py
rename to superset/commands/explore/permalink/base.py
diff --git a/superset/explore/permalink/commands/create.py b/superset/commands/explore/permalink/create.py
similarity index 95%
rename from superset/explore/permalink/commands/create.py
rename to superset/commands/explore/permalink/create.py
index 97a8bcbf09ed4..befb1d5a47e28 100644
--- a/superset/explore/permalink/commands/create.py
+++ b/superset/commands/explore/permalink/create.py
@@ -19,10 +19,10 @@
 from sqlalchemy.exc import SQLAlchemyError
 
-from superset.explore.permalink.commands.base import BaseExplorePermalinkCommand
+from superset.commands.explore.permalink.base import BaseExplorePermalinkCommand
+from superset.commands.key_value.create import CreateKeyValueCommand
 from superset.explore.permalink.exceptions import ExplorePermalinkCreateFailedError
 from superset.explore.utils import check_access as check_chart_access
-from superset.key_value.commands.create import CreateKeyValueCommand
 from superset.key_value.exceptions import KeyValueCodecEncodeException
 from superset.key_value.utils import encode_permalink_key
 from superset.utils.core import DatasourceType
diff --git a/superset/explore/permalink/commands/get.py b/superset/commands/explore/permalink/get.py
similarity index 93%
rename from superset/explore/permalink/commands/get.py
rename to superset/commands/explore/permalink/get.py
index 1aa093b380581..4c01db1ccab49 100644
--- a/superset/explore/permalink/commands/get.py
+++ b/superset/commands/explore/permalink/get.py
@@ -19,12 +19,12 @@
 from sqlalchemy.exc import SQLAlchemyError
 
-from superset.datasets.commands.exceptions import DatasetNotFoundError
-from superset.explore.permalink.commands.base import BaseExplorePermalinkCommand
+from superset.commands.dataset.exceptions import DatasetNotFoundError
+from superset.commands.explore.permalink.base import BaseExplorePermalinkCommand
+from superset.commands.key_value.get import GetKeyValueCommand
 from superset.explore.permalink.exceptions import ExplorePermalinkGetFailedError
 from superset.explore.permalink.types import ExplorePermalinkValue
 from superset.explore.utils import check_access as check_chart_access
-from superset.key_value.commands.get import GetKeyValueCommand
 from superset.key_value.exceptions import (
     KeyValueCodecDecodeException,
     KeyValueGetFailedError,
diff --git a/superset/commands/export/assets.py b/superset/commands/export/assets.py
index 1bd2cf6d61ffa..61d805acafc19 100644
--- a/superset/commands/export/assets.py
+++ b/superset/commands/export/assets.py
@@ -20,12 +20,12 @@
 import yaml
 
-from superset.charts.commands.export import ExportChartsCommand
 from superset.commands.base import BaseCommand
-from superset.dashboards.commands.export import ExportDashboardsCommand
-from superset.databases.commands.export import ExportDatabasesCommand
-from superset.datasets.commands.export import ExportDatasetsCommand
-from superset.queries.saved_queries.commands.export import ExportSavedQueriesCommand
+from superset.commands.chart.export import ExportChartsCommand
+from superset.commands.dashboard.export import ExportDashboardsCommand
+from superset.commands.database.export import ExportDatabasesCommand
+from superset.commands.dataset.export import ExportDatasetsCommand
+from superset.commands.query.export import ExportSavedQueriesCommand
 from superset.utils.dict_import_export import EXPORT_VERSION
 
 METADATA_FILE_NAME = "metadata.yaml"
diff --git a/superset/commands/importers/v1/assets.py b/superset/commands/importers/v1/assets.py
index 4c8971315c270..b6bc29e0fa4c9 100644
--- a/superset/commands/importers/v1/assets.py
+++ b/superset/commands/importers/v1/assets.py
@@ -22,29 +22,27 @@
 from sqlalchemy.sql import delete, insert
 
 from superset import db
-from superset.charts.commands.importers.v1.utils import import_chart
 from superset.charts.schemas import ImportV1ChartSchema
 from superset.commands.base import BaseCommand
+from superset.commands.chart.importers.v1.utils import import_chart
+from superset.commands.dashboard.importers.v1.utils import (
+    find_chart_uuids,
+    import_dashboard,
+    update_id_refs,
+)
+from superset.commands.database.importers.v1.utils import import_database
+from superset.commands.dataset.importers.v1.utils import import_dataset
 from superset.commands.exceptions import CommandInvalidError, ImportFailedError
 from superset.commands.importers.v1.utils import (
     load_configs,
     load_metadata,
     validate_metadata_type,
 )
-from superset.dashboards.commands.importers.v1.utils import (
-    find_chart_uuids,
-    import_dashboard,
-    update_id_refs,
-)
+from superset.commands.query.importers.v1.utils import import_saved_query
 from superset.dashboards.schemas import ImportV1DashboardSchema
-from superset.databases.commands.importers.v1.utils import import_database
 from superset.databases.schemas import ImportV1DatabaseSchema
-from superset.datasets.commands.importers.v1.utils import import_dataset
 from superset.datasets.schemas import ImportV1DatasetSchema
 from superset.models.dashboard import dashboard_slices
-from superset.queries.saved_queries.commands.importers.v1.utils import (
-    import_saved_query,
-)
 from superset.queries.saved_queries.schemas import ImportV1SavedQuerySchema
diff --git a/superset/commands/importers/v1/examples.py b/superset/commands/importers/v1/examples.py
index 737be25f8a742..94194921ac762 100644
--- a/superset/commands/importers/v1/examples.py
+++ b/superset/commands/importers/v1/examples.py
@@ -22,24 +22,24 @@
 from sqlalchemy.sql import select
 
 from superset import db
-from superset.charts.commands.importers.v1 import ImportChartsCommand
-from superset.charts.commands.importers.v1.utils import import_chart
 from superset.charts.schemas import ImportV1ChartSchema
-from superset.commands.exceptions import CommandException
-from superset.commands.importers.v1 import ImportModelsCommand
-from superset.daos.base import BaseDAO
-from superset.dashboards.commands.importers.v1 import ImportDashboardsCommand
-from superset.dashboards.commands.importers.v1.utils import (
+from superset.commands.chart.importers.v1 import ImportChartsCommand
+from superset.commands.chart.importers.v1.utils import import_chart
+from superset.commands.dashboard.importers.v1 import ImportDashboardsCommand
+from superset.commands.dashboard.importers.v1.utils import (
     find_chart_uuids,
     import_dashboard,
     update_id_refs,
 )
+from superset.commands.database.importers.v1 import ImportDatabasesCommand
+from superset.commands.database.importers.v1.utils import import_database
+from superset.commands.dataset.importers.v1 import ImportDatasetsCommand
+from superset.commands.dataset.importers.v1.utils import import_dataset
+from superset.commands.exceptions import CommandException
+from superset.commands.importers.v1 import ImportModelsCommand
+from superset.daos.base import BaseDAO
 from superset.dashboards.schemas import ImportV1DashboardSchema
-from superset.databases.commands.importers.v1 import ImportDatabasesCommand
-from
superset.databases.commands.importers.v1.utils import import_database from superset.databases.schemas import ImportV1DatabaseSchema -from superset.datasets.commands.importers.v1 import ImportDatasetsCommand -from superset.datasets.commands.importers.v1.utils import import_dataset from superset.datasets.schemas import ImportV1DatasetSchema from superset.models.dashboard import dashboard_slices from superset.utils.core import get_example_default_schema diff --git a/superset/explore/permalink/commands/__init__.py b/superset/commands/key_value/__init__.py similarity index 100% rename from superset/explore/permalink/commands/__init__.py rename to superset/commands/key_value/__init__.py diff --git a/superset/key_value/commands/create.py b/superset/commands/key_value/create.py similarity index 100% rename from superset/key_value/commands/create.py rename to superset/commands/key_value/create.py diff --git a/superset/key_value/commands/delete.py b/superset/commands/key_value/delete.py similarity index 92% rename from superset/key_value/commands/delete.py rename to superset/commands/key_value/delete.py index b3cf84be07515..8b9095c09c9b2 100644 --- a/superset/key_value/commands/delete.py +++ b/superset/commands/key_value/delete.py @@ -57,13 +57,7 @@ def validate(self) -> None: def delete(self) -> bool: filter_ = get_filter(self.resource, self.key) - entry = ( - db.session.query(KeyValueEntry) - .filter_by(**filter_) - .autoflush(False) - .first() - ) - if entry: + if entry := db.session.query(KeyValueEntry).filter_by(**filter_).first(): db.session.delete(entry) db.session.commit() return True diff --git a/superset/key_value/commands/delete_expired.py b/superset/commands/key_value/delete_expired.py similarity index 100% rename from superset/key_value/commands/delete_expired.py rename to superset/commands/key_value/delete_expired.py diff --git a/superset/key_value/commands/get.py b/superset/commands/key_value/get.py similarity index 93% rename from 
superset/key_value/commands/get.py rename to superset/commands/key_value/get.py index 9d659f3bc7c06..8a7a250f1c088 100644 --- a/superset/key_value/commands/get.py +++ b/superset/commands/key_value/get.py @@ -66,12 +66,7 @@ def validate(self) -> None: def get(self) -> Optional[Any]: filter_ = get_filter(self.resource, self.key) - entry = ( - db.session.query(KeyValueEntry) - .filter_by(**filter_) - .autoflush(False) - .first() - ) + entry = db.session.query(KeyValueEntry).filter_by(**filter_).first() if entry and (entry.expires_on is None or entry.expires_on > datetime.now()): return self.codec.decode(entry.value) return None diff --git a/superset/key_value/commands/update.py b/superset/commands/key_value/update.py similarity index 94% rename from superset/key_value/commands/update.py rename to superset/commands/key_value/update.py index becd6d9ca8d01..ca940adf60282 100644 --- a/superset/key_value/commands/update.py +++ b/superset/commands/key_value/update.py @@ -77,17 +77,13 @@ def validate(self) -> None: def update(self) -> Optional[Key]: filter_ = get_filter(self.resource, self.key) entry: KeyValueEntry = ( - db.session.query(KeyValueEntry) - .filter_by(**filter_) - .autoflush(False) - .first() + db.session.query(KeyValueEntry).filter_by(**filter_).first() ) if entry: entry.value = self.codec.encode(self.value) entry.expires_on = self.expires_on entry.changed_on = datetime.now() entry.changed_by_fk = get_user_id() - db.session.merge(entry) db.session.commit() return Key(id=entry.id, uuid=entry.uuid) diff --git a/superset/key_value/commands/upsert.py b/superset/commands/key_value/upsert.py similarity index 93% rename from superset/key_value/commands/upsert.py rename to superset/commands/key_value/upsert.py index c5668f11610ab..84f02cb9cd223 100644 --- a/superset/key_value/commands/upsert.py +++ b/superset/commands/key_value/upsert.py @@ -24,7 +24,7 @@ from superset import db from superset.commands.base import BaseCommand -from superset.key_value.commands.create 
import CreateKeyValueCommand +from superset.commands.key_value.create import CreateKeyValueCommand from superset.key_value.exceptions import ( KeyValueCreateFailedError, KeyValueUpsertFailedError, @@ -81,17 +81,13 @@ def validate(self) -> None: def upsert(self) -> Key: filter_ = get_filter(self.resource, self.key) entry: KeyValueEntry = ( - db.session.query(KeyValueEntry) - .filter_by(**filter_) - .autoflush(False) - .first() + db.session.query(KeyValueEntry).filter_by(**filter_).first() ) if entry: entry.value = self.codec.encode(self.value) entry.expires_on = self.expires_on entry.changed_on = datetime.now() entry.changed_by_fk = get_user_id() - db.session.merge(entry) db.session.commit() return Key(entry.id, entry.uuid) diff --git a/superset/key_value/commands/__init__.py b/superset/commands/query/__init__.py similarity index 100% rename from superset/key_value/commands/__init__.py rename to superset/commands/query/__init__.py diff --git a/superset/queries/saved_queries/commands/delete.py b/superset/commands/query/delete.py similarity index 96% rename from superset/queries/saved_queries/commands/delete.py rename to superset/commands/query/delete.py index 40b73658e0afa..978f30c5c4a87 100644 --- a/superset/queries/saved_queries/commands/delete.py +++ b/superset/commands/query/delete.py @@ -18,13 +18,13 @@ from typing import Optional from superset.commands.base import BaseCommand -from superset.daos.exceptions import DAODeleteFailedError -from superset.daos.query import SavedQueryDAO -from superset.models.dashboard import Dashboard -from superset.queries.saved_queries.commands.exceptions import ( +from superset.commands.query.exceptions import ( SavedQueryDeleteFailedError, SavedQueryNotFoundError, ) +from superset.daos.exceptions import DAODeleteFailedError +from superset.daos.query import SavedQueryDAO +from superset.models.dashboard import Dashboard logger = logging.getLogger(__name__) diff --git a/superset/queries/saved_queries/commands/exceptions.py 
b/superset/commands/query/exceptions.py similarity index 100% rename from superset/queries/saved_queries/commands/exceptions.py rename to superset/commands/query/exceptions.py diff --git a/superset/queries/saved_queries/commands/export.py b/superset/commands/query/export.py similarity index 97% rename from superset/queries/saved_queries/commands/export.py rename to superset/commands/query/export.py index 1b85cda796a91..a8fa8acbf0525 100644 --- a/superset/queries/saved_queries/commands/export.py +++ b/superset/commands/query/export.py @@ -25,7 +25,7 @@ from superset.commands.export.models import ExportModelsCommand from superset.models.sql_lab import SavedQuery -from superset.queries.saved_queries.commands.exceptions import SavedQueryNotFoundError +from superset.commands.query.exceptions import SavedQueryNotFoundError from superset.daos.query import SavedQueryDAO from superset.utils.dict_import_export import EXPORT_VERSION diff --git a/superset/queries/saved_queries/commands/__init__.py b/superset/commands/query/importers/__init__.py similarity index 100% rename from superset/queries/saved_queries/commands/__init__.py rename to superset/commands/query/importers/__init__.py diff --git a/superset/queries/saved_queries/commands/importers/dispatcher.py b/superset/commands/query/importers/dispatcher.py similarity index 97% rename from superset/queries/saved_queries/commands/importers/dispatcher.py rename to superset/commands/query/importers/dispatcher.py index c2208f0e2af0a..438ea8351f722 100644 --- a/superset/queries/saved_queries/commands/importers/dispatcher.py +++ b/superset/commands/query/importers/dispatcher.py @@ -23,7 +23,7 @@ from superset.commands.base import BaseCommand from superset.commands.exceptions import CommandInvalidError from superset.commands.importers.exceptions import IncorrectVersionError -from superset.queries.saved_queries.commands.importers import v1 +from superset.commands.query.importers import v1 logger = logging.getLogger(__name__) diff 
--git a/superset/queries/saved_queries/commands/importers/v1/__init__.py b/superset/commands/query/importers/v1/__init__.py similarity index 91% rename from superset/queries/saved_queries/commands/importers/v1/__init__.py rename to superset/commands/query/importers/v1/__init__.py index c8a159c7f5cfe..fa1f21b6fcc5d 100644 --- a/superset/queries/saved_queries/commands/importers/v1/__init__.py +++ b/superset/commands/query/importers/v1/__init__.py @@ -20,15 +20,13 @@ from marshmallow import Schema from sqlalchemy.orm import Session +from superset.commands.database.importers.v1.utils import import_database from superset.commands.importers.v1 import ImportModelsCommand +from superset.commands.query.exceptions import SavedQueryImportError +from superset.commands.query.importers.v1.utils import import_saved_query from superset.connectors.sqla.models import SqlaTable from superset.daos.query import SavedQueryDAO -from superset.databases.commands.importers.v1.utils import import_database from superset.databases.schemas import ImportV1DatabaseSchema -from superset.queries.saved_queries.commands.exceptions import SavedQueryImportError -from superset.queries.saved_queries.commands.importers.v1.utils import ( - import_saved_query, -) from superset.queries.saved_queries.schemas import ImportV1SavedQuerySchema diff --git a/superset/queries/saved_queries/commands/importers/v1/utils.py b/superset/commands/query/importers/v1/utils.py similarity index 100% rename from superset/queries/saved_queries/commands/importers/v1/utils.py rename to superset/commands/query/importers/v1/utils.py diff --git a/superset/queries/saved_queries/commands/importers/__init__.py b/superset/commands/report/__init__.py similarity index 100% rename from superset/queries/saved_queries/commands/importers/__init__.py rename to superset/commands/report/__init__.py diff --git a/superset/reports/commands/alert.py b/superset/commands/report/alert.py similarity index 99% rename from 
superset/reports/commands/alert.py rename to superset/commands/report/alert.py index 2c36d3589ca88..68013a2c005b0 100644 --- a/superset/reports/commands/alert.py +++ b/superset/commands/report/alert.py @@ -29,7 +29,7 @@ from superset import app, jinja_context, security_manager from superset.commands.base import BaseCommand -from superset.reports.commands.exceptions import ( +from superset.commands.report.exceptions import ( AlertQueryError, AlertQueryInvalidTypeError, AlertQueryMultipleColumnsError, diff --git a/superset/reports/commands/base.py b/superset/commands/report/base.py similarity index 98% rename from superset/reports/commands/base.py rename to superset/commands/report/base.py index da871ef17c3a0..3b2f280816a4d 100644 --- a/superset/reports/commands/base.py +++ b/superset/commands/report/base.py @@ -20,9 +20,7 @@ from marshmallow import ValidationError from superset.commands.base import BaseCommand -from superset.daos.chart import ChartDAO -from superset.daos.dashboard import DashboardDAO -from superset.reports.commands.exceptions import ( +from superset.commands.report.exceptions import ( ChartNotFoundValidationError, ChartNotSavedValidationError, DashboardNotFoundValidationError, @@ -30,6 +28,8 @@ ReportScheduleEitherChartOrDashboardError, ReportScheduleOnlyChartOrDashboardError, ) +from superset.daos.chart import ChartDAO +from superset.daos.dashboard import DashboardDAO from superset.reports.models import ReportCreationMethod logger = logging.getLogger(__name__) diff --git a/superset/reports/commands/create.py b/superset/commands/report/create.py similarity index 97% rename from superset/reports/commands/create.py rename to superset/commands/report/create.py index 177e01c33b1c4..aa9bfefc6e963 100644 --- a/superset/reports/commands/create.py +++ b/superset/commands/report/create.py @@ -22,11 +22,8 @@ from marshmallow import ValidationError from superset.commands.base import CreateMixin -from superset.daos.database import DatabaseDAO -from 
superset.daos.exceptions import DAOCreateFailedError -from superset.daos.report import ReportScheduleDAO -from superset.reports.commands.base import BaseReportScheduleCommand -from superset.reports.commands.exceptions import ( +from superset.commands.report.base import BaseReportScheduleCommand +from superset.commands.report.exceptions import ( DatabaseNotFoundValidationError, ReportScheduleAlertRequiredDatabaseValidationError, ReportScheduleCreateFailedError, @@ -35,6 +32,9 @@ ReportScheduleNameUniquenessValidationError, ReportScheduleRequiredTypeValidationError, ) +from superset.daos.database import DatabaseDAO +from superset.daos.exceptions import DAOCreateFailedError +from superset.daos.report import ReportScheduleDAO from superset.reports.models import ( ReportCreationMethod, ReportSchedule, diff --git a/superset/reports/commands/delete.py b/superset/commands/report/delete.py similarity index 97% rename from superset/reports/commands/delete.py rename to superset/commands/report/delete.py index 2cdac17c4d007..87ea4b99dd017 100644 --- a/superset/reports/commands/delete.py +++ b/superset/commands/report/delete.py @@ -19,14 +19,14 @@ from superset import security_manager from superset.commands.base import BaseCommand -from superset.daos.exceptions import DAODeleteFailedError -from superset.daos.report import ReportScheduleDAO -from superset.exceptions import SupersetSecurityException -from superset.reports.commands.exceptions import ( +from superset.commands.report.exceptions import ( ReportScheduleDeleteFailedError, ReportScheduleForbiddenError, ReportScheduleNotFoundError, ) +from superset.daos.exceptions import DAODeleteFailedError +from superset.daos.report import ReportScheduleDAO +from superset.exceptions import SupersetSecurityException from superset.reports.models import ReportSchedule logger = logging.getLogger(__name__) diff --git a/superset/reports/commands/exceptions.py b/superset/commands/report/exceptions.py similarity index 100% rename from 
superset/reports/commands/exceptions.py rename to superset/commands/report/exceptions.py diff --git a/superset/reports/commands/execute.py b/superset/commands/report/execute.py similarity index 99% rename from superset/reports/commands/execute.py rename to superset/commands/report/execute.py index 301bac4531575..d4b53e30dd846 100644 --- a/superset/reports/commands/execute.py +++ b/superset/commands/report/execute.py @@ -26,20 +26,10 @@ from superset import app, security_manager from superset.commands.base import BaseCommand +from superset.commands.dashboard.permalink.create import CreateDashboardPermalinkCommand from superset.commands.exceptions import CommandException -from superset.common.chart_data import ChartDataResultFormat, ChartDataResultType -from superset.daos.report import ( - REPORT_SCHEDULE_ERROR_NOTIFICATION_MARKER, - ReportScheduleDAO, -) -from superset.dashboards.permalink.commands.create import ( - CreateDashboardPermalinkCommand, -) -from superset.errors import ErrorLevel, SupersetError, SupersetErrorType -from superset.exceptions import SupersetErrorsException, SupersetException -from superset.extensions import feature_flag_manager, machine_auth_provider_factory -from superset.reports.commands.alert import AlertCommand -from superset.reports.commands.exceptions import ( +from superset.commands.report.alert import AlertCommand +from superset.commands.report.exceptions import ( ReportScheduleAlertGracePeriodError, ReportScheduleClientErrorsException, ReportScheduleCsvFailedError, @@ -56,6 +46,14 @@ ReportScheduleUnexpectedError, ReportScheduleWorkingTimeoutError, ) +from superset.common.chart_data import ChartDataResultFormat, ChartDataResultType +from superset.daos.report import ( + REPORT_SCHEDULE_ERROR_NOTIFICATION_MARKER, + ReportScheduleDAO, +) +from superset.errors import ErrorLevel, SupersetError, SupersetErrorType +from superset.exceptions import SupersetErrorsException, SupersetException +from superset.extensions import 
feature_flag_manager, machine_auth_provider_factory from superset.reports.models import ( ReportDataFormat, ReportExecutionLog, @@ -123,8 +121,6 @@ def update_report_schedule(self, state: ReportState) -> None: self._report_schedule.last_state = state self._report_schedule.last_eval_dttm = datetime.utcnow() - - self._session.merge(self._report_schedule) self._session.commit() def create_log(self, error_message: Optional[str] = None) -> None: diff --git a/superset/reports/commands/log_prune.py b/superset/commands/report/log_prune.py similarity index 96% rename from superset/reports/commands/log_prune.py rename to superset/commands/report/log_prune.py index 09d999541483a..3a9883c9f1009 100644 --- a/superset/reports/commands/log_prune.py +++ b/superset/commands/report/log_prune.py @@ -18,9 +18,9 @@ from datetime import datetime, timedelta from superset.commands.base import BaseCommand +from superset.commands.report.exceptions import ReportSchedulePruneLogError from superset.daos.exceptions import DAODeleteFailedError from superset.daos.report import ReportScheduleDAO -from superset.reports.commands.exceptions import ReportSchedulePruneLogError from superset.reports.models import ReportSchedule from superset.utils.celery import session_scope diff --git a/superset/reports/commands/update.py b/superset/commands/report/update.py similarity index 97% rename from superset/reports/commands/update.py rename to superset/commands/report/update.py index 7c3351e5ece12..a33ba6b59a726 100644 --- a/superset/reports/commands/update.py +++ b/superset/commands/report/update.py @@ -23,12 +23,8 @@ from superset import security_manager from superset.commands.base import UpdateMixin -from superset.daos.database import DatabaseDAO -from superset.daos.exceptions import DAOUpdateFailedError -from superset.daos.report import ReportScheduleDAO -from superset.exceptions import SupersetSecurityException -from superset.reports.commands.base import BaseReportScheduleCommand -from 
superset.reports.commands.exceptions import ( +from superset.commands.report.base import BaseReportScheduleCommand +from superset.commands.report.exceptions import ( DatabaseNotFoundValidationError, ReportScheduleForbiddenError, ReportScheduleInvalidError, @@ -36,6 +32,10 @@ ReportScheduleNotFoundError, ReportScheduleUpdateFailedError, ) +from superset.daos.database import DatabaseDAO +from superset.daos.exceptions import DAOUpdateFailedError +from superset.daos.report import ReportScheduleDAO +from superset.exceptions import SupersetSecurityException from superset.reports.models import ReportSchedule, ReportScheduleType, ReportState logger = logging.getLogger(__name__) diff --git a/superset/reports/commands/__init__.py b/superset/commands/security/__init__.py similarity index 100% rename from superset/reports/commands/__init__.py rename to superset/commands/security/__init__.py diff --git a/superset/row_level_security/commands/create.py b/superset/commands/security/create.py similarity index 100% rename from superset/row_level_security/commands/create.py rename to superset/commands/security/create.py diff --git a/superset/row_level_security/commands/delete.py b/superset/commands/security/delete.py similarity index 96% rename from superset/row_level_security/commands/delete.py rename to superset/commands/security/delete.py index d669f7d90f7d7..2c19c5f89b78c 100644 --- a/superset/row_level_security/commands/delete.py +++ b/superset/commands/security/delete.py @@ -18,13 +18,13 @@ import logging from superset.commands.base import BaseCommand -from superset.daos.exceptions import DAODeleteFailedError -from superset.daos.security import RLSDAO -from superset.reports.models import ReportSchedule -from superset.row_level_security.commands.exceptions import ( +from superset.commands.security.exceptions import ( RLSRuleNotFoundError, RuleDeleteFailedError, ) +from superset.daos.exceptions import DAODeleteFailedError +from superset.daos.security import RLSDAO +from 
superset.reports.models import ReportSchedule logger = logging.getLogger(__name__) diff --git a/superset/row_level_security/commands/exceptions.py b/superset/commands/security/exceptions.py similarity index 100% rename from superset/row_level_security/commands/exceptions.py rename to superset/commands/security/exceptions.py diff --git a/superset/row_level_security/commands/update.py b/superset/commands/security/update.py similarity index 96% rename from superset/row_level_security/commands/update.py rename to superset/commands/security/update.py index bc5ef368bacfd..f3a6cea607bd7 100644 --- a/superset/row_level_security/commands/update.py +++ b/superset/commands/security/update.py @@ -21,12 +21,12 @@ from superset.commands.base import BaseCommand from superset.commands.exceptions import DatasourceNotFoundValidationError +from superset.commands.security.exceptions import RLSRuleNotFoundError from superset.commands.utils import populate_roles from superset.connectors.sqla.models import RowLevelSecurityFilter, SqlaTable from superset.daos.exceptions import DAOUpdateFailedError from superset.daos.security import RLSDAO from superset.extensions import db -from superset.row_level_security.commands.exceptions import RLSRuleNotFoundError logger = logging.getLogger(__name__) diff --git a/superset/row_level_security/commands/__init__.py b/superset/commands/sql_lab/__init__.py similarity index 100% rename from superset/row_level_security/commands/__init__.py rename to superset/commands/sql_lab/__init__.py diff --git a/superset/sqllab/commands/estimate.py b/superset/commands/sql_lab/estimate.py similarity index 100% rename from superset/sqllab/commands/estimate.py rename to superset/commands/sql_lab/estimate.py diff --git a/superset/sqllab/commands/execute.py b/superset/commands/sql_lab/execute.py similarity index 100% rename from superset/sqllab/commands/execute.py rename to superset/commands/sql_lab/execute.py diff --git a/superset/sqllab/commands/export.py 
b/superset/commands/sql_lab/export.py similarity index 100% rename from superset/sqllab/commands/export.py rename to superset/commands/sql_lab/export.py diff --git a/superset/sqllab/commands/results.py b/superset/commands/sql_lab/results.py similarity index 100% rename from superset/sqllab/commands/results.py rename to superset/commands/sql_lab/results.py diff --git a/superset/sqllab/commands/__init__.py b/superset/commands/tag/__init__.py similarity index 100% rename from superset/sqllab/commands/__init__.py rename to superset/commands/tag/__init__.py diff --git a/superset/tags/commands/create.py b/superset/commands/tag/create.py similarity index 92% rename from superset/tags/commands/create.py rename to superset/commands/tag/create.py index cd3bcc176b2b6..ea23b8d59da10 100644 --- a/superset/tags/commands/create.py +++ b/superset/commands/tag/create.py @@ -19,18 +19,18 @@ from superset import db, security_manager from superset.commands.base import BaseCommand, CreateMixin +from superset.commands.tag.exceptions import TagCreateFailedError, TagInvalidError +from superset.commands.tag.utils import to_object_model, to_object_type from superset.daos.exceptions import DAOCreateFailedError from superset.daos.tag import TagDAO from superset.exceptions import SupersetSecurityException -from superset.tags.commands.exceptions import TagCreateFailedError, TagInvalidError -from superset.tags.commands.utils import to_object_model, to_object_type -from superset.tags.models import ObjectTypes, TagTypes +from superset.tags.models import ObjectType, TagType logger = logging.getLogger(__name__) class CreateCustomTagCommand(CreateMixin, BaseCommand): - def __init__(self, object_type: ObjectTypes, object_id: int, tags: list[str]): + def __init__(self, object_type: ObjectType, object_id: int, tags: list[str]): self._object_type = object_type self._object_id = object_id self._tags = tags @@ -76,7 +76,7 @@ def run(self) -> tuple[set[tuple[str, int]], set[tuple[str, int]]]: try: tag_name 
= self._properties["name"] - tag = TagDAO.get_by_name(tag_name.strip(), TagTypes.custom) + tag = TagDAO.get_by_name(tag_name.strip(), TagType.custom) TagDAO.create_tag_relationship( objects_to_tag=self._properties.get("objects_to_tag", []), tag=tag, diff --git a/superset/tags/commands/delete.py b/superset/commands/tag/delete.py similarity index 94% rename from superset/tags/commands/delete.py rename to superset/commands/tag/delete.py index 4b92e40ff5820..c4f22390095dc 100644 --- a/superset/tags/commands/delete.py +++ b/superset/commands/tag/delete.py @@ -17,24 +17,24 @@ import logging from superset.commands.base import BaseCommand -from superset.daos.exceptions import DAODeleteFailedError -from superset.daos.tag import TagDAO -from superset.tags.commands.exceptions import ( +from superset.commands.tag.exceptions import ( TagDeleteFailedError, TaggedObjectDeleteFailedError, TaggedObjectNotFoundError, TagInvalidError, TagNotFoundError, ) -from superset.tags.commands.utils import to_object_type -from superset.tags.models import ObjectTypes +from superset.commands.tag.utils import to_object_type +from superset.daos.exceptions import DAODeleteFailedError +from superset.daos.tag import TagDAO +from superset.tags.models import ObjectType from superset.views.base import DeleteMixin logger = logging.getLogger(__name__) class DeleteTaggedObjectCommand(DeleteMixin, BaseCommand): - def __init__(self, object_type: ObjectTypes, object_id: int, tag: str): + def __init__(self, object_type: ObjectType, object_id: int, tag: str): self._object_type = object_type self._object_id = object_id self._tag = tag diff --git a/superset/tags/commands/exceptions.py b/superset/commands/tag/exceptions.py similarity index 100% rename from superset/tags/commands/exceptions.py rename to superset/commands/tag/exceptions.py diff --git a/superset/tags/commands/update.py b/superset/commands/tag/update.py similarity index 94% rename from superset/tags/commands/update.py rename to 
superset/commands/tag/update.py
index 182376438b6c5..431bf93c4de8c 100644
--- a/superset/tags/commands/update.py
+++ b/superset/commands/tag/update.py
@@ -21,9 +21,9 @@
 from superset import db
 from superset.commands.base import BaseCommand, UpdateMixin
+from superset.commands.tag.exceptions import TagInvalidError, TagNotFoundError
+from superset.commands.tag.utils import to_object_type
 from superset.daos.tag import TagDAO
-from superset.tags.commands.exceptions import TagInvalidError, TagNotFoundError
-from superset.tags.commands.utils import to_object_type
 from superset.tags.models import Tag

 logger = logging.getLogger(__name__)
diff --git a/superset/tags/commands/utils.py b/superset/commands/tag/utils.py
similarity index 79%
rename from superset/tags/commands/utils.py
rename to superset/commands/tag/utils.py
index 028465d83a4ae..c3929cc41bc0f 100644
--- a/superset/tags/commands/utils.py
+++ b/superset/commands/tag/utils.py
@@ -23,25 +23,25 @@
 from superset.models.dashboard import Dashboard
 from superset.models.slice import Slice
 from superset.models.sql_lab import SavedQuery
-from superset.tags.models import ObjectTypes
+from superset.tags.models import ObjectType


-def to_object_type(object_type: Union[ObjectTypes, int, str]) -> Optional[ObjectTypes]:
-    if isinstance(object_type, ObjectTypes):
+def to_object_type(object_type: Union[ObjectType, int, str]) -> Optional[ObjectType]:
+    if isinstance(object_type, ObjectType):
         return object_type
-    for type_ in ObjectTypes:
+    for type_ in ObjectType:
         if object_type in [type_.value, type_.name]:
             return type_
     return None


 def to_object_model(
-    object_type: ObjectTypes, object_id: int
+    object_type: ObjectType, object_id: int
 ) -> Optional[Union[Dashboard, SavedQuery, Slice]]:
-    if ObjectTypes.dashboard == object_type:
+    if ObjectType.dashboard == object_type:
         return DashboardDAO.find_by_id(object_id)
-    if ObjectTypes.query == object_type:
+    if ObjectType.query == object_type:
         return SavedQueryDAO.find_by_id(object_id)
-    if ObjectTypes.chart == object_type:
+    if ObjectType.chart == object_type:
         return ChartDAO.find_by_id(object_id)
     return None
diff --git a/superset/tags/commands/__init__.py b/superset/commands/temporary_cache/__init__.py
similarity index 100%
rename from superset/tags/commands/__init__.py
rename to superset/commands/temporary_cache/__init__.py
diff --git a/superset/temporary_cache/commands/create.py b/superset/commands/temporary_cache/create.py
similarity index 92%
rename from superset/temporary_cache/commands/create.py
rename to superset/commands/temporary_cache/create.py
index af3b5350f652f..e43d48e54c1a9 100644
--- a/superset/temporary_cache/commands/create.py
+++ b/superset/commands/temporary_cache/create.py
@@ -20,8 +20,8 @@
 from sqlalchemy.exc import SQLAlchemyError

 from superset.commands.base import BaseCommand
-from superset.temporary_cache.commands.exceptions import TemporaryCacheCreateFailedError
-from superset.temporary_cache.commands.parameters import CommandParameters
+from superset.commands.temporary_cache.exceptions import TemporaryCacheCreateFailedError
+from superset.commands.temporary_cache.parameters import CommandParameters

 logger = logging.getLogger(__name__)
diff --git a/superset/temporary_cache/commands/delete.py b/superset/commands/temporary_cache/delete.py
similarity index 92%
rename from superset/temporary_cache/commands/delete.py
rename to superset/commands/temporary_cache/delete.py
index 1281c8debf1fe..d35b184d8744d 100644
--- a/superset/temporary_cache/commands/delete.py
+++ b/superset/commands/temporary_cache/delete.py
@@ -20,8 +20,8 @@
 from sqlalchemy.exc import SQLAlchemyError

 from superset.commands.base import BaseCommand
-from superset.temporary_cache.commands.exceptions import TemporaryCacheDeleteFailedError
-from superset.temporary_cache.commands.parameters import CommandParameters
+from superset.commands.temporary_cache.exceptions import TemporaryCacheDeleteFailedError
+from superset.commands.temporary_cache.parameters import CommandParameters

 logger = logging.getLogger(__name__)
diff --git a/superset/temporary_cache/commands/entry.py b/superset/commands/temporary_cache/entry.py
similarity index 100%
rename from superset/temporary_cache/commands/entry.py
rename to superset/commands/temporary_cache/entry.py
diff --git a/superset/temporary_cache/commands/exceptions.py b/superset/commands/temporary_cache/exceptions.py
similarity index 100%
rename from superset/temporary_cache/commands/exceptions.py
rename to superset/commands/temporary_cache/exceptions.py
diff --git a/superset/temporary_cache/commands/get.py b/superset/commands/temporary_cache/get.py
similarity index 92%
rename from superset/temporary_cache/commands/get.py
rename to superset/commands/temporary_cache/get.py
index 8c220b9c04583..fa16977a8e96c 100644
--- a/superset/temporary_cache/commands/get.py
+++ b/superset/commands/temporary_cache/get.py
@@ -21,8 +21,8 @@
 from sqlalchemy.exc import SQLAlchemyError

 from superset.commands.base import BaseCommand
-from superset.temporary_cache.commands.exceptions import TemporaryCacheGetFailedError
-from superset.temporary_cache.commands.parameters import CommandParameters
+from superset.commands.temporary_cache.exceptions import TemporaryCacheGetFailedError
+from superset.commands.temporary_cache.parameters import CommandParameters

 logger = logging.getLogger(__name__)
diff --git a/superset/temporary_cache/commands/parameters.py b/superset/commands/temporary_cache/parameters.py
similarity index 100%
rename from superset/temporary_cache/commands/parameters.py
rename to superset/commands/temporary_cache/parameters.py
diff --git a/superset/temporary_cache/commands/update.py b/superset/commands/temporary_cache/update.py
similarity index 92%
rename from superset/temporary_cache/commands/update.py
rename to superset/commands/temporary_cache/update.py
index 92af8c14f20af..90b1c3d48f509 100644
--- a/superset/temporary_cache/commands/update.py
+++ b/superset/commands/temporary_cache/update.py
@@ -21,8 +21,8 @@
 from sqlalchemy.exc import SQLAlchemyError

 from superset.commands.base import BaseCommand
-from superset.temporary_cache.commands.exceptions import TemporaryCacheUpdateFailedError
-from superset.temporary_cache.commands.parameters import CommandParameters
+from superset.commands.temporary_cache.exceptions import TemporaryCacheUpdateFailedError
+from superset.commands.temporary_cache.parameters import CommandParameters

 logger = logging.getLogger(__name__)
diff --git a/superset/commands/utils.py b/superset/commands/utils.py
index 02b6b5f383516..8cfeab3c1148d 100644
--- a/superset/commands/utils.py
+++ b/superset/commands/utils.py
@@ -33,7 +33,7 @@
 from superset.utils.core import DatasourceType, get_user_id

 if TYPE_CHECKING:
-    from superset.connectors.base.models import BaseDatasource
+    from superset.connectors.sqla.models import BaseDatasource


 def populate_owners(
diff --git a/superset/common/query_actions.py b/superset/common/query_actions.py
index 22c778b77be67..d73a99d0271c1 100644
--- a/superset/common/query_actions.py
+++ b/superset/common/query_actions.py
@@ -24,7 +24,7 @@
 from superset import app
 from superset.common.chart_data import ChartDataResultType
 from superset.common.db_query_status import QueryStatus
-from superset.connectors.base.models import BaseDatasource
+from superset.connectors.sqla.models import BaseDatasource
 from superset.exceptions import QueryObjectValidationError
 from superset.utils.core import (
     extract_column_dtype,
diff --git a/superset/common/query_context.py b/superset/common/query_context.py
index 1a8d3c518b07a..4f517cd90557a 100644
--- a/superset/common/query_context.py
+++ b/superset/common/query_context.py
@@ -30,7 +30,7 @@
 from superset.models.slice import Slice

 if TYPE_CHECKING:
-    from superset.connectors.base.models import BaseDatasource
+    from superset.connectors.sqla.models import BaseDatasource
     from superset.models.helpers import QueryResult
diff --git a/superset/common/query_context_factory.py
b/superset/common/query_context_factory.py
index d6510ccd9a434..708907d4a91ab 100644
--- a/superset/common/query_context_factory.py
+++ b/superset/common/query_context_factory.py
@@ -29,7 +29,7 @@
 from superset.utils.core import DatasourceDict, DatasourceType, is_adhoc_column

 if TYPE_CHECKING:
-    from superset.connectors.base.models import BaseDatasource
+    from superset.connectors.sqla.models import BaseDatasource

 config = app.config
diff --git a/superset/common/query_context_processor.py b/superset/common/query_context_processor.py
index 5a0468b671b39..5b1414d53b396 100644
--- a/superset/common/query_context_processor.py
+++ b/superset/common/query_context_processor.py
@@ -36,9 +36,9 @@
     get_since_until_from_query_object,
     get_since_until_from_time_range,
 )
-from superset.connectors.base.models import BaseDatasource
+from superset.connectors.sqla.models import BaseDatasource
 from superset.constants import CacheRegion, TimeGrain
-from superset.daos.annotation import AnnotationLayerDAO
+from superset.daos.annotation_layer import AnnotationLayerDAO
 from superset.daos.chart import ChartDAO
 from superset.exceptions import (
     InvalidPostProcessingError,
@@ -682,7 +682,7 @@ def get_viz_annotation_data(
         annotation_layer: dict[str, Any], force: bool
     ) -> dict[str, Any]:
         # pylint: disable=import-outside-toplevel
-        from superset.charts.data.commands.get_data_command import ChartDataCommand
+        from superset.commands.chart.data.get_data_command import ChartDataCommand

         if not (chart := ChartDAO.find_by_id(annotation_layer["value"])):
             raise QueryObjectValidationError(_("The chart does not exist"))
diff --git a/superset/common/query_object.py b/superset/common/query_object.py
index 1e826761ecba4..989df5775b2e7 100644
--- a/superset/common/query_object.py
+++ b/superset/common/query_object.py
@@ -49,7 +49,7 @@
 from superset.utils.hashing import md5_sha_from_dict

 if TYPE_CHECKING:
-    from superset.connectors.base.models import BaseDatasource
+    from superset.connectors.sqla.models import BaseDatasource

 logger = logging.getLogger(__name__)
diff --git a/superset/common/query_object_factory.py b/superset/common/query_object_factory.py
index d993eca279093..d2aa140dfe933 100644
--- a/superset/common/query_object_factory.py
+++ b/superset/common/query_object_factory.py
@@ -35,7 +35,7 @@
 if TYPE_CHECKING:
     from sqlalchemy.orm import sessionmaker

-    from superset.connectors.base.models import BaseDatasource
+    from superset.connectors.sqla.models import BaseDatasource

     from superset.daos.datasource import DatasourceDAO
diff --git a/superset/common/tags.py b/superset/common/tags.py
index c7b06bdd4b44b..ce5c5ab195744 100644
--- a/superset/common/tags.py
+++ b/superset/common/tags.py
@@ -22,7 +22,7 @@
 from sqlalchemy.sql import and_, func, join, literal, select

 from superset.extensions import db
-from superset.tags.models import ObjectTypes, TagTypes
+from superset.tags.models import ObjectType, TagType


 def add_types_to_charts(
@@ -35,7 +35,7 @@
         [
             tag.c.id.label("tag_id"),
             slices.c.id.label("object_id"),
-            literal(ObjectTypes.chart.name).label("object_type"),
+            literal(ObjectType.chart.name).label("object_type"),
         ]
     )
     .select_from(
@@ -67,7 +67,7 @@
         [
             tag.c.id.label("tag_id"),
             dashboard_table.c.id.label("object_id"),
-            literal(ObjectTypes.dashboard.name).label("object_type"),
+            literal(ObjectType.dashboard.name).label("object_type"),
         ]
     )
     .select_from(
@@ -99,7 +99,7 @@
         [
             tag.c.id.label("tag_id"),
             saved_query.c.id.label("object_id"),
-            literal(ObjectTypes.query.name).label("object_type"),
+            literal(ObjectType.query.name).label("object_type"),
         ]
     )
     .select_from(
@@ -131,7 +131,7 @@
         [
             tag.c.id.label("tag_id"),
             tables.c.id.label("object_id"),
-            literal(ObjectTypes.dataset.name).label("object_type"),
+            literal(ObjectType.dataset.name).label("object_type"),
         ]
     )
     .select_from(
@@ -221,9 +221,9 @@ def add_types(metadata: MetaData) -> None:
     # add a tag for each object type
     insert = tag.insert()
-    for type_ in ObjectTypes.__members__:
+    for type_ in ObjectType.__members__:
         with contextlib.suppress(IntegrityError):  # already exists
-            db.session.execute(insert, name=f"type:{type_}", type=TagTypes.type)
+            db.session.execute(insert, name=f"type:{type_}", type=TagType.type)

     add_types_to_charts(metadata, tag, tagged_object, columns)
     add_types_to_dashboards(metadata, tag, tagged_object, columns)
@@ -241,7 +241,7 @@ def add_owners_to_charts(
         [
             tag.c.id.label("tag_id"),
             slices.c.id.label("object_id"),
-            literal(ObjectTypes.chart.name).label("object_type"),
+            literal(ObjectType.chart.name).label("object_type"),
         ]
     )
     .select_from(
@@ -277,7 +277,7 @@
         [
             tag.c.id.label("tag_id"),
             dashboard_table.c.id.label("object_id"),
-            literal(ObjectTypes.dashboard.name).label("object_type"),
+            literal(ObjectType.dashboard.name).label("object_type"),
         ]
     )
     .select_from(
@@ -313,7 +313,7 @@
         [
             tag.c.id.label("tag_id"),
             saved_query.c.id.label("object_id"),
-            literal(ObjectTypes.query.name).label("object_type"),
+            literal(ObjectType.query.name).label("object_type"),
         ]
     )
     .select_from(
@@ -349,7 +349,7 @@
         [
             tag.c.id.label("tag_id"),
             tables.c.id.label("object_id"),
-            literal(ObjectTypes.dataset.name).label("object_type"),
+            literal(ObjectType.dataset.name).label("object_type"),
         ]
     )
     .select_from(
@@ -444,7 +444,7 @@ def add_owners(metadata: MetaData) -> None:
     insert = tag.insert()
     for (id_,) in db.session.execute(ids):
         with contextlib.suppress(IntegrityError):  # already exists
-            db.session.execute(insert, name=f"owner:{id_}", type=TagTypes.owner)
+            db.session.execute(insert, name=f"owner:{id_}", type=TagType.owner)
     add_owners_to_charts(metadata, tag, tagged_object, columns)
     add_owners_to_dashboards(metadata, tag, tagged_object, columns)
     add_owners_to_saved_queries(metadata, tag, tagged_object, columns)
@@ -482,7 +482,7 @@
     insert = tag.insert()
     for (id_,) in db.session.execute(ids):
         with contextlib.suppress(IntegrityError):  # already exists
-            db.session.execute(insert, name=f"favorited_by:{id_}", type=TagTypes.type)
+            db.session.execute(insert, name=f"favorited_by:{id_}", type=TagType.type)
     favstars = (
         select(
             [
diff --git a/superset/config.py b/superset/config.py
index 561c37b8c2f72..45e001187ceaa 100644
--- a/superset/config.py
+++ b/superset/config.py
@@ -1425,6 +1425,7 @@ def EMAIL_HEADER_MUTATOR(  # pylint: disable=invalid-name,unused-argument
 # If you want Talisman, how do you want it configured??
 TALISMAN_CONFIG = {
     "content_security_policy": {
+        "base-uri": ["'self'"],
         "default-src": [
             "'self'",
             "https://*.clarity.ms",
@@ -1461,10 +1462,12 @@ def EMAIL_HEADER_MUTATOR(  # pylint: disable=invalid-name,unused-argument
     },
     "content_security_policy_nonce_in": ["script-src"],
     "force_https": False,
+    "session_cookie_secure": False,
 }
 # React requires `eval` to work correctly in dev mode
 TALISMAN_DEV_CONFIG = {
     "content_security_policy": {
+        "base-uri": ["'self'"],
         "default-src": [
             "'self'",
             "https://*.clarity.ms",
@@ -1501,6 +1504,7 @@ def EMAIL_HEADER_MUTATOR(  # pylint: disable=invalid-name,unused-argument
     },
     "content_security_policy_nonce_in": ["script-src"],
     "force_https": False,
+    "session_cookie_secure": False,
 }

 #
diff --git a/superset/connectors/base/models.py b/superset/connectors/base/models.py
deleted file mode 100644
index d5386c7a66c3e..0000000000000
--- a/superset/connectors/base/models.py
+++ /dev/null
@@ -1,776 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.
You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. -from __future__ import annotations - -import builtins -import json -import logging -from collections.abc import Hashable -from datetime import datetime -from json.decoder import JSONDecodeError -from typing import Any, TYPE_CHECKING - -from flask_appbuilder.security.sqla.models import User -from flask_babel import gettext as __ -from sqlalchemy import and_, Boolean, Column, Integer, String, Text -from sqlalchemy.ext.declarative import declared_attr -from sqlalchemy.orm import foreign, Query, relationship, RelationshipProperty, Session -from sqlalchemy.sql import literal_column - -from superset import security_manager -from superset.constants import EMPTY_STRING, NULL_STRING -from superset.datasets.commands.exceptions import DatasetNotFoundError -from superset.models.helpers import AuditMixinNullable, ImportExportMixin, QueryResult -from superset.models.slice import Slice -from superset.superset_typing import ( - FilterValue, - FilterValues, - QueryObjectDict, - ResultSetColumnType, -) -from superset.utils import core as utils -from superset.utils.backports import StrEnum -from superset.utils.core import GenericDataType, MediumText - -if TYPE_CHECKING: - from superset.db_engine_specs.base import BaseEngineSpec - -logger = logging.getLogger(__name__) - -METRIC_FORM_DATA_PARAMS = [ - "metric", - "metric_2", - "metrics", - "metrics_b", - "percent_metrics", - "secondary_metric", - "size", - "timeseries_limit_metric", - "x", - "y", -] - -COLUMN_FORM_DATA_PARAMS = [ - "all_columns", - "all_columns_x", - "columns", - "entity", - "groupby", - "order_by_cols", - 
"series", -] - - -class DatasourceKind(StrEnum): - VIRTUAL = "virtual" - PHYSICAL = "physical" - - -class BaseDatasource( - AuditMixinNullable, ImportExportMixin -): # pylint: disable=too-many-public-methods - """A common interface to objects that are queryable - (tables and datasources)""" - - # --------------------------------------------------------------- - # class attributes to define when deriving BaseDatasource - # --------------------------------------------------------------- - __tablename__: str | None = None # {connector_name}_datasource - baselink: str | None = None # url portion pointing to ModelView endpoint - - @property - def column_class(self) -> type[BaseColumn]: - # link to derivative of BaseColumn - raise NotImplementedError() - - @property - def metric_class(self) -> type[BaseMetric]: - # link to derivative of BaseMetric - raise NotImplementedError() - - owner_class: User | None = None - - # Used to do code highlighting when displaying the query in the UI - query_language: str | None = None - - # Only some datasources support Row Level Security - is_rls_supported: bool = False - - @property - def name(self) -> str: - # can be a Column or a property pointing to one - raise NotImplementedError() - - # --------------------------------------------------------------- - - # Columns - id = Column(Integer, primary_key=True) - description = Column(Text) - default_endpoint = Column(Text) - is_featured = Column(Boolean, default=False) # TODO deprecating - filter_select_enabled = Column(Boolean, default=True) - offset = Column(Integer, default=0) - cache_timeout = Column(Integer) - params = Column(String(1000)) - perm = Column(String(1000)) - schema_perm = Column(String(1000)) - is_managed_externally = Column(Boolean, nullable=False, default=False) - external_url = Column(Text, nullable=True) - - sql: str | None = None - owners: list[User] - update_from_object_fields: list[str] - - extra_import_fields = ["is_managed_externally", "external_url"] - - 
@property - def kind(self) -> DatasourceKind: - return DatasourceKind.VIRTUAL if self.sql else DatasourceKind.PHYSICAL - - @property - def owners_data(self) -> list[dict[str, Any]]: - return [ - { - "first_name": o.first_name, - "last_name": o.last_name, - "username": o.username, - "id": o.id, - } - for o in self.owners - ] - - @property - def is_virtual(self) -> bool: - return self.kind == DatasourceKind.VIRTUAL - - @declared_attr - def slices(self) -> RelationshipProperty: - return relationship( - "Slice", - overlaps="table", - primaryjoin=lambda: and_( - foreign(Slice.datasource_id) == self.id, - foreign(Slice.datasource_type) == self.type, - ), - ) - - columns: list[BaseColumn] = [] - metrics: list[BaseMetric] = [] - - @property - def type(self) -> str: - raise NotImplementedError() - - @property - def uid(self) -> str: - """Unique id across datasource types""" - return f"{self.id}__{self.type}" - - @property - def column_names(self) -> list[str]: - return sorted([c.column_name for c in self.columns], key=lambda x: x or "") - - @property - def columns_types(self) -> dict[str, str]: - return {c.column_name: c.type for c in self.columns} - - @property - def main_dttm_col(self) -> str: - return "timestamp" - - @property - def datasource_name(self) -> str: - raise NotImplementedError() - - @property - def connection(self) -> str | None: - """String representing the context of the Datasource""" - return None - - @property - def schema(self) -> str | None: - """String representing the schema of the Datasource (if it applies)""" - return None - - @property - def filterable_column_names(self) -> list[str]: - return sorted([c.column_name for c in self.columns if c.filterable]) - - @property - def dttm_cols(self) -> list[str]: - return [] - - @property - def url(self) -> str: - return f"/{self.baselink}/edit/{self.id}" - - @property - def explore_url(self) -> str: - if self.default_endpoint: - return self.default_endpoint - return 
f"/explore/?datasource_type={self.type}&datasource_id={self.id}" - - @property - def column_formats(self) -> dict[str, str | None]: - return {m.metric_name: m.d3format for m in self.metrics if m.d3format} - - @property - def currency_formats(self) -> dict[str, dict[str, str | None] | None]: - return {m.metric_name: m.currency_json for m in self.metrics if m.currency_json} - - def add_missing_metrics(self, metrics: list[BaseMetric]) -> None: - existing_metrics = {m.metric_name for m in self.metrics} - for metric in metrics: - if metric.metric_name not in existing_metrics: - metric.table_id = self.id - self.metrics.append(metric) - - @property - def short_data(self) -> dict[str, Any]: - """Data representation of the datasource sent to the frontend""" - return { - "edit_url": self.url, - "id": self.id, - "uid": self.uid, - "schema": self.schema, - "name": self.name, - "type": self.type, - "connection": self.connection, - "creator": str(self.created_by), - } - - @property - def select_star(self) -> str | None: - pass - - @property - def order_by_choices(self) -> list[tuple[str, str]]: - choices = [] - # self.column_names return sorted column_names - for column_name in self.column_names: - column_name = str(column_name or "") - choices.append( - (json.dumps([column_name, True]), f"{column_name} " + __("[asc]")) - ) - choices.append( - (json.dumps([column_name, False]), f"{column_name} " + __("[desc]")) - ) - return choices - - @property - def verbose_map(self) -> dict[str, str]: - verb_map = {"__timestamp": "Time"} - verb_map.update( - {o.metric_name: o.verbose_name or o.metric_name for o in self.metrics} - ) - verb_map.update( - {o.column_name: o.verbose_name or o.column_name for o in self.columns} - ) - return verb_map - - @property - def data(self) -> dict[str, Any]: - """Data representation of the datasource sent to the frontend""" - return { - # simple fields - "id": self.id, - "uid": self.uid, - "column_formats": self.column_formats, - "currency_formats": 
self.currency_formats, - "description": self.description, - "database": self.database.data, # pylint: disable=no-member - "default_endpoint": self.default_endpoint, - "filter_select": self.filter_select_enabled, # TODO deprecate - "filter_select_enabled": self.filter_select_enabled, - "name": self.name, - "datasource_name": self.datasource_name, - "table_name": self.datasource_name, - "type": self.type, - "schema": self.schema, - "offset": self.offset, - "cache_timeout": self.cache_timeout, - "params": self.params, - "perm": self.perm, - "edit_url": self.url, - # sqla-specific - "sql": self.sql, - # one to many - "columns": [o.data for o in self.columns], - "metrics": [o.data for o in self.metrics], - # TODO deprecate, move logic to JS - "order_by_choices": self.order_by_choices, - "owners": [owner.id for owner in self.owners], - "verbose_map": self.verbose_map, - "select_star": self.select_star, - } - - def data_for_slices( # pylint: disable=too-many-locals - self, slices: list[Slice] - ) -> dict[str, Any]: - """ - The representation of the datasource containing only the required data - to render the provided slices. - - Used to reduce the payload when loading a dashboard. 
- """ - data = self.data - metric_names = set() - column_names = set() - for slc in slices: - form_data = slc.form_data - # pull out all required metrics from the form_data - for metric_param in METRIC_FORM_DATA_PARAMS: - for metric in utils.as_list(form_data.get(metric_param) or []): - metric_names.add(utils.get_metric_name(metric)) - if utils.is_adhoc_metric(metric): - column = metric.get("column") or {} - if column_name := column.get("column_name"): - column_names.add(column_name) - - # Columns used in query filters - column_names.update( - filter_["subject"] - for filter_ in form_data.get("adhoc_filters") or [] - if filter_.get("clause") == "WHERE" and filter_.get("subject") - ) - - # columns used by Filter Box - column_names.update( - filter_config["column"] - for filter_config in form_data.get("filter_configs") or [] - if "column" in filter_config - ) - - # for legacy dashboard imports which have the wrong query_context in them - try: - query_context = slc.get_query_context() - except DatasetNotFoundError: - query_context = None - - # legacy charts don't have query_context charts - if query_context: - column_names.update( - [ - utils.get_column_name(column) - for query in query_context.queries - for column in query.columns - ] - or [] - ) - else: - _columns = [ - utils.get_column_name(column) - if utils.is_adhoc_column(column) - else column - for column_param in COLUMN_FORM_DATA_PARAMS - for column in utils.as_list(form_data.get(column_param) or []) - ] - column_names.update(_columns) - - filtered_metrics = [ - metric - for metric in data["metrics"] - if metric["metric_name"] in metric_names - ] - - filtered_columns: list[Column] = [] - column_types: set[GenericDataType] = set() - for column in data["columns"]: - generic_type = column.get("type_generic") - if generic_type is not None: - column_types.add(generic_type) - if column["column_name"] in column_names: - filtered_columns.append(column) - - data["column_types"] = list(column_types) - del 
data["description"] - data.update({"metrics": filtered_metrics}) - data.update({"columns": filtered_columns}) - verbose_map = {"__timestamp": "Time"} - verbose_map.update( - { - metric["metric_name"]: metric["verbose_name"] or metric["metric_name"] - for metric in filtered_metrics - } - ) - verbose_map.update( - { - column["column_name"]: column["verbose_name"] or column["column_name"] - for column in filtered_columns - } - ) - data["verbose_map"] = verbose_map - - return data - - @staticmethod - def filter_values_handler( # pylint: disable=too-many-arguments - values: FilterValues | None, - operator: str, - target_generic_type: GenericDataType, - target_native_type: str | None = None, - is_list_target: bool = False, - db_engine_spec: builtins.type[BaseEngineSpec] | None = None, - db_extra: dict[str, Any] | None = None, - ) -> FilterValues | None: - if values is None: - return None - - def handle_single_value(value: FilterValue | None) -> FilterValue | None: - if operator == utils.FilterOperator.TEMPORAL_RANGE: - return value - if ( - isinstance(value, (float, int)) - and target_generic_type == utils.GenericDataType.TEMPORAL - and target_native_type is not None - and db_engine_spec is not None - ): - value = db_engine_spec.convert_dttm( - target_type=target_native_type, - dttm=datetime.utcfromtimestamp(value / 1000), - db_extra=db_extra, - ) - value = literal_column(value) - if isinstance(value, str): - value = value.strip("\t\n") - - if ( - target_generic_type == utils.GenericDataType.NUMERIC - and operator - not in { - utils.FilterOperator.ILIKE, - utils.FilterOperator.LIKE, - } - ): - # For backwards compatibility and edge cases - # where a column data type might have changed - return utils.cast_to_num(value) - if value == NULL_STRING: - return None - if value == EMPTY_STRING: - return "" - if target_generic_type == utils.GenericDataType.BOOLEAN: - return utils.cast_to_boolean(value) - return value - - if isinstance(values, (list, tuple)): - values = 
[handle_single_value(v) for v in values] # type: ignore - else: - values = handle_single_value(values) - if is_list_target and not isinstance(values, (tuple, list)): - values = [values] # type: ignore - elif not is_list_target and isinstance(values, (tuple, list)): - values = values[0] if values else None - return values - - def external_metadata(self) -> list[ResultSetColumnType]: - """Returns column information from the external system""" - raise NotImplementedError() - - def get_query_str(self, query_obj: QueryObjectDict) -> str: - """Returns a query as a string - - This is used to be displayed to the user so that they can - understand what is taking place behind the scene""" - raise NotImplementedError() - - def query(self, query_obj: QueryObjectDict) -> QueryResult: - """Executes the query and returns a dataframe - - query_obj is a dictionary representing Superset's query interface. - Should return a ``superset.models.helpers.QueryResult`` - """ - raise NotImplementedError() - - def values_for_column(self, column_name: str, limit: int = 10000) -> list[Any]: - """Given a column, returns an iterable of distinct values - - This is used to populate the dropdown showing a list of - values in filters in the explore view""" - raise NotImplementedError() - - @staticmethod - def default_query(qry: Query) -> Query: - return qry - - def get_column(self, column_name: str | None) -> BaseColumn | None: - if not column_name: - return None - for col in self.columns: - if col.column_name == column_name: - return col - return None - - @staticmethod - def get_fk_many_from_list( - object_list: list[Any], - fkmany: list[Column], - fkmany_class: builtins.type[BaseColumn | BaseMetric], - key_attr: str, - ) -> list[Column]: - """Update ORM one-to-many list from object list - - Used for syncing metrics and columns using the same code""" - - object_dict = {o.get(key_attr): o for o in object_list} - - # delete fks that have been removed - fkmany = [o for o in fkmany if getattr(o, 
key_attr) in object_dict] - - # sync existing fks - for fk in fkmany: - obj = object_dict.get(getattr(fk, key_attr)) - if obj: - for attr in fkmany_class.update_from_object_fields: - setattr(fk, attr, obj.get(attr)) - - # create new fks - new_fks = [] - orm_keys = [getattr(o, key_attr) for o in fkmany] - for obj in object_list: - key = obj.get(key_attr) - if key not in orm_keys: - del obj["id"] - orm_kwargs = {} - for k in obj: - if k in fkmany_class.update_from_object_fields and k in obj: - orm_kwargs[k] = obj[k] - new_obj = fkmany_class(**orm_kwargs) - new_fks.append(new_obj) - fkmany += new_fks - return fkmany - - def update_from_object(self, obj: dict[str, Any]) -> None: - """Update datasource from a data structure - - The UI's table editor crafts a complex data structure that - contains most of the datasource's properties as well as - an array of metrics and columns objects. This method - receives the object from the UI and syncs the datasource to - match it. Since the fields are different for the different - connectors, the implementation uses ``update_from_object_fields`` - which can be defined for each connector and - defines which fields should be synced""" - for attr in self.update_from_object_fields: - setattr(self, attr, obj.get(attr)) - - self.owners = obj.get("owners", []) - - # Syncing metrics - metrics = ( - self.get_fk_many_from_list( - obj["metrics"], self.metrics, self.metric_class, "metric_name" - ) - if self.metric_class and "metrics" in obj - else [] - ) - self.metrics = metrics - - # Syncing columns - self.columns = ( - self.get_fk_many_from_list( - obj["columns"], self.columns, self.column_class, "column_name" - ) - if self.column_class and "columns" in obj - else [] - ) - - def get_extra_cache_keys( - self, query_obj: QueryObjectDict # pylint: disable=unused-argument - ) -> list[Hashable]: - """If a datasource needs to provide additional keys for calculation of - cache keys, those can be provided via this method - - :param query_obj: The 
dict representation of a query object - :return: list of keys - """ - return [] - - def __hash__(self) -> int: - return hash(self.uid) - - def __eq__(self, other: object) -> bool: - if not isinstance(other, BaseDatasource): - return NotImplemented - return self.uid == other.uid - - def raise_for_access(self) -> None: - """ - Raise an exception if the user cannot access the resource. - - :raises SupersetSecurityException: If the user cannot access the resource - """ - - security_manager.raise_for_access(datasource=self) - - @classmethod - def get_datasource_by_name( - cls, session: Session, datasource_name: str, schema: str, database_name: str - ) -> BaseDatasource | None: - raise NotImplementedError() - - -class BaseColumn(AuditMixinNullable, ImportExportMixin): - """Interface for column""" - - __tablename__: str | None = None # {connector_name}_column - - id = Column(Integer, primary_key=True) - column_name = Column(String(255), nullable=False) - verbose_name = Column(String(1024)) - is_active = Column(Boolean, default=True) - type = Column(Text) - advanced_data_type = Column(String(255)) - groupby = Column(Boolean, default=True) - filterable = Column(Boolean, default=True) - description = Column(MediumText()) - is_dttm = None - - # [optional] Set this to support import/export functionality - export_fields: list[Any] = [] - - def __repr__(self) -> str: - return str(self.column_name) - - bool_types = ("BOOL",) - num_types = ( - "DOUBLE", - "FLOAT", - "INT", - "BIGINT", - "NUMBER", - "LONG", - "REAL", - "NUMERIC", - "DECIMAL", - "MONEY", - ) - date_types = ("DATE", "TIME") - str_types = ("VARCHAR", "STRING", "CHAR") - - @property - def is_numeric(self) -> bool: - return self.type and any(map(lambda t: t in self.type.upper(), self.num_types)) - - @property - def is_temporal(self) -> bool: - return self.type and any(map(lambda t: t in self.type.upper(), self.date_types)) - - @property - def is_string(self) -> bool: - return self.type and any(map(lambda t: t in 
self.type.upper(), self.str_types)) - - @property - def is_boolean(self) -> bool: - return self.type and any(map(lambda t: t in self.type.upper(), self.bool_types)) - - @property - def type_generic(self) -> utils.GenericDataType | None: - if self.is_string: - return utils.GenericDataType.STRING - if self.is_boolean: - return utils.GenericDataType.BOOLEAN - if self.is_numeric: - return utils.GenericDataType.NUMERIC - if self.is_temporal: - return utils.GenericDataType.TEMPORAL - return None - - @property - def expression(self) -> Column: - raise NotImplementedError() - - @property - def python_date_format(self) -> Column: - raise NotImplementedError() - - @property - def data(self) -> dict[str, Any]: - attrs = ( - "id", - "column_name", - "verbose_name", - "description", - "expression", - "filterable", - "groupby", - "is_dttm", - "type", - "advanced_data_type", - ) - return {s: getattr(self, s) for s in attrs if hasattr(self, s)} - - -class BaseMetric(AuditMixinNullable, ImportExportMixin): - """Interface for Metrics""" - - __tablename__: str | None = None # {connector_name}_metric - - id = Column(Integer, primary_key=True) - metric_name = Column(String(255), nullable=False) - verbose_name = Column(String(1024)) - metric_type = Column(String(32)) - description = Column(MediumText()) - d3format = Column(String(128)) - currency = Column(String(128)) - warning_text = Column(Text) - - """ - The interface should also declare a datasource relationship pointing - to a derivative of BaseDatasource, along with a FK - - datasource_name = Column( - String(255), - ForeignKey('datasources.datasource_name')) - datasource = relationship( - # needs to be altered to point to {Connector}Datasource - 'BaseDatasource', - backref=backref('metrics', cascade='all, delete-orphan'), - enable_typechecks=False) - """ - - @property - def currency_json(self) -> dict[str, str | None] | None: - try: - return json.loads(self.currency or "{}") or None - except (TypeError, JSONDecodeError) as exc: - 
logger.error( - "Unable to load currency json: %r. Leaving empty.", exc, exc_info=True - ) - return None - - @property - def perm(self) -> str | None: - raise NotImplementedError() - - @property - def expression(self) -> Column: - raise NotImplementedError() - - @property - def data(self) -> dict[str, Any]: - attrs = ( - "id", - "metric_name", - "verbose_name", - "description", - "expression", - "warning_text", - "d3format", - "currency", - ) - return {s: getattr(self, s) for s in attrs} diff --git a/superset/connectors/base/views.py b/superset/connectors/base/views.py deleted file mode 100644 index ae5013ebbf4e9..0000000000000 --- a/superset/connectors/base/views.py +++ /dev/null @@ -1,48 +0,0 @@ -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. -from typing import Any - -from flask import Markup -from flask_appbuilder.fieldwidgets import BS3TextFieldWidget - -from superset.connectors.base.models import BaseDatasource -from superset.exceptions import SupersetException -from superset.views.base import SupersetModelView - - -class BS3TextFieldROWidget( # pylint: disable=too-few-public-methods - BS3TextFieldWidget -): - """ - Custom read only text field widget. 
- """ - - def __call__(self, field: Any, **kwargs: Any) -> Markup: - kwargs["readonly"] = "true" - return super().__call__(field, **kwargs) - - -class DatasourceModelView(SupersetModelView): - def pre_delete(self, item: BaseDatasource) -> None: - if item.slices: - raise SupersetException( - Markup( - "Cannot delete a datasource that has slices attached to it." - "Here's the list of associated charts: " - + "".join([i.slice_name for i in item.slices]) - ) - ) diff --git a/superset/connectors/sqla/models.py b/superset/connectors/sqla/models.py index e366940ff2592..598bc6741b07c 100644 --- a/superset/connectors/sqla/models.py +++ b/superset/connectors/sqla/models.py @@ -17,6 +17,7 @@ # pylint: disable=too-many-lines from __future__ import annotations +import builtins import dataclasses import json import logging @@ -25,6 +26,7 @@ from collections.abc import Hashable from dataclasses import dataclass, field from datetime import datetime, timedelta +from json.decoder import JSONDecodeError from typing import Any, Callable, cast import dateutil.parser @@ -34,7 +36,8 @@ import sqlparse from flask import escape, Markup from flask_appbuilder import Model -from flask_babel import lazy_gettext as _ +from flask_appbuilder.security.sqla.models import User +from flask_babel import gettext as __, lazy_gettext as _ from jinja2.exceptions import TemplateError from sqlalchemy import ( and_, @@ -46,16 +49,17 @@ inspect, Integer, or_, - select, String, Table, Text, update, ) from sqlalchemy.engine.base import Connection +from sqlalchemy.ext.declarative import declared_attr from sqlalchemy.ext.hybrid import hybrid_property from sqlalchemy.orm import ( backref, + foreign, Mapped, Query, reconstructor, @@ -71,13 +75,14 @@ from sqlalchemy.sql.selectable import Alias, TableClause from superset import app, db, is_feature_enabled, security_manager +from superset.commands.dataset.exceptions import DatasetNotFoundError from superset.common.db_query_status import QueryStatus -from 
superset.connectors.base.models import BaseColumn, BaseDatasource, BaseMetric from superset.connectors.sqla.utils import ( get_columns_description, get_physical_table_metadata, get_virtual_table_metadata, ) +from superset.constants import EMPTY_STRING, NULL_STRING from superset.db_engine_specs.base import BaseEngineSpec, TimestampExpression from superset.exceptions import ( ColumnNotFoundException, @@ -98,19 +103,24 @@ AuditMixinNullable, CertificationMixin, ExploreMixin, + ImportExportMixin, QueryResult, QueryStringExtended, validate_adhoc_subquery, ) +from superset.models.slice import Slice from superset.sql_parse import ParsedQuery, sanitize_clause from superset.superset_typing import ( AdhocColumn, AdhocMetric, + FilterValue, + FilterValues, Metric, QueryObjectDict, ResultSetColumnType, ) from superset.utils import core as utils +from superset.utils.backports import StrEnum from superset.utils.core import GenericDataType, MediumText config = app.config @@ -135,6 +145,565 @@ class MetadataResult: modified: list[str] = field(default_factory=list) +logger = logging.getLogger(__name__) + +METRIC_FORM_DATA_PARAMS = [ + "metric", + "metric_2", + "metrics", + "metrics_b", + "percent_metrics", + "secondary_metric", + "size", + "timeseries_limit_metric", + "x", + "y", +] + +COLUMN_FORM_DATA_PARAMS = [ + "all_columns", + "all_columns_x", + "columns", + "entity", + "groupby", + "order_by_cols", + "series", +] + + +class DatasourceKind(StrEnum): + VIRTUAL = "virtual" + PHYSICAL = "physical" + + +class BaseDatasource( + AuditMixinNullable, ImportExportMixin +): # pylint: disable=too-many-public-methods + """A common interface to objects that are queryable + (tables and datasources)""" + + # --------------------------------------------------------------- + # class attributes to define when deriving BaseDatasource + # --------------------------------------------------------------- + __tablename__: str | None = None # {connector_name}_datasource + baselink: str | None = None # 
url portion pointing to ModelView endpoint + + owner_class: User | None = None + + # Used to do code highlighting when displaying the query in the UI + query_language: str | None = None + + # Only some datasources support Row Level Security + is_rls_supported: bool = False + + @property + def name(self) -> str: + # can be a Column or a property pointing to one + raise NotImplementedError() + + # --------------------------------------------------------------- + + # Columns + id = Column(Integer, primary_key=True) + description = Column(Text) + default_endpoint = Column(Text) + is_featured = Column(Boolean, default=False) # TODO deprecating + filter_select_enabled = Column(Boolean, default=True) + offset = Column(Integer, default=0) + cache_timeout = Column(Integer) + params = Column(String(1000)) + perm = Column(String(1000)) + schema_perm = Column(String(1000)) + is_managed_externally = Column(Boolean, nullable=False, default=False) + external_url = Column(Text, nullable=True) + + sql: str | None = None + owners: list[User] + update_from_object_fields: list[str] + + extra_import_fields = ["is_managed_externally", "external_url"] + + @property + def kind(self) -> DatasourceKind: + return DatasourceKind.VIRTUAL if self.sql else DatasourceKind.PHYSICAL + + @property + def owners_data(self) -> list[dict[str, Any]]: + return [ + { + "first_name": o.first_name, + "last_name": o.last_name, + "username": o.username, + "id": o.id, + } + for o in self.owners + ] + + @property + def is_virtual(self) -> bool: + return self.kind == DatasourceKind.VIRTUAL + + @declared_attr + def slices(self) -> RelationshipProperty: + return relationship( + "Slice", + overlaps="table", + primaryjoin=lambda: and_( + foreign(Slice.datasource_id) == self.id, + foreign(Slice.datasource_type) == self.type, + ), + ) + + columns: list[TableColumn] = [] + metrics: list[SqlMetric] = [] + + @property + def type(self) -> str: + raise NotImplementedError() + + @property + def uid(self) -> str: + """Unique 
id across datasource types""" + return f"{self.id}__{self.type}" + + @property + def column_names(self) -> list[str]: + return sorted([c.column_name for c in self.columns], key=lambda x: x or "") + + @property + def columns_types(self) -> dict[str, str]: + return {c.column_name: c.type for c in self.columns} + + @property + def main_dttm_col(self) -> str: + return "timestamp" + + @property + def datasource_name(self) -> str: + raise NotImplementedError() + + @property + def connection(self) -> str | None: + """String representing the context of the Datasource""" + return None + + @property + def schema(self) -> str | None: + """String representing the schema of the Datasource (if it applies)""" + return None + + @property + def filterable_column_names(self) -> list[str]: + return sorted([c.column_name for c in self.columns if c.filterable]) + + @property + def dttm_cols(self) -> list[str]: + return [] + + @property + def url(self) -> str: + return f"/{self.baselink}/edit/{self.id}" + + @property + def explore_url(self) -> str: + if self.default_endpoint: + return self.default_endpoint + return f"/explore/?datasource_type={self.type}&datasource_id={self.id}" + + @property + def column_formats(self) -> dict[str, str | None]: + return {m.metric_name: m.d3format for m in self.metrics if m.d3format} + + @property + def currency_formats(self) -> dict[str, dict[str, str | None] | None]: + return {m.metric_name: m.currency_json for m in self.metrics if m.currency_json} + + def add_missing_metrics(self, metrics: list[SqlMetric]) -> None: + existing_metrics = {m.metric_name for m in self.metrics} + for metric in metrics: + if metric.metric_name not in existing_metrics: + metric.table_id = self.id + self.metrics.append(metric) + + @property + def short_data(self) -> dict[str, Any]: + """Data representation of the datasource sent to the frontend""" + return { + "edit_url": self.url, + "id": self.id, + "uid": self.uid, + "schema": self.schema, + "name": self.name, + "type": 
self.type, + "connection": self.connection, + "creator": str(self.created_by), + } + + @property + def select_star(self) -> str | None: + pass + + @property + def order_by_choices(self) -> list[tuple[str, str]]: + choices = [] + # self.column_names return sorted column_names + for column_name in self.column_names: + column_name = str(column_name or "") + choices.append( + (json.dumps([column_name, True]), f"{column_name} " + __("[asc]")) + ) + choices.append( + (json.dumps([column_name, False]), f"{column_name} " + __("[desc]")) + ) + return choices + + @property + def verbose_map(self) -> dict[str, str]: + verb_map = {"__timestamp": "Time"} + verb_map.update( + {o.metric_name: o.verbose_name or o.metric_name for o in self.metrics} + ) + verb_map.update( + {o.column_name: o.verbose_name or o.column_name for o in self.columns} + ) + return verb_map + + @property + def data(self) -> dict[str, Any]: + """Data representation of the datasource sent to the frontend""" + return { + # simple fields + "id": self.id, + "uid": self.uid, + "column_formats": self.column_formats, + "currency_formats": self.currency_formats, + "description": self.description, + "database": self.database.data, # pylint: disable=no-member + "default_endpoint": self.default_endpoint, + "filter_select": self.filter_select_enabled, # TODO deprecate + "filter_select_enabled": self.filter_select_enabled, + "name": self.name, + "datasource_name": self.datasource_name, + "table_name": self.datasource_name, + "type": self.type, + "schema": self.schema, + "offset": self.offset, + "cache_timeout": self.cache_timeout, + "params": self.params, + "perm": self.perm, + "edit_url": self.url, + # sqla-specific + "sql": self.sql, + # one to many + "columns": [o.data for o in self.columns], + "metrics": [o.data for o in self.metrics], + # TODO deprecate, move logic to JS + "order_by_choices": self.order_by_choices, + "owners": [owner.id for owner in self.owners], + "verbose_map": self.verbose_map, + "select_star": 
self.select_star, + } + + def data_for_slices( # pylint: disable=too-many-locals + self, slices: list[Slice] + ) -> dict[str, Any]: + """ + The representation of the datasource containing only the required data + to render the provided slices. + + Used to reduce the payload when loading a dashboard. + """ + data = self.data + metric_names = set() + column_names = set() + for slc in slices: + form_data = slc.form_data + # pull out all required metrics from the form_data + for metric_param in METRIC_FORM_DATA_PARAMS: + for metric in utils.as_list(form_data.get(metric_param) or []): + metric_names.add(utils.get_metric_name(metric)) + if utils.is_adhoc_metric(metric): + column_ = metric.get("column") or {} + if column_name := column_.get("column_name"): + column_names.add(column_name) + + # Columns used in query filters + column_names.update( + filter_["subject"] + for filter_ in form_data.get("adhoc_filters") or [] + if filter_.get("clause") == "WHERE" and filter_.get("subject") + ) + + # columns used by Filter Box + column_names.update( + filter_config["column"] + for filter_config in form_data.get("filter_configs") or [] + if "column" in filter_config + ) + + # for legacy dashboard imports which have the wrong query_context in them + try: + query_context = slc.get_query_context() + except DatasetNotFoundError: + query_context = None + + # legacy charts don't have a query_context + if query_context: + column_names.update( + [ + utils.get_column_name(column_) + for query in query_context.queries + for column_ in query.columns + ] + or [] + ) + else: + _columns = [ + utils.get_column_name(column_) + if utils.is_adhoc_column(column_) + else column_ + for column_param in COLUMN_FORM_DATA_PARAMS + for column_ in utils.as_list(form_data.get(column_param) or []) + ] + column_names.update(_columns) + + filtered_metrics = [ + metric + for metric in data["metrics"] + if metric["metric_name"] in metric_names + ] + + filtered_columns: list[Column] = [] + column_types:
set[GenericDataType] = set() + for column_ in data["columns"]: + generic_type = column_.get("type_generic") + if generic_type is not None: + column_types.add(generic_type) + if column_["column_name"] in column_names: + filtered_columns.append(column_) + + data["column_types"] = list(column_types) + del data["description"] + data.update({"metrics": filtered_metrics}) + data.update({"columns": filtered_columns}) + verbose_map = {"__timestamp": "Time"} + verbose_map.update( + { + metric["metric_name"]: metric["verbose_name"] or metric["metric_name"] + for metric in filtered_metrics + } + ) + verbose_map.update( + { + column_["column_name"]: column_["verbose_name"] + or column_["column_name"] + for column_ in filtered_columns + } + ) + data["verbose_map"] = verbose_map + + return data + + @staticmethod + def filter_values_handler( # pylint: disable=too-many-arguments + values: FilterValues | None, + operator: str, + target_generic_type: GenericDataType, + target_native_type: str | None = None, + is_list_target: bool = False, + db_engine_spec: builtins.type[BaseEngineSpec] | None = None, + db_extra: dict[str, Any] | None = None, + ) -> FilterValues | None: + if values is None: + return None + + def handle_single_value(value: FilterValue | None) -> FilterValue | None: + if operator == utils.FilterOperator.TEMPORAL_RANGE: + return value + if ( + isinstance(value, (float, int)) + and target_generic_type == utils.GenericDataType.TEMPORAL + and target_native_type is not None + and db_engine_spec is not None + ): + value = db_engine_spec.convert_dttm( + target_type=target_native_type, + dttm=datetime.utcfromtimestamp(value / 1000), + db_extra=db_extra, + ) + value = literal_column(value) + if isinstance(value, str): + value = value.strip("\t\n") + + if ( + target_generic_type == utils.GenericDataType.NUMERIC + and operator + not in { + utils.FilterOperator.ILIKE, + utils.FilterOperator.LIKE, + } + ): + # For backwards compatibility and edge cases + # where a column data type 
might have changed + return utils.cast_to_num(value) + if value == NULL_STRING: + return None + if value == EMPTY_STRING: + return "" + if target_generic_type == utils.GenericDataType.BOOLEAN: + return utils.cast_to_boolean(value) + return value + + if isinstance(values, (list, tuple)): + values = [handle_single_value(v) for v in values] # type: ignore + else: + values = handle_single_value(values) + if is_list_target and not isinstance(values, (tuple, list)): + values = [values] # type: ignore + elif not is_list_target and isinstance(values, (tuple, list)): + values = values[0] if values else None + return values + + def external_metadata(self) -> list[ResultSetColumnType]: + """Returns column information from the external system""" + raise NotImplementedError() + + def get_query_str(self, query_obj: QueryObjectDict) -> str: + """Returns a query as a string + + This is used to be displayed to the user so that they can + understand what is taking place behind the scene""" + raise NotImplementedError() + + def query(self, query_obj: QueryObjectDict) -> QueryResult: + """Executes the query and returns a dataframe + + query_obj is a dictionary representing Superset's query interface. 
+ Should return a ``superset.models.helpers.QueryResult`` + """ + raise NotImplementedError() + + @staticmethod + def default_query(qry: Query) -> Query: + return qry + + def get_column(self, column_name: str | None) -> TableColumn | None: + if not column_name: + return None + for col in self.columns: + if col.column_name == column_name: + return col + return None + + @staticmethod + def get_fk_many_from_list( + object_list: list[Any], + fkmany: list[Column], + fkmany_class: builtins.type[TableColumn | SqlMetric], + key_attr: str, + ) -> list[Column]: + """Update ORM one-to-many list from object list + + Used for syncing metrics and columns using the same code""" + + object_dict = {o.get(key_attr): o for o in object_list} + + # delete fks that have been removed + fkmany = [o for o in fkmany if getattr(o, key_attr) in object_dict] + + # sync existing fks + for fk in fkmany: + obj = object_dict.get(getattr(fk, key_attr)) + if obj: + for attr in fkmany_class.update_from_object_fields: + setattr(fk, attr, obj.get(attr)) + + # create new fks + new_fks = [] + orm_keys = [getattr(o, key_attr) for o in fkmany] + for obj in object_list: + key = obj.get(key_attr) + if key not in orm_keys: + del obj["id"] + orm_kwargs = {} + for k in obj: + if k in fkmany_class.update_from_object_fields and k in obj: + orm_kwargs[k] = obj[k] + new_obj = fkmany_class(**orm_kwargs) + new_fks.append(new_obj) + fkmany += new_fks + return fkmany + + def update_from_object(self, obj: dict[str, Any]) -> None: + """Update datasource from a data structure + + The UI's table editor crafts a complex data structure that + contains most of the datasource's properties as well as + an array of metrics and columns objects. This method + receives the object from the UI and syncs the datasource to + match it. 
Since the fields are different for the different + connectors, the implementation uses ``update_from_object_fields`` + which can be defined for each connector and + defines which fields should be synced""" + for attr in self.update_from_object_fields: + setattr(self, attr, obj.get(attr)) + + self.owners = obj.get("owners", []) + + # Syncing metrics + metrics = ( + self.get_fk_many_from_list( + obj["metrics"], self.metrics, SqlMetric, "metric_name" + ) + if "metrics" in obj + else [] + ) + self.metrics = metrics + + # Syncing columns + self.columns = ( + self.get_fk_many_from_list( + obj["columns"], self.columns, TableColumn, "column_name" + ) + if "columns" in obj + else [] + ) + + def get_extra_cache_keys( + self, query_obj: QueryObjectDict # pylint: disable=unused-argument + ) -> list[Hashable]: + """If a datasource needs to provide additional keys for calculation of + cache keys, those can be provided via this method + + :param query_obj: The dict representation of a query object + :return: list of keys + """ + return [] + + def __hash__(self) -> int: + return hash(self.uid) + + def __eq__(self, other: object) -> bool: + if not isinstance(other, BaseDatasource): + return NotImplemented + return self.uid == other.uid + + def raise_for_access(self) -> None: + """ + Raise an exception if the user cannot access the resource. + + :raises SupersetSecurityException: If the user cannot access the resource + """ + + security_manager.raise_for_access(datasource=self) + + @classmethod + def get_datasource_by_name( + cls, session: Session, datasource_name: str, schema: str, database_name: str + ) -> BaseDatasource | None: + raise NotImplementedError() + + class AnnotationDatasource(BaseDatasource): """Dummy object so we can query annotations using 'Viz' objects just like regular datasources. 
@@ -188,22 +757,33 @@ def values_for_column(self, column_name: str, limit: int = 10000) -> list[Any]: raise NotImplementedError() -class TableColumn(Model, BaseColumn, CertificationMixin): +class TableColumn(Model, AuditMixinNullable, ImportExportMixin, CertificationMixin): """ORM object for table columns, each table can have multiple columns""" __tablename__ = "table_columns" __table_args__ = (UniqueConstraint("table_id", "column_name"),) + + id = Column(Integer, primary_key=True) + column_name = Column(String(255), nullable=False) + verbose_name = Column(String(1024)) + is_active = Column(Boolean, default=True) + type = Column(Text) + advanced_data_type = Column(String(255)) + groupby = Column(Boolean, default=True) + filterable = Column(Boolean, default=True) + description = Column(MediumText()) table_id = Column(Integer, ForeignKey("tables.id", ondelete="CASCADE")) - table: Mapped[SqlaTable] = relationship( - "SqlaTable", - back_populates="columns", - ) is_dttm = Column(Boolean, default=False) expression = Column(MediumText()) python_date_format = Column(String(255)) extra = Column(Text) + table: Mapped[SqlaTable] = relationship( + "SqlaTable", + back_populates="columns", + ) + export_fields = [ "table_id", "column_name", @@ -247,6 +827,9 @@ def init_on_load(self) -> None: self._database = None + def __repr__(self) -> str: + return str(self.column_name) + @property def is_boolean(self) -> bool: """ @@ -285,7 +868,7 @@ def database(self) -> Database: return self.table.database if self.table else self._database @property - def db_engine_spec(self) -> type[BaseEngineSpec]: + def db_engine_spec(self) -> builtins.type[BaseEngineSpec]: return self.database.db_engine_spec @property @@ -367,44 +950,50 @@ def get_timestamp_expression( @property def data(self) -> dict[str, Any]: attrs = ( - "id", + "advanced_data_type", + "certification_details", + "certified_by", "column_name", - "verbose_name", "description", "expression", "filterable", "groupby", + "id", + 
"is_certified", "is_dttm", + "python_date_format", "type", "type_generic", - "advanced_data_type", - "python_date_format", - "is_certified", - "certified_by", - "certification_details", + "verbose_name", "warning_markdown", ) - attr_dict = {s: getattr(self, s) for s in attrs if hasattr(self, s)} + return {s: getattr(self, s) for s in attrs if hasattr(self, s)} - attr_dict.update(super().data) - return attr_dict - - -class SqlMetric(Model, BaseMetric, CertificationMixin): +class SqlMetric(Model, AuditMixinNullable, ImportExportMixin, CertificationMixin): """ORM object for metrics, each table can have multiple metrics""" __tablename__ = "sql_metrics" __table_args__ = (UniqueConstraint("table_id", "metric_name"),) + + id = Column(Integer, primary_key=True) + metric_name = Column(String(255), nullable=False) + verbose_name = Column(String(1024)) + metric_type = Column(String(32)) + description = Column(MediumText()) + d3format = Column(String(128)) + currency = Column(String(128)) + warning_text = Column(Text) table_id = Column(Integer, ForeignKey("tables.id", ondelete="CASCADE")) + expression = Column(MediumText(), nullable=False) + extra = Column(Text) + table: Mapped[SqlaTable] = relationship( "SqlaTable", back_populates="metrics", ) - expression = Column(MediumText(), nullable=False) - extra = Column(Text) export_fields = [ "metric_name", @@ -450,18 +1039,34 @@ def perm(self) -> str | None: def get_perm(self) -> str | None: return self.perm + @property + def currency_json(self) -> dict[str, str | None] | None: + try: + return json.loads(self.currency or "{}") or None + except (TypeError, JSONDecodeError) as exc: + logger.error( + "Unable to load currency json: %r. 
Leaving empty.", exc, exc_info=True + ) + return None + @property def data(self) -> dict[str, Any]: attrs = ( - "is_certified", - "certified_by", "certification_details", + "certified_by", + "currency", + "d3format", + "description", + "expression", + "id", + "is_certified", + "metric_name", "warning_markdown", + "warning_text", + "verbose_name", ) - attr_dict = {s: getattr(self, s) for s in attrs} - attr_dict.update(super().data) - return attr_dict + return {s: getattr(self, s) for s in attrs} sqlatable_user = Table( @@ -793,34 +1398,6 @@ def get_fetch_values_predicate( ) ) from ex - def values_for_column(self, column_name: str, limit: int = 10000) -> list[Any]: - """Runs query against sqla to retrieve some - sample values for the given column. - """ - cols = {col.column_name: col for col in self.columns} - target_col = cols[column_name] - tp = self.get_template_processor() - tbl, cte = self.get_from_clause(tp) - - qry = ( - select([target_col.get_sqla_col(template_processor=tp)]) - .select_from(tbl) - .distinct() - ) - if limit: - qry = qry.limit(limit) - - if self.fetch_values_predicate: - qry = qry.where(self.get_fetch_values_predicate(template_processor=tp)) - - with self.database.get_sqla_engine_with_context() as engine: - sql = qry.compile(engine, compile_kwargs={"literal_binds": True}) - sql = self._apply_cte(sql, cte) - sql = self.mutate_query_from_config(sql) - - df = pd.read_sql_query(sql=sql, con=engine) - return df[column_name].to_list() - def mutate_query_from_config(self, sql: str) -> str: """Apply config's SQL_QUERY_MUTATOR diff --git a/superset/connectors/sqla/views.py b/superset/connectors/sqla/views.py index 1ba10f18b216a..36eebcb3f7e16 100644 --- a/superset/connectors/sqla/views.py +++ b/superset/connectors/sqla/views.py @@ -28,7 +28,6 @@ from wtforms.validators import DataRequired, Regexp from superset import db -from superset.connectors.base.views import DatasourceModelView from superset.connectors.sqla import models from superset.constants 
import MODEL_VIEW_RW_METHOD_PERMISSION_MAP, RouteMethod from superset.superset_typing import FlaskResponse @@ -282,7 +281,7 @@ def list(self) -> FlaskResponse: class TableModelView( # pylint: disable=too-many-ancestors - DatasourceModelView, DeleteMixin, YamlExportMixin + SupersetModelView, DeleteMixin, YamlExportMixin ): datamodel = SQLAInterface(models.SqlaTable) class_permission_name = "Dataset" diff --git a/superset/css_templates/api.py b/superset/css_templates/api.py index ee5d5fac703ef..ac222da66f815 100644 --- a/superset/css_templates/api.py +++ b/superset/css_templates/api.py @@ -22,12 +22,12 @@ from flask_appbuilder.models.sqla.interface import SQLAInterface from flask_babel import ngettext -from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod -from superset.css_templates.commands.delete import DeleteCssTemplateCommand -from superset.css_templates.commands.exceptions import ( +from superset.commands.css.delete import DeleteCssTemplateCommand +from superset.commands.css.exceptions import ( CssTemplateDeleteFailedError, CssTemplateNotFoundError, ) +from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod from superset.css_templates.filters import CssTemplateAllTextFilter from superset.css_templates.schemas import ( get_delete_ids_schema, @@ -54,6 +54,10 @@ class CssTemplateRestApi(BaseSupersetModelRestApi): allow_browser_login = True show_columns = [ + "changed_on_delta_humanized", + "changed_by.first_name", + "changed_by.id", + "changed_by.last_name", "created_by.first_name", "created_by.id", "created_by.last_name", @@ -79,7 +83,7 @@ class CssTemplateRestApi(BaseSupersetModelRestApi): order_columns = ["template_name"] search_filters = {"template_name": [CssTemplateAllTextFilter]} - allowed_rel_fields = {"created_by"} + allowed_rel_fields = {"created_by", "changed_by"} apispec_parameter_schemas = { "get_delete_ids_schema": get_delete_ids_schema, diff --git a/superset/daos/annotation.py 
b/superset/daos/annotation_layer.py similarity index 100% rename from superset/daos/annotation.py rename to superset/daos/annotation_layer.py diff --git a/superset/daos/base.py b/superset/daos/base.py index d2c1842c17fdb..1133a76a1ed06 100644 --- a/superset/daos/base.py +++ b/superset/daos/base.py @@ -16,7 +16,7 @@ # under the License. from __future__ import annotations -from typing import Any, cast, Generic, get_args, TypeVar +from typing import Any, Generic, get_args, TypeVar from flask_appbuilder.models.filters import BaseFilter from flask_appbuilder.models.sqla import Model @@ -30,7 +30,6 @@ DAOUpdateFailedError, ) from superset.extensions import db -from superset.utils.core import as_list T = TypeVar("T", bound=Model) @@ -197,9 +196,9 @@ def update( return item # type: ignore @classmethod - def delete(cls, item_or_items: T | list[T], commit: bool = True) -> None: + def delete(cls, items: list[T], commit: bool = True) -> None: """ - Delete the specified item(s) including their associated relationships. + Delete the specified items including their associated relationships. Note that bulk deletion via `delete` is not invoked in the base class as this does not dispatch the ORM `after_delete` event which may be required to augment @@ -209,12 +208,12 @@ def delete(cls, item_or_items: T | list[T], commit: bool = True) -> None: Subclasses may invoke bulk deletion but are responsible for instrumenting any post-deletion logic. 
- :param items: The item(s) to delete + :param items: The items to delete :param commit: Whether to commit the transaction :raises DAODeleteFailedError: If the deletion failed :see: https://docs.sqlalchemy.org/en/latest/orm/queryguide/dml.html """ - items = cast(list[T], as_list(item_or_items)) + try: for item in items: db.session.delete(item) diff --git a/superset/daos/chart.py b/superset/daos/chart.py index 7eae38cb0ecad..eb8b3e809e492 100644 --- a/superset/daos/chart.py +++ b/superset/daos/chart.py @@ -28,7 +28,7 @@ from superset.utils.core import get_user_id if TYPE_CHECKING: - from superset.connectors.base.models import BaseDatasource + from superset.connectors.sqla.models import BaseDatasource logger = logging.getLogger(__name__) diff --git a/superset/daos/dashboard.py b/superset/daos/dashboard.py index 77f2dd9f343bf..b98252070dc4f 100644 --- a/superset/daos/dashboard.py +++ b/superset/daos/dashboard.py @@ -25,12 +25,12 @@ from flask_appbuilder.models.sqla.interface import SQLAInterface from superset import is_feature_enabled, security_manager -from superset.daos.base import BaseDAO -from superset.dashboards.commands.exceptions import ( +from superset.commands.dashboard.exceptions import ( DashboardAccessDeniedError, DashboardForbiddenError, DashboardNotFoundError, ) +from superset.daos.base import BaseDAO from superset.dashboards.filter_sets.consts import ( DASHBOARD_ID_FIELD, DESCRIPTION_FIELD, diff --git a/superset/daos/tag.py b/superset/daos/tag.py index 2acd221a35f72..e4aa89181644d 100644 --- a/superset/daos/tag.py +++ b/superset/daos/tag.py @@ -21,21 +21,21 @@ from flask import g from sqlalchemy.exc import SQLAlchemyError +from superset.commands.tag.exceptions import TagNotFoundError +from superset.commands.tag.utils import to_object_type from superset.daos.base import BaseDAO -from superset.daos.exceptions import DAOCreateFailedError, DAODeleteFailedError +from superset.daos.exceptions import DAODeleteFailedError from superset.exceptions import 
MissingUserContextException from superset.extensions import db from superset.models.dashboard import Dashboard from superset.models.slice import Slice from superset.models.sql_lab import SavedQuery -from superset.tags.commands.exceptions import TagNotFoundError -from superset.tags.commands.utils import to_object_type from superset.tags.models import ( get_tag, - ObjectTypes, + ObjectType, Tag, TaggedObject, - TagTypes, + TagType, user_favorite_tag_table, ) from superset.utils.core import get_user_id @@ -46,25 +46,13 @@ class TagDAO(BaseDAO[Tag]): # base_filter = TagAccessFilter - @staticmethod - def validate_tag_name(tag_name: str) -> bool: - invalid_characters = [":", ","] - for invalid_character in invalid_characters: - if invalid_character in tag_name: - return False - return True - @staticmethod def create_custom_tagged_objects( - object_type: ObjectTypes, object_id: int, tag_names: list[str] + object_type: ObjectType, object_id: int, tag_names: list[str] ) -> None: tagged_objects = [] for name in tag_names: - if not TagDAO.validate_tag_name(name): - raise DAOCreateFailedError( - message="Invalid Tag Name (cannot contain ':' or ',')" - ) - type_ = TagTypes.custom + type_ = TagType.custom tag_name = name.strip() tag = TagDAO.get_by_name(tag_name, type_) tagged_objects.append( @@ -76,7 +64,7 @@ def create_custom_tagged_objects( @staticmethod def delete_tagged_object( - object_type: ObjectTypes, object_id: int, tag_name: str + object_type: ObjectType, object_id: int, tag_name: str ) -> None: """ deletes a tagged object by the object_id, object_type, and tag_name @@ -128,7 +116,7 @@ def delete_tags(tag_names: list[str]) -> None: raise DAODeleteFailedError(exception=ex) from ex @staticmethod - def get_by_name(name: str, type_: TagTypes = TagTypes.custom) -> Tag: + def get_by_name(name: str, type_: TagType = TagType.custom) -> Tag: """ returns a tag if one exists by that name, none otherwise. important!: Creates a tag by that name if the tag is not found. 
@@ -152,7 +140,7 @@ def find_by_name(name: str) -> Tag: @staticmethod def find_tagged_object( - object_type: ObjectTypes, object_id: int, tag_id: int + object_type: ObjectType, object_id: int, tag_id: int ) -> TaggedObject: """ returns a tagged object if one exists by that name, none otherwise. @@ -167,6 +155,14 @@ def find_tagged_object( .first() ) + @staticmethod + def get_tagged_objects_by_tag_id( + tag_ids: Optional[list[int]], obj_types: Optional[list[str]] = None + ) -> list[dict[str, Any]]: + tags = db.session.query(Tag).filter(Tag.id.in_(tag_ids)).all() + tag_names = [tag.name for tag in tags] + return TagDAO.get_tagged_objects_for_tags(tag_names, obj_types) + @staticmethod def get_tagged_objects_for_tags( tags: Optional[list[str]] = None, obj_types: Optional[list[str]] = None @@ -185,7 +181,7 @@ def get_tagged_objects_for_tags( TaggedObject, and_( TaggedObject.object_id == Dashboard.id, - TaggedObject.object_type == ObjectTypes.dashboard, + TaggedObject.object_type == ObjectType.dashboard, ), ) .join(Tag, TaggedObject.tag_id == Tag.id) @@ -195,7 +191,7 @@ def get_tagged_objects_for_tags( results.extend( { "id": obj.id, - "type": ObjectTypes.dashboard.name, + "type": ObjectType.dashboard.name, "name": obj.dashboard_title, "url": obj.url, "changed_on": obj.changed_on, @@ -215,7 +211,7 @@ def get_tagged_objects_for_tags( TaggedObject, and_( TaggedObject.object_id == Slice.id, - TaggedObject.object_type == ObjectTypes.chart, + TaggedObject.object_type == ObjectType.chart, ), ) .join(Tag, TaggedObject.tag_id == Tag.id) @@ -224,7 +220,7 @@ def get_tagged_objects_for_tags( results.extend( { "id": obj.id, - "type": ObjectTypes.chart.name, + "type": ObjectType.chart.name, "name": obj.slice_name, "url": obj.url, "changed_on": obj.changed_on, @@ -244,7 +240,7 @@ def get_tagged_objects_for_tags( TaggedObject, and_( TaggedObject.object_id == SavedQuery.id, - TaggedObject.object_type == ObjectTypes.query, + TaggedObject.object_type == ObjectType.query, ), ) .join(Tag, 
TaggedObject.tag_id == Tag.id) @@ -253,7 +249,7 @@ def get_tagged_objects_for_tags( results.extend( { "id": obj.id, - "type": ObjectTypes.query.name, + "type": ObjectType.query.name, "name": obj.label, "url": obj.url(), "changed_on": obj.changed_on, @@ -363,7 +359,7 @@ def favorited_ids(tags: list[Tag]) -> list[int]: @staticmethod def create_tag_relationship( - objects_to_tag: list[tuple[ObjectTypes, int]], + objects_to_tag: list[tuple[ObjectType, int]], tag: Tag, bulk_create: bool = False, ) -> None: @@ -373,7 +369,7 @@ def create_tag_relationship( and an id, and creates a TaggedObject for each one, associating it with the provided tag. All created TaggedObjects are collected in a list. Args: - objects_to_tag (List[Tuple[ObjectTypes, int]]): A list of tuples, each + objects_to_tag (List[Tuple[ObjectType, int]]): A list of tuples, each containing an ObjectType and an id, representing the objects to be tagged. tag (Tag): The tag to be associated with the specified objects. @@ -409,7 +405,9 @@ def create_tag_relationship( for object_type, object_id in tagged_objects_to_delete: # delete objects that were removed TagDAO.delete_tagged_object( - object_type, object_id, tag.name # type: ignore + object_type, # type: ignore + object_id, + tag.name, ) db.session.add_all(tagged_objects) diff --git a/superset/dashboards/api.py b/superset/dashboards/api.py index b2aa43b0ee41b..cf75a644fbb73 100644 --- a/superset/dashboards/api.py +++ b/superset/dashboards/api.py @@ -35,13 +35,9 @@ from superset import is_feature_enabled, thumbnail_cache from superset.charts.schemas import ChartEntityResponseSchema -from superset.commands.importers.exceptions import NoValidFilesFoundError -from superset.commands.importers.v1.utils import get_contents_from_bundle -from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod -from superset.daos.dashboard import DashboardDAO, EmbeddedDashboardDAO -from superset.dashboards.commands.create import CreateDashboardCommand -from 
superset.dashboards.commands.delete import DeleteDashboardCommand -from superset.dashboards.commands.exceptions import ( +from superset.commands.dashboard.create import CreateDashboardCommand +from superset.commands.dashboard.delete import DeleteDashboardCommand +from superset.commands.dashboard.exceptions import ( DashboardAccessDeniedError, DashboardCreateFailedError, DashboardDeleteFailedError, @@ -50,9 +46,13 @@ DashboardNotFoundError, DashboardUpdateFailedError, ) -from superset.dashboards.commands.export import ExportDashboardsCommand -from superset.dashboards.commands.importers.dispatcher import ImportDashboardsCommand -from superset.dashboards.commands.update import UpdateDashboardCommand +from superset.commands.dashboard.export import ExportDashboardsCommand +from superset.commands.dashboard.importers.dispatcher import ImportDashboardsCommand +from superset.commands.dashboard.update import UpdateDashboardCommand +from superset.commands.importers.exceptions import NoValidFilesFoundError +from superset.commands.importers.v1.utils import get_contents_from_bundle +from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod +from superset.daos.dashboard import DashboardDAO, EmbeddedDashboardDAO from superset.dashboards.filters import ( DashboardAccessFilter, DashboardCertifiedFilter, @@ -261,7 +261,7 @@ def ensure_thumbnails_enabled(self) -> Optional[Response]: "roles": RelatedFieldFilter("name", FilterRelatedRoles), "created_by": RelatedFieldFilter("first_name", FilterRelatedOwners), } - allowed_rel_fields = {"owners", "roles", "created_by"} + allowed_rel_fields = {"owners", "roles", "created_by", "changed_by"} openapi_spec_tag = "Dashboards" """ Override the name set for this collection of endpoints """ @@ -1349,8 +1349,7 @@ def delete_embedded(self, dashboard: Dashboard) -> Response: 500: $ref: '#/components/responses/500' """ - for embedded in dashboard.embedded: - EmbeddedDashboardDAO.delete(embedded) + 
EmbeddedDashboardDAO.delete(dashboard.embedded) return self.response(200, message="OK") @expose("/<id_or_slug>/copy/", methods=("POST",)) diff --git a/superset/dashboards/filter_sets/api.py b/superset/dashboards/filter_sets/api.py index 5a2bf01923f32..ee7297ef4c80b 100644 --- a/superset/dashboards/filter_sets/api.py +++ b/superset/dashboards/filter_sets/api.py @@ -29,12 +29,10 @@ from flask_appbuilder.models.sqla.interface import SQLAInterface from marshmallow import ValidationError -from superset.commands.exceptions import ObjectNotFoundError -from superset.daos.dashboard import DashboardDAO -from superset.dashboards.commands.exceptions import DashboardNotFoundError -from superset.dashboards.filter_sets.commands.create import CreateFilterSetCommand -from superset.dashboards.filter_sets.commands.delete import DeleteFilterSetCommand -from superset.dashboards.filter_sets.commands.exceptions import ( +from superset.commands.dashboard.exceptions import DashboardNotFoundError +from superset.commands.dashboard.filter_set.create import CreateFilterSetCommand +from superset.commands.dashboard.filter_set.delete import DeleteFilterSetCommand +from superset.commands.dashboard.filter_set.exceptions import ( FilterSetCreateFailedError, FilterSetDeleteFailedError, FilterSetForbiddenError, @@ -42,7 +40,9 @@ FilterSetUpdateFailedError, UserIsNotDashboardOwnerError, ) -from superset.dashboards.filter_sets.commands.update import UpdateFilterSetCommand +from superset.commands.dashboard.filter_set.update import UpdateFilterSetCommand +from superset.commands.exceptions import ObjectNotFoundError +from superset.daos.dashboard import DashboardDAO from superset.dashboards.filter_sets.consts import ( DASHBOARD_FIELD, DASHBOARD_ID_FIELD, diff --git a/superset/dashboards/filter_state/api.py b/superset/dashboards/filter_state/api.py index 9e0720646a05c..d3b6ce8f7a14d 100644 --- a/superset/dashboards/filter_state/api.py +++ b/superset/dashboards/filter_state/api.py @@ -19,10 +19,10 @@ from 
flask import Response from flask_appbuilder.api import expose, protect, safe -from superset.dashboards.filter_state.commands.create import CreateFilterStateCommand -from superset.dashboards.filter_state.commands.delete import DeleteFilterStateCommand -from superset.dashboards.filter_state.commands.get import GetFilterStateCommand -from superset.dashboards.filter_state.commands.update import UpdateFilterStateCommand +from superset.commands.dashboard.filter_state.create import CreateFilterStateCommand +from superset.commands.dashboard.filter_state.delete import DeleteFilterStateCommand +from superset.commands.dashboard.filter_state.get import GetFilterStateCommand +from superset.commands.dashboard.filter_state.update import UpdateFilterStateCommand from superset.extensions import event_logger from superset.temporary_cache.api import TemporaryCacheRestApi diff --git a/superset/dashboards/permalink/api.py b/superset/dashboards/permalink/api.py index 0a786d1def2f9..a6ae2910f42a1 100644 --- a/superset/dashboards/permalink/api.py +++ b/superset/dashboards/permalink/api.py @@ -20,15 +20,13 @@ from flask_appbuilder.api import expose, protect, safe from marshmallow import ValidationError -from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP -from superset.dashboards.commands.exceptions import ( +from superset.commands.dashboard.exceptions import ( DashboardAccessDeniedError, DashboardNotFoundError, ) -from superset.dashboards.permalink.commands.create import ( - CreateDashboardPermalinkCommand, -) -from superset.dashboards.permalink.commands.get import GetDashboardPermalinkCommand +from superset.commands.dashboard.permalink.create import CreateDashboardPermalinkCommand +from superset.commands.dashboard.permalink.get import GetDashboardPermalinkCommand +from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP from superset.dashboards.permalink.exceptions import DashboardPermalinkInvalidStateError from superset.dashboards.permalink.schemas import 
DashboardPermalinkStateSchema from superset.extensions import event_logger diff --git a/superset/dashboards/schemas.py b/superset/dashboards/schemas.py index e467167297338..615d830d42420 100644 --- a/superset/dashboards/schemas.py +++ b/superset/dashboards/schemas.py @@ -18,11 +18,12 @@ import re from typing import Any, Union -from marshmallow import fields, post_load, pre_load, Schema +from marshmallow import fields, post_dump, post_load, pre_load, Schema from marshmallow.validate import Length, ValidationError +from superset import security_manager from superset.exceptions import SupersetException -from superset.tags.models import TagTypes +from superset.tags.models import TagType from superset.utils import core as utils get_delete_ids_schema = {"type": "array", "items": {"type": "integer"}} @@ -169,7 +170,7 @@ class RolesSchema(Schema): class TagSchema(Schema): id = fields.Int() name = fields.String() - type = fields.Enum(TagTypes, by_value=True) + type = fields.Enum(TagType, by_value=True) class DashboardGetResponseSchema(Schema): @@ -198,6 +199,15 @@ class DashboardGetResponseSchema(Schema): changed_on_humanized = fields.String(data_key="changed_on_delta_humanized") is_managed_externally = fields.Boolean(allow_none=True, dump_default=False) + # pylint: disable=unused-argument + @post_dump() + def post_dump(self, serialized: dict[str, Any], **kwargs: Any) -> dict[str, Any]: + if security_manager.is_guest_user(): + del serialized["owners"] + del serialized["changed_by_name"] + del serialized["changed_by"] + return serialized + class DatabaseSchema(Schema): id = fields.Int() @@ -247,6 +257,14 @@ class DashboardDatasetSchema(Schema): normalize_columns = fields.Bool() always_filter_main_dttm = fields.Bool() + # pylint: disable=unused-argument + @post_dump() + def post_dump(self, serialized: dict[str, Any], **kwargs: Any) -> dict[str, Any]: + if security_manager.is_guest_user(): + del serialized["owners"] + del serialized["database"] + return serialized + class 
BaseDashboardSchema(Schema): # pylint: disable=unused-argument diff --git a/superset/databases/api.py b/superset/databases/api.py index 116e2ddb1fa88..8de84a16af6dc 100644 --- a/superset/databases/api.py +++ b/superset/databases/api.py @@ -29,16 +29,9 @@ from sqlalchemy.exc import NoSuchTableError, OperationalError, SQLAlchemyError from superset import app, event_logger -from superset.commands.importers.exceptions import ( - IncorrectFormatError, - NoValidFilesFoundError, -) -from superset.commands.importers.v1.utils import get_contents_from_bundle -from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod -from superset.daos.database import DatabaseDAO -from superset.databases.commands.create import CreateDatabaseCommand -from superset.databases.commands.delete import DeleteDatabaseCommand -from superset.databases.commands.exceptions import ( +from superset.commands.database.create import CreateDatabaseCommand +from superset.commands.database.delete import DeleteDatabaseCommand +from superset.commands.database.exceptions import ( DatabaseConnectionFailedError, DatabaseCreateFailedError, DatabaseDeleteDatasetsExistFailedError, @@ -49,13 +42,26 @@ DatabaseUpdateFailedError, InvalidParametersError, ) -from superset.databases.commands.export import ExportDatabasesCommand -from superset.databases.commands.importers.dispatcher import ImportDatabasesCommand -from superset.databases.commands.tables import TablesDatabaseCommand -from superset.databases.commands.test_connection import TestConnectionDatabaseCommand -from superset.databases.commands.update import UpdateDatabaseCommand -from superset.databases.commands.validate import ValidateDatabaseParametersCommand -from superset.databases.commands.validate_sql import ValidateSQLCommand +from superset.commands.database.export import ExportDatabasesCommand +from superset.commands.database.importers.dispatcher import ImportDatabasesCommand +from superset.commands.database.ssh_tunnel.delete import 
DeleteSSHTunnelCommand +from superset.commands.database.ssh_tunnel.exceptions import ( + SSHTunnelDeleteFailedError, + SSHTunnelingNotEnabledError, + SSHTunnelNotFoundError, +) +from superset.commands.database.tables import TablesDatabaseCommand +from superset.commands.database.test_connection import TestConnectionDatabaseCommand +from superset.commands.database.update import UpdateDatabaseCommand +from superset.commands.database.validate import ValidateDatabaseParametersCommand +from superset.commands.database.validate_sql import ValidateSQLCommand +from superset.commands.importers.exceptions import ( + IncorrectFormatError, + NoValidFilesFoundError, +) +from superset.commands.importers.v1.utils import get_contents_from_bundle +from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod +from superset.daos.database import DatabaseDAO from superset.databases.decorators import check_datasource_access from superset.databases.filters import DatabaseFilter, DatabaseUploadEnabledFilter from superset.databases.schemas import ( @@ -79,12 +85,6 @@ ValidateSQLRequest, ValidateSQLResponse, ) -from superset.databases.ssh_tunnel.commands.delete import DeleteSSHTunnelCommand -from superset.databases.ssh_tunnel.commands.exceptions import ( - SSHTunnelDeleteFailedError, - SSHTunnelingNotEnabledError, - SSHTunnelNotFoundError, -) from superset.databases.utils import get_table_metadata from superset.db_engine_specs import get_available_engine_specs from superset.errors import ErrorLevel, SupersetError, SupersetErrorType @@ -111,6 +111,7 @@ class DatabaseRestApi(BaseSupersetModelRestApi): include_route_methods = RouteMethod.REST_MODEL_VIEW_CRUD_SET | { RouteMethod.EXPORT, RouteMethod.IMPORT, + RouteMethod.RELATED, "tables", "table_metadata", "table_extra_metadata", @@ -162,6 +163,8 @@ class DatabaseRestApi(BaseSupersetModelRestApi): "backend", "changed_on", "changed_on_delta_humanized", + "changed_by.first_name", + "changed_by.last_name", "created_by.first_name", 
"created_by.last_name", "database_name", @@ -194,7 +197,17 @@ class DatabaseRestApi(BaseSupersetModelRestApi): edit_columns = add_columns + search_columns = [ + "allow_file_upload", + "allow_dml", + "allow_run_async", + "created_by", + "changed_by", + "database_name", + "expose_in_sqllab", + ] search_filters = {"allow_file_upload": [DatabaseUploadEnabledFilter]} + allowed_rel_fields = {"changed_by", "created_by"} list_select_columns = list_columns + ["extra", "sqlalchemy_uri", "password"] order_columns = [ diff --git a/superset/databases/schemas.py b/superset/databases/schemas.py index abba9036a1efd..b56c98c5d64b5 100644 --- a/superset/databases/schemas.py +++ b/superset/databases/schemas.py @@ -28,13 +28,13 @@ from sqlalchemy import MetaData from superset import db, is_feature_enabled -from superset.constants import PASSWORD_MASK -from superset.databases.commands.exceptions import DatabaseInvalidError -from superset.databases.ssh_tunnel.commands.exceptions import ( +from superset.commands.database.exceptions import DatabaseInvalidError +from superset.commands.database.ssh_tunnel.exceptions import ( SSHTunnelingNotEnabledError, SSHTunnelInvalidCredentials, SSHTunnelMissingCredentials, ) +from superset.constants import PASSWORD_MASK from superset.databases.utils import make_url_safe from superset.db_engine_specs import get_engine_spec from superset.exceptions import CertificateException, SupersetSecurityException @@ -750,6 +750,7 @@ def fix_schemas_allowed_for_csv_upload( # pylint: disable=invalid-name allows_virtual_table_explore = fields.Boolean(required=False) cancel_query_on_windows_unload = fields.Boolean(required=False) disable_data_preview = fields.Boolean(required=False) + version = fields.String(required=False, allow_none=True) class ImportV1DatabaseSchema(Schema): diff --git a/superset/databases/utils.py b/superset/databases/utils.py index fa163e4d9ef47..21abd7b9c283d 100644 --- a/superset/databases/utils.py +++ b/superset/databases/utils.py @@ -18,7 +18,7 
@@ from sqlalchemy.engine.url import make_url, URL -from superset.databases.commands.exceptions import DatabaseInvalidError +from superset.commands.database.exceptions import DatabaseInvalidError def get_foreign_keys_metadata( diff --git a/superset/datasets/api.py b/superset/datasets/api.py index e1d7e5a09eb51..bc4a42e58ee7e 100644 --- a/superset/datasets/api.py +++ b/superset/datasets/api.py @@ -30,17 +30,10 @@ from marshmallow import ValidationError from superset import event_logger, is_feature_enabled -from superset.commands.exceptions import CommandException -from superset.commands.importers.exceptions import NoValidFilesFoundError -from superset.commands.importers.v1.utils import get_contents_from_bundle -from superset.connectors.sqla.models import SqlaTable -from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod -from superset.daos.dataset import DatasetDAO -from superset.databases.filters import DatabaseFilter -from superset.datasets.commands.create import CreateDatasetCommand -from superset.datasets.commands.delete import DeleteDatasetCommand -from superset.datasets.commands.duplicate import DuplicateDatasetCommand -from superset.datasets.commands.exceptions import ( +from superset.commands.dataset.create import CreateDatasetCommand +from superset.commands.dataset.delete import DeleteDatasetCommand +from superset.commands.dataset.duplicate import DuplicateDatasetCommand +from superset.commands.dataset.exceptions import ( DatasetCreateFailedError, DatasetDeleteFailedError, DatasetForbiddenError, @@ -49,11 +42,18 @@ DatasetRefreshFailedError, DatasetUpdateFailedError, ) -from superset.datasets.commands.export import ExportDatasetsCommand -from superset.datasets.commands.importers.dispatcher import ImportDatasetsCommand -from superset.datasets.commands.refresh import RefreshDatasetCommand -from superset.datasets.commands.update import UpdateDatasetCommand -from superset.datasets.commands.warm_up_cache import DatasetWarmUpCacheCommand 
+from superset.commands.dataset.export import ExportDatasetsCommand +from superset.commands.dataset.importers.dispatcher import ImportDatasetsCommand +from superset.commands.dataset.refresh import RefreshDatasetCommand +from superset.commands.dataset.update import UpdateDatasetCommand +from superset.commands.dataset.warm_up_cache import DatasetWarmUpCacheCommand +from superset.commands.exceptions import CommandException +from superset.commands.importers.exceptions import NoValidFilesFoundError +from superset.commands.importers.v1.utils import get_contents_from_bundle +from superset.connectors.sqla.models import SqlaTable +from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod +from superset.daos.dataset import DatasetDAO +from superset.databases.filters import DatabaseFilter from superset.datasets.filters import DatasetCertifiedFilter, DatasetIsNullOrEmptyFilter from superset.datasets.schemas import ( DatasetCacheWarmUpRequestSchema, @@ -247,8 +247,17 @@ class DatasetRestApi(BaseSupersetModelRestApi): "sql": [DatasetIsNullOrEmptyFilter], "id": [DatasetCertifiedFilter], } - search_columns = ["id", "database", "owners", "schema", "sql", "table_name"] - allowed_rel_fields = {"database", "owners"} + search_columns = [ + "id", + "database", + "owners", + "schema", + "sql", + "table_name", + "created_by", + "changed_by", + ] + allowed_rel_fields = {"database", "owners", "created_by", "changed_by"} allowed_distinct_fields = {"schema"} apispec_parameter_schemas = { diff --git a/superset/datasets/columns/api.py b/superset/datasets/columns/api.py index 0aafab5d39dc9..90de0f7750ed4 100644 --- a/superset/datasets/columns/api.py +++ b/superset/datasets/columns/api.py @@ -20,14 +20,14 @@ from flask_appbuilder.api import expose, permission_name, protect, safe from flask_appbuilder.models.sqla.interface import SQLAInterface -from superset.connectors.sqla.models import TableColumn -from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP -from 
superset.datasets.columns.commands.delete import DeleteDatasetColumnCommand -from superset.datasets.columns.commands.exceptions import ( +from superset.commands.dataset.columns.delete import DeleteDatasetColumnCommand +from superset.commands.dataset.columns.exceptions import ( DatasetColumnDeleteFailedError, DatasetColumnForbiddenError, DatasetColumnNotFoundError, ) +from superset.connectors.sqla.models import TableColumn +from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP from superset.views.base_api import BaseSupersetModelRestApi, statsd_metrics logger = logging.getLogger(__name__) diff --git a/superset/datasets/metrics/api.py b/superset/datasets/metrics/api.py index 28ec9474e2b8d..aa29254fc0683 100644 --- a/superset/datasets/metrics/api.py +++ b/superset/datasets/metrics/api.py @@ -20,14 +20,14 @@ from flask_appbuilder.api import expose, permission_name, protect, safe from flask_appbuilder.models.sqla.interface import SQLAInterface -from superset.connectors.sqla.models import TableColumn -from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP -from superset.datasets.metrics.commands.delete import DeleteDatasetMetricCommand -from superset.datasets.metrics.commands.exceptions import ( +from superset.commands.dataset.metrics.delete import DeleteDatasetMetricCommand +from superset.commands.dataset.metrics.exceptions import ( DatasetMetricDeleteFailedError, DatasetMetricForbiddenError, DatasetMetricNotFoundError, ) +from superset.connectors.sqla.models import TableColumn +from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP from superset.views.base_api import BaseSupersetModelRestApi, statsd_metrics logger = logging.getLogger(__name__) diff --git a/superset/datasource/api.py b/superset/datasource/api.py index 0c4338e3496dd..6943d00bc75ec 100644 --- a/superset/datasource/api.py +++ b/superset/datasource/api.py @@ -115,11 +115,18 @@ def get_column_values( return self.response(403, message=ex.message) row_limit = 
apply_max_row_limit(app.config["FILTER_SELECT_ROW_LIMIT"]) + denormalize_column = not datasource.normalize_columns try: payload = datasource.values_for_column( - column_name=column_name, limit=row_limit + column_name=column_name, + limit=row_limit, + denormalize_column=denormalize_column, ) return self.response(200, result=payload) + except KeyError: + return self.response( + 400, message=f"Column name {column_name} does not exist" + ) except NotImplementedError: return self.response( 400, diff --git a/superset/db_engine_specs/__init__.py b/superset/db_engine_specs/__init__.py index d4ec199133d38..9ec0f0416bd09 100644 --- a/superset/db_engine_specs/__init__.py +++ b/superset/db_engine_specs/__init__.py @@ -35,11 +35,10 @@ from pathlib import Path from typing import Any, Optional -import sqlalchemy.databases import sqlalchemy.dialects from importlib_metadata import entry_points from sqlalchemy.engine.default import DefaultDialect -from sqlalchemy.engine.url import URL +from sqlalchemy.exc import NoSuchModuleError from superset import app, feature_flag_manager from superset.db_engine_specs.base import BaseEngineSpec @@ -128,27 +127,26 @@ def get_available_engine_specs() -> dict[type[BaseEngineSpec], set[str]]: drivers: dict[str, set[str]] = defaultdict(set) # native SQLAlchemy dialects - for attr in sqlalchemy.databases.__all__: - dialect = getattr(sqlalchemy.dialects, attr) - for attribute in dialect.__dict__.values(): + for attr in sqlalchemy.dialects.__all__: + try: + dialect = sqlalchemy.dialects.registry.load(attr) if ( - hasattr(attribute, "dialect") - and inspect.isclass(attribute.dialect) - and issubclass(attribute.dialect, DefaultDialect) + issubclass(dialect, DefaultDialect) + and hasattr(dialect, "driver") # adodbapi dialect is removed in SQLA 1.4 and doesn't implement the # `dbapi` method, hence needs to be ignored to avoid logging a warning - and attribute.dialect.driver != "adodbapi" + and dialect.driver != "adodbapi" ): try: - attribute.dialect.dbapi() 
+ dialect.dbapi() except ModuleNotFoundError: continue except Exception as ex: # pylint: disable=broad-except - logger.warning( - "Unable to load dialect %s: %s", attribute.dialect, ex - ) + logger.warning("Unable to load dialect %s: %s", dialect, ex) continue - drivers[attr].add(attribute.dialect.driver) + drivers[attr].add(dialect.driver) + except NoSuchModuleError: + continue # installed 3rd-party dialects for ep in entry_points(group="sqlalchemy.dialects"): @@ -167,21 +165,20 @@ def get_available_engine_specs() -> dict[type[BaseEngineSpec], set[str]]: driver = driver.decode() drivers[backend].add(driver) + dbs_denylist = app.config["DBS_AVAILABLE_DENYLIST"] + if not feature_flag_manager.is_feature_enabled("ENABLE_SUPERSET_META_DB"): + dbs_denylist["superset"] = {""} + dbs_denylist_engines = dbs_denylist.keys() available_engines = {} + for engine_spec in load_engine_specs(): driver = drivers[engine_spec.engine] - - # do not add denied db engine specs to available list - dbs_denylist = app.config["DBS_AVAILABLE_DENYLIST"] - if not feature_flag_manager.is_feature_enabled("ENABLE_SUPERSET_META_DB"): - dbs_denylist["superset"] = {""} - dbs_denylist_engines = dbs_denylist.keys() - if ( engine_spec.engine in dbs_denylist_engines and hasattr(engine_spec, "default_driver") and engine_spec.default_driver in dbs_denylist[engine_spec.engine] ): + # do not add denied db engine specs to available list continue # lookup driver by engine aliases. 
diff --git a/superset/db_engine_specs/base.py b/superset/db_engine_specs/base.py index f355e4ef8cea8..6bc8c444d69bb 100644 --- a/superset/db_engine_specs/base.py +++ b/superset/db_engine_specs/base.py @@ -51,7 +51,7 @@ from sqlalchemy.engine.url import URL from sqlalchemy.ext.compiler import compiles from sqlalchemy.orm import Session -from sqlalchemy.sql import quoted_name, text +from sqlalchemy.sql import literal_column, quoted_name, text from sqlalchemy.sql.expression import ColumnClause, Select, TextAsFrom, TextClause from sqlalchemy.types import TypeEngine from sqlparse.tokens import CTE @@ -398,6 +398,19 @@ class BaseEngineSpec: # pylint: disable=too-many-public-methods # Can the catalog be changed on a per-query basis? supports_dynamic_catalog = False + @classmethod + def get_allows_alias_in_select( + cls, database: Database # pylint: disable=unused-argument + ) -> bool: + """ + Method for dynamic `allows_alias_in_select`. + + In Dremio this attribute is version-dependent, so Superset needs to inspect the + database configuration in order to determine it. This method allows engine specs + to define dynamic values for the attribute. + """ + return cls.allows_alias_in_select + @classmethod def supports_url(cls, url: URL) -> bool: """ @@ -1309,8 +1322,12 @@ def get_table_comment( return comment @classmethod - def get_columns( - cls, inspector: Inspector, table_name: str, schema: str | None + def get_columns( # pylint: disable=unused-argument + cls, + inspector: Inspector, + table_name: str, + schema: str | None, + options: dict[str, Any] | None = None, ) -> list[ResultSetColumnType]: """ Get all columns from a given schema and table @@ -1318,6 +1335,8 @@ def get_columns( :param inspector: SqlAlchemy Inspector instance :param table_name: Table name :param schema: Schema name. 
If omitted, uses default schema for database + :param options: Extra options to customise the display of columns in + some databases :return: All columns in table """ return convert_inspector_columns( @@ -1369,7 +1388,12 @@ def where_latest_partition( # pylint: disable=too-many-arguments,unused-argumen @classmethod def _get_fields(cls, cols: list[ResultSetColumnType]) -> list[Any]: - return [column(c["column_name"]) for c in cols] + return [ + literal_column(query_as) + if (query_as := c.get("query_as")) + else column(c["column_name"]) + for c in cols + ] @classmethod def select_star( # pylint: disable=too-many-arguments,too-many-locals @@ -1409,8 +1433,9 @@ def select_star( # pylint: disable=too-many-arguments,too-many-locals if show_cols: fields = cls._get_fields(cols) quote = engine.dialect.identifier_preparer.quote + quote_schema = engine.dialect.identifier_preparer.quote_schema if schema: - full_table_name = quote(schema) + "." + quote(table_name) + full_table_name = quote_schema(schema) + "." 
+ quote(table_name) else: full_table_name = quote(table_name) diff --git a/superset/db_engine_specs/databend.py b/superset/db_engine_specs/databend.py index 589e8b9168e26..fa165b80d5736 100644 --- a/superset/db_engine_specs/databend.py +++ b/superset/db_engine_specs/databend.py @@ -18,6 +18,7 @@ from sqlalchemy.engine.url import URL from urllib3.exceptions import NewConnectionError +from superset.constants import TimeGrain from superset.databases.utils import make_url_safe from superset.db_engine_specs.base import ( BaseEngineSpec, @@ -45,17 +46,17 @@ class DatabendBaseEngineSpec(BaseEngineSpec): _time_grain_expressions = { None: "{col}", - "PT1M": "to_start_of_minute(TO_DATETIME({col}))", - "PT5M": "to_start_of_five_minutes(TO_DATETIME({col}))", - "PT10M": "to_start_of_ten_minutes(TO_DATETIME({col}))", - "PT15M": "to_start_of_fifteen_minutes(TO_DATETIME({col}))", - "PT30M": "TO_DATETIME(intDiv(toUInt32(TO_DATETIME({col})), 1800)*1800)", - "PT1H": "to_start_of_hour(TO_DATETIME({col}))", - "P1D": "to_start_of_day(TO_DATETIME({col}))", - "P1W": "to_monday(TO_DATETIME({col}))", - "P1M": "to_start_of_month(TO_DATETIME({col}))", - "P3M": "to_start_of_quarter(TO_DATETIME({col}))", - "P1Y": "to_start_of_year(TO_DATETIME({col}))", + TimeGrain.SECOND: "DATE_TRUNC('SECOND', {col})", + TimeGrain.MINUTE: "to_start_of_minute(TO_DATETIME({col}))", + TimeGrain.FIVE_MINUTES: "to_start_of_five_minutes(TO_DATETIME({col}))", + TimeGrain.TEN_MINUTES: "to_start_of_ten_minutes(TO_DATETIME({col}))", + TimeGrain.FIFTEEN_MINUTES: "to_start_of_fifteen_minutes(TO_DATETIME({col}))", + TimeGrain.HOUR: "to_start_of_hour(TO_DATETIME({col}))", + TimeGrain.DAY: "to_start_of_day(TO_DATETIME({col}))", + TimeGrain.WEEK: "to_monday(TO_DATETIME({col}))", + TimeGrain.MONTH: "to_start_of_month(TO_DATETIME({col}))", + TimeGrain.QUARTER: "to_start_of_quarter(TO_DATETIME({col}))", + TimeGrain.YEAR: "to_start_of_year(TO_DATETIME({col}))", } column_type_mappings = ( @@ -133,6 +134,8 @@ def convert_dttm( if 
isinstance(sqla_type, types.Date): return f"to_date('{dttm.date().isoformat()}')" + if isinstance(sqla_type, types.TIMESTAMP): + return f"""TO_TIMESTAMP('{dttm.isoformat(timespec="microseconds")}')""" if isinstance(sqla_type, types.DateTime): return f"""to_dateTime('{dttm.isoformat(sep=" ", timespec="seconds")}')""" return None diff --git a/superset/db_engine_specs/doris.py b/superset/db_engine_specs/doris.py new file mode 100644 index 0000000000000..e502f5bda2be7 --- /dev/null +++ b/superset/db_engine_specs/doris.py @@ -0,0 +1,278 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. 
+import logging +import re +from re import Pattern +from typing import Any, Optional +from urllib import parse + +from flask_babel import gettext as __ +from sqlalchemy import Float, Integer, Numeric, String, TEXT, types +from sqlalchemy.engine.url import URL +from sqlalchemy.sql.type_api import TypeEngine + +from superset.db_engine_specs.mysql import MySQLEngineSpec +from superset.errors import SupersetErrorType +from superset.utils.core import GenericDataType + +# Regular expressions to catch custom errors +CONNECTION_ACCESS_DENIED_REGEX = re.compile( + "Access denied for user '(?P<username>.*?)'" +) +CONNECTION_INVALID_HOSTNAME_REGEX = re.compile( + "Unknown Doris server host '(?P<hostname>.*?)'" +) +CONNECTION_UNKNOWN_DATABASE_REGEX = re.compile("Unknown database '(?P<database>.*?)'") +CONNECTION_HOST_DOWN_REGEX = re.compile( + "Can't connect to Doris server on '(?P<hostname>.*?)'" +) +SYNTAX_ERROR_REGEX = re.compile( + "check the manual that corresponds to your MySQL server " + "version for the right syntax to use near '(?P<server_error>.*)" +) + +logger = logging.getLogger(__name__) + + +class TINYINT(Integer): + __visit_name__ = "TINYINT" + + +class LARGEINT(Integer): + __visit_name__ = "LARGEINT" + + +class DOUBLE(Float): + __visit_name__ = "DOUBLE" + + +class HLL(Numeric): + __visit_name__ = "HLL" + + +class BITMAP(Numeric): + __visit_name__ = "BITMAP" + + +class QuantileState(Numeric): + __visit_name__ = "QUANTILE_STATE" + + +class AggState(Numeric): + __visit_name__ = "AGG_STATE" + + +class ARRAY(TypeEngine): + __visit_name__ = "ARRAY" + + @property + def python_type(self) -> Optional[type[list[Any]]]: + return list + + +class MAP(TypeEngine): + __visit_name__ = "MAP" + + @property + def python_type(self) -> Optional[type[dict[Any, Any]]]: + return dict + + +class STRUCT(TypeEngine): + __visit_name__ = "STRUCT" + + @property + def python_type(self) -> Optional[type[Any]]: + return None + + +class DorisEngineSpec(MySQLEngineSpec): + engine = "pydoris" + 
engine_aliases = {"doris"} + engine_name = "Apache Doris" + max_column_name_length = 64 + default_driver = "pydoris" + sqlalchemy_uri_placeholder = ( + "doris://user:password@host:port/catalog.db[?key=value&key=value...]" + ) + encryption_parameters = {"ssl": "0"} + supports_dynamic_schema = True + + column_type_mappings = ( # type: ignore + ( + re.compile(r"^tinyint", re.IGNORECASE), + TINYINT(), + GenericDataType.NUMERIC, + ), + ( + re.compile(r"^largeint", re.IGNORECASE), + LARGEINT(), + GenericDataType.NUMERIC, + ), + ( + re.compile(r"^decimal.*", re.IGNORECASE), + types.DECIMAL(), + GenericDataType.NUMERIC, + ), + ( + re.compile(r"^double", re.IGNORECASE), + DOUBLE(), + GenericDataType.NUMERIC, + ), + ( + re.compile(r"^varchar(\((\d+)\))*$", re.IGNORECASE), + types.VARCHAR(), + GenericDataType.STRING, + ), + ( + re.compile(r"^char(\((\d+)\))*$", re.IGNORECASE), + types.CHAR(), + GenericDataType.STRING, + ), + ( + re.compile(r"^json.*", re.IGNORECASE), + types.JSON(), + GenericDataType.STRING, + ), + ( + re.compile(r"^binary.*", re.IGNORECASE), + types.BINARY(), + GenericDataType.STRING, + ), + ( + re.compile(r"^quantile_state", re.IGNORECASE), + QuantileState(), + GenericDataType.STRING, + ), + ( + re.compile(r"^agg_state.*", re.IGNORECASE), + AggState(), + GenericDataType.STRING, + ), + (re.compile(r"^hll", re.IGNORECASE), HLL(), GenericDataType.STRING), + ( + re.compile(r"^bitmap", re.IGNORECASE), + BITMAP(), + GenericDataType.STRING, + ), + ( + re.compile(r"^array.*", re.IGNORECASE), + ARRAY(), + GenericDataType.STRING, + ), + ( + re.compile(r"^map.*", re.IGNORECASE), + MAP(), + GenericDataType.STRING, + ), + ( + re.compile(r"^struct.*", re.IGNORECASE), + STRUCT(), + GenericDataType.STRING, + ), + ( + re.compile(r"^datetime.*", re.IGNORECASE), + types.DATETIME(), + GenericDataType.STRING, + ), + ( + re.compile(r"^date.*", re.IGNORECASE), + types.DATE(), + GenericDataType.STRING, + ), + ( + re.compile(r"^text.*", re.IGNORECASE), + TEXT(), + 
GenericDataType.STRING, + ), + ( + re.compile(r"^string.*", re.IGNORECASE), + String(), + GenericDataType.STRING, + ), + ) + + custom_errors: dict[Pattern[str], tuple[str, SupersetErrorType, dict[str, Any]]] = { + CONNECTION_ACCESS_DENIED_REGEX: ( + __('Either the username "%(username)s" or the password is incorrect.'), + SupersetErrorType.CONNECTION_ACCESS_DENIED_ERROR, + {"invalid": ["username", "password"]}, + ), + CONNECTION_INVALID_HOSTNAME_REGEX: ( + __('Unknown Doris server host "%(hostname)s".'), + SupersetErrorType.CONNECTION_INVALID_HOSTNAME_ERROR, + {"invalid": ["host"]}, + ), + CONNECTION_HOST_DOWN_REGEX: ( + __('The host "%(hostname)s" might be down and can\'t be reached.'), + SupersetErrorType.CONNECTION_HOST_DOWN_ERROR, + {"invalid": ["host", "port"]}, + ), + CONNECTION_UNKNOWN_DATABASE_REGEX: ( + __('Unable to connect to database "%(database)s".'), + SupersetErrorType.CONNECTION_UNKNOWN_DATABASE_ERROR, + {"invalid": ["database"]}, + ), + SYNTAX_ERROR_REGEX: ( + __( + 'Please check your query for syntax errors near "%(server_error)s". ' + "Then, try running your query again." + ), + SupersetErrorType.SYNTAX_ERROR, + {}, + ), + } + + @classmethod + def adjust_engine_params( + cls, + uri: URL, + connect_args: dict[str, Any], + catalog: Optional[str] = None, + schema: Optional[str] = None, + ) -> tuple[URL, dict[str, Any]]: + database = uri.database + if schema and database: + schema = parse.quote(schema, safe="") + if "." in database: + database = database.split(".")[0] + "." + schema + else: + database = "internal." + schema + uri = uri.set(database=database) + + return uri, connect_args + + @classmethod + def get_schema_from_engine_params( + cls, + sqlalchemy_uri: URL, + connect_args: dict[str, Any], + ) -> Optional[str]: + """ + Return the configured schema. + + For doris the SQLAlchemy URI looks like this: + + doris://localhost:9030/catalog.database + + """ + database = sqlalchemy_uri.database.strip("/") + + if "." 
not in database: + return None + + return parse.unquote(database.split(".")[1]) diff --git a/superset/db_engine_specs/dremio.py b/superset/db_engine_specs/dremio.py index c96159f1b8aa4..746576d3f30da 100644 --- a/superset/db_engine_specs/dremio.py +++ b/superset/db_engine_specs/dremio.py @@ -14,14 +14,25 @@ # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. + +from __future__ import annotations + from datetime import datetime -from typing import Any, Optional +from typing import Any, TYPE_CHECKING +from packaging.version import Version from sqlalchemy import types from superset.constants import TimeGrain from superset.db_engine_specs.base import BaseEngineSpec +if TYPE_CHECKING: + from superset.models.core import Database + + +# See https://github.com/apache/superset/pull/25657 +FIXED_ALIAS_IN_SELECT_VERSION = Version("24.1.0") + class DremioEngineSpec(BaseEngineSpec): engine = "dremio" @@ -43,10 +54,25 @@ class DremioEngineSpec(BaseEngineSpec): def epoch_to_dttm(cls) -> str: return "TO_DATE({col})" + @classmethod + def get_allows_alias_in_select(cls, database: Database) -> bool: + """ + Dremio supports aliases in SELECT statements since version 24.1.0. + + If no version is specified in the DB extra, we assume the Dremio version is post + 24.1.0. This way, as we move forward people don't have to specify a version when + setting up their databases. 
+ """ + version = database.get_extra().get("version") + if version and Version(version) < FIXED_ALIAS_IN_SELECT_VERSION: + return False + + return True + @classmethod def convert_dttm( - cls, target_type: str, dttm: datetime, db_extra: Optional[dict[str, Any]] = None - ) -> Optional[str]: + cls, target_type: str, dttm: datetime, db_extra: dict[str, Any] | None = None + ) -> str | None: sqla_type = cls.get_sqla_column_type(target_type) if isinstance(sqla_type, types.Date): diff --git a/superset/db_engine_specs/druid.py b/superset/db_engine_specs/druid.py index 9bba3a727438b..7cd85ec924cf9 100644 --- a/superset/db_engine_specs/druid.py +++ b/superset/db_engine_specs/druid.py @@ -23,14 +23,12 @@ from typing import Any, TYPE_CHECKING from sqlalchemy import types -from sqlalchemy.engine.reflection import Inspector from superset import is_feature_enabled from superset.constants import TimeGrain from superset.db_engine_specs.base import BaseEngineSpec from superset.db_engine_specs.exceptions import SupersetDBAPIConnectionError from superset.exceptions import SupersetException -from superset.superset_typing import ResultSetColumnType from superset.utils import core as utils if TYPE_CHECKING: @@ -130,15 +128,6 @@ def epoch_ms_to_dttm(cls) -> str: """ return "MILLIS_TO_TIMESTAMP({col})" - @classmethod - def get_columns( - cls, inspector: Inspector, table_name: str, schema: str | None - ) -> list[ResultSetColumnType]: - """ - Update the Druid type map. 
- """ - return super().get_columns(inspector, table_name, schema) - @classmethod def get_dbapi_exception_mapping(cls) -> dict[type[Exception], type[Exception]]: # pylint: disable=import-outside-toplevel diff --git a/superset/db_engine_specs/hive.py b/superset/db_engine_specs/hive.py index 4a881e15b276b..bd303f928d625 100644 --- a/superset/db_engine_specs/hive.py +++ b/superset/db_engine_specs/hive.py @@ -410,9 +410,13 @@ def handle_cursor( # pylint: disable=too-many-locals @classmethod def get_columns( - cls, inspector: Inspector, table_name: str, schema: str | None + cls, + inspector: Inspector, + table_name: str, + schema: str | None, + options: dict[str, Any] | None = None, ) -> list[ResultSetColumnType]: - return BaseEngineSpec.get_columns(inspector, table_name, schema) + return BaseEngineSpec.get_columns(inspector, table_name, schema, options) @classmethod def where_latest_partition( # pylint: disable=too-many-arguments diff --git a/superset/db_engine_specs/postgres.py b/superset/db_engine_specs/postgres.py index 7512aac6067d3..b98fce4fe67a7 100644 --- a/superset/db_engine_specs/postgres.py +++ b/superset/db_engine_specs/postgres.py @@ -182,6 +182,19 @@ def fetch_data(cls, cursor: Any, limit: int | None = None) -> list[tuple[Any, .. 
def epoch_to_dttm(cls) -> str: return "(timestamp 'epoch' + {col} * interval '1 second')" + @classmethod + def convert_dttm( + cls, target_type: str, dttm: datetime, db_extra: dict[str, Any] | None = None + ) -> str | None: + sqla_type = cls.get_sqla_column_type(target_type) + + if isinstance(sqla_type, Date): + return f"TO_DATE('{dttm.date().isoformat()}', 'YYYY-MM-DD')" + if isinstance(sqla_type, DateTime): + dttm_formatted = dttm.isoformat(sep=" ", timespec="microseconds") + return f"""TO_TIMESTAMP('{dttm_formatted}', 'YYYY-MM-DD HH24:MI:SS.US')""" + return None + class PostgresEngineSpec(PostgresBaseEngineSpec, BasicParametersMixin): engine = "postgresql" @@ -357,19 +370,6 @@ def get_table_names( inspector.get_foreign_table_names(schema) ) - @classmethod - def convert_dttm( - cls, target_type: str, dttm: datetime, db_extra: dict[str, Any] | None = None - ) -> str | None: - sqla_type = cls.get_sqla_column_type(target_type) - - if isinstance(sqla_type, Date): - return f"TO_DATE('{dttm.date().isoformat()}', 'YYYY-MM-DD')" - if isinstance(sqla_type, DateTime): - dttm_formatted = dttm.isoformat(sep=" ", timespec="microseconds") - return f"""TO_TIMESTAMP('{dttm_formatted}', 'YYYY-MM-DD HH24:MI:SS.US')""" - return None - @staticmethod def get_extra_params(database: Database) -> dict[str, Any]: """ diff --git a/superset/db_engine_specs/presto.py b/superset/db_engine_specs/presto.py index 8afa82d9b55d9..27e86a7980875 100644 --- a/superset/db_engine_specs/presto.py +++ b/superset/db_engine_specs/presto.py @@ -981,7 +981,11 @@ def _show_columns( @classmethod def get_columns( - cls, inspector: Inspector, table_name: str, schema: str | None + cls, + inspector: Inspector, + table_name: str, + schema: str | None, + options: dict[str, Any] | None = None, ) -> list[ResultSetColumnType]: """ Get columns from a Presto data source. 
This includes handling row and @@ -989,6 +993,7 @@ def get_columns( :param inspector: object that performs database schema inspection :param table_name: table name :param schema: schema name + :param options: Extra configuration options, not used by this backend :return: a list of results that contain column info (i.e. column name and data type) """ diff --git a/superset/db_engine_specs/redshift.py b/superset/db_engine_specs/redshift.py index 2e746a6349365..fc21389099f38 100644 --- a/superset/db_engine_specs/redshift.py +++ b/superset/db_engine_specs/redshift.py @@ -14,10 +14,12 @@ # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. +from __future__ import annotations + import logging import re from re import Pattern -from typing import Any, Optional +from typing import Any import pandas as pd from flask_babel import gettext as __ @@ -148,7 +150,7 @@ def _mutate_label(label: str) -> str: return label.lower() @classmethod - def get_cancel_query_id(cls, cursor: Any, query: Query) -> Optional[str]: + def get_cancel_query_id(cls, cursor: Any, query: Query) -> str | None: """ Get Redshift PID that will be used to cancel all other running queries in the same session. 
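The Dremio change a few hunks above gates alias support on a configured version, defaulting to "allowed" when no version is set. As a standalone sketch of that gating pattern (hypothetical names: `allows_alias_in_select` here is a plain function taking the `extra` dict, whereas the real spec defines a classmethod taking a `Database`, and `_parse` is a simplified stand-in for `packaging.version.Version`, handling dotted numeric versions only):

```python
def _parse(version: str) -> tuple[int, ...]:
    # Simplified stand-in for packaging.version.Version, which the
    # actual engine spec uses; handles plain dotted numeric versions.
    return tuple(int(part) for part in version.split("."))

# Dremio supports aliases in SELECT from 24.1.0 onward (per the diff above).
FIXED_ALIAS_IN_SELECT_VERSION = _parse("24.1.0")

def allows_alias_in_select(db_extra: dict) -> bool:
    # Only an explicitly configured *older* version disables aliases;
    # a missing version is assumed to be recent enough.
    version = db_extra.get("version")
    if version and _parse(version) < FIXED_ALIAS_IN_SELECT_VERSION:
        return False
    return True
```

The design choice worth noting is the default: assuming a fixed version when none is configured means newly added databases need no extra setup, at the cost of wrong SQL for unconfigured pre-24.1.0 instances.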
diff --git a/superset/db_engine_specs/trino.py b/superset/db_engine_specs/trino.py index 425137e302e6b..6e56dbfa24d6b 100644 --- a/superset/db_engine_specs/trino.py +++ b/superset/db_engine_specs/trino.py @@ -24,6 +24,7 @@ import simplejson as json from flask import current_app +from sqlalchemy.engine.reflection import Inspector from sqlalchemy.engine.url import URL from sqlalchemy.orm import Session @@ -33,6 +34,7 @@ from superset.db_engine_specs.exceptions import SupersetDBAPIConnectionError from superset.db_engine_specs.presto import PrestoBaseEngineSpec from superset.models.sql_lab import Query +from superset.superset_typing import ResultSetColumnType from superset.utils import core as utils if TYPE_CHECKING: @@ -184,7 +186,7 @@ def handle_cursor(cls, cursor: Cursor, query: Query, session: Session) -> None: @classmethod def execute_with_cursor( - cls, cursor: Any, sql: str, query: Query, session: Session + cls, cursor: Cursor, sql: str, query: Query, session: Session ) -> None: """ Trigger execution of a query and handle the resulting cursor. @@ -193,34 +195,40 @@ def execute_with_cursor( in another thread and invoke `handle_cursor` to poll for the query ID to appear on the cursor in parallel. """ + # Fetch the query ID beforehand, since it might fail inside the thread due to + # how the SQLAlchemy session is handled. 
+ query_id = query.id + execute_result: dict[str, Any] = {} + execute_event = threading.Event() - def _execute(results: dict[str, Any]) -> None: - logger.debug("Query %d: Running query: %s", query.id, sql) + def _execute(results: dict[str, Any], event: threading.Event) -> None: + logger.debug("Query %d: Running query: %s", query_id, sql) - # Pass result / exception information back to the parent thread try: cls.execute(cursor, sql) - results["complete"] = True except Exception as ex: # pylint: disable=broad-except - results["complete"] = True results["error"] = ex + finally: + event.set() - execute_thread = threading.Thread(target=_execute, args=(execute_result,)) + execute_thread = threading.Thread( + target=_execute, + args=(execute_result, execute_event), + ) execute_thread.start() # Wait for a query ID to be available before handling the cursor, as # it's required by that method; it may never become available on error. - while not cursor.query_id and not execute_result.get("complete"): + while not cursor.query_id and not execute_event.is_set(): time.sleep(0.1) - logger.debug("Query %d: Handling cursor", query.id) + logger.debug("Query %d: Handling cursor", query_id) cls.handle_cursor(cursor, query, session) # Block until the query completes; same behaviour as the client itself - logger.debug("Query %d: Waiting for query to complete", query.id) - while not execute_result.get("complete"): - time.sleep(0.5) + logger.debug("Query %d: Waiting for query to complete", query_id) + execute_event.wait() # Unfortunately we'll mangle the stack trace due to the thread, but # throwing the original exception allows mapping database errors as normal @@ -234,7 +242,7 @@ def prepare_cancel_query(cls, query: Query, session: Session) -> None: session.commit() @classmethod - def cancel_query(cls, cursor: Any, query: Query, cancel_query_id: str) -> bool: + def cancel_query(cls, cursor: Cursor, query: Query, cancel_query_id: str) -> bool: """ Cancel query in the underlying database. 
@@ -325,3 +333,65 @@ def get_dbapi_exception_mapping(cls) -> dict[type[Exception], type[Exception]]: return { requests_exceptions.ConnectionError: SupersetDBAPIConnectionError, } + + @classmethod + def _expand_columns(cls, col: ResultSetColumnType) -> list[ResultSetColumnType]: + """ + Expand the given column out to one or more columns by analysing their types, + descending into ROWS and expanding out their inner fields recursively. + + We can only navigate named fields in ROWs in this way, so we can't expand out + MAP or ARRAY types, nor fields in ROWs which have no name (in fact the trino + library doesn't correctly parse unnamed fields in ROWs). We won't be able to + expand ROWs which are nested underneath any of those types, either. + + Expanded columns are named foo.bar.baz and we provide a query_as property to + instruct the base engine spec how to correctly query them: instead of quoting + the whole string they have to be quoted like "foo"."bar"."baz" and we then + alias them to the full dotted string for ease of reference. 
+ """ + # pylint: disable=import-outside-toplevel + from trino.sqlalchemy import datatype + + cols = [col] + col_type = col.get("type") + + if not isinstance(col_type, datatype.ROW): + return cols + + for inner_name, inner_type in col_type.attr_types: + outer_name = col["name"] + name = ".".join([outer_name, inner_name]) + query_name = ".".join([f'"{piece}"' for piece in name.split(".")]) + column_spec = cls.get_column_spec(str(inner_type)) + is_dttm = column_spec.is_dttm if column_spec else False + + inner_col = ResultSetColumnType( + name=name, + column_name=name, + type=inner_type, + is_dttm=is_dttm, + query_as=f'{query_name} AS "{name}"', + ) + cols.extend(cls._expand_columns(inner_col)) + + return cols + + @classmethod + def get_columns( + cls, + inspector: Inspector, + table_name: str, + schema: str | None, + options: dict[str, Any] | None = None, + ) -> list[ResultSetColumnType]: + """ + If the "expand_rows" feature is enabled on the database via + "schema_options", expand the schema definition out to show all + subfields of nested ROWs as their appropriate dotted paths. 
+ """ + base_cols = super().get_columns(inspector, table_name, schema, options) + if not (options or {}).get("expand_rows"): + return base_cols + + return [col for base_col in base_cols for col in cls._expand_columns(base_col)] diff --git a/superset/embedded/api.py b/superset/embedded/api.py index ae800bf2b9b2a..b907422bf5169 100644 --- a/superset/embedded/api.py +++ b/superset/embedded/api.py @@ -23,12 +23,12 @@ from flask_appbuilder.models.sqla.interface import SQLAInterface from superset import is_feature_enabled +from superset.commands.dashboard.embedded.exceptions import ( + EmbeddedDashboardNotFoundError, +) from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod from superset.daos.dashboard import EmbeddedDashboardDAO from superset.dashboards.schemas import EmbeddedDashboardResponseSchema -from superset.embedded_dashboard.commands.exceptions import ( - EmbeddedDashboardNotFoundError, -) from superset.extensions import event_logger from superset.models.embedded_dashboard import EmbeddedDashboard from superset.reports.logs.schemas import openapi_spec_methods_override diff --git a/superset/embedded/view.py b/superset/embedded/view.py index e59a6ced90f68..462c6046faaf0 100644 --- a/superset/embedded/view.py +++ b/superset/embedded/view.py @@ -17,7 +17,7 @@ import json from typing import Callable -from flask import abort, g, request +from flask import abort, request from flask_appbuilder import expose from flask_login import AnonymousUserMixin, login_user from flask_wtf.csrf import same_origin @@ -78,7 +78,7 @@ def embedded( ) bootstrap_data = { - "common": common_bootstrap_payload(g.user), + "common": common_bootstrap_payload(), "embedded": { "dashboard_id": embedded.dashboard_id, }, diff --git a/superset/examples/bart_lines.py b/superset/examples/bart_lines.py index e18f6e4632a01..ad96aecac4cdb 100644 --- a/superset/examples/bart_lines.py +++ b/superset/examples/bart_lines.py @@ -60,9 +60,9 @@ def load_bart_lines(only_metadata: bool = 
False, force: bool = False) -> None: tbl = db.session.query(table).filter_by(table_name=tbl_name).first() if not tbl: tbl = table(table_name=tbl_name, schema=schema) + db.session.add(tbl) tbl.description = "BART lines" tbl.database = database tbl.filter_select_enabled = True - db.session.merge(tbl) db.session.commit() tbl.fetch_metadata() diff --git a/superset/examples/configs/charts/Filter_Segments.yaml b/superset/examples/configs/charts/Filter_Segments.yaml deleted file mode 100644 index 605e33ca7ee25..0000000000000 --- a/superset/examples/configs/charts/Filter_Segments.yaml +++ /dev/null @@ -1,68 +0,0 @@ -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. 
-slice_name: Filter Segments -viz_type: filter_box -params: - adhoc_filters: [] - datasource: 42__table - date_filter: false - filter_configs: - - asc: true - clearable: true - column: ethnic_minority - key: -xNBqpfQo - label: Ethnic Minority - multiple: true - searchAllOptions: false - - asc: true - clearable: true - column: gender - key: 19VeBGTKf - label: Gender - multiple: true - searchAllOptions: false - - asc: true - clearable: true - column: developer_type - key: OWTb4s69T - label: Developer Type - multiple: true - searchAllOptions: false - - asc: true - clearable: true - column: lang_at_home - key: Fn-YClyhb - label: Language at Home - multiple: true - searchAllOptions: false - - asc: true - clearable: true - column: country_live - key: 2fNskRCLJ - label: Country live - multiple: true - searchAllOptions: false - granularity_sqla: time_start - queryFields: {} - slice_id: 1387 - time_range: No filter - url_params: {} - viz_type: filter_box -cache_timeout: null -uuid: 6420629a-ce74-2c6b-ef7d-b2e78baa3cfe -version: 1.0.0 -dataset_uuid: d95a2865-53ce-1f82-a53d-8e3c89331469 diff --git a/superset/examples/configs/charts/Filtering_Vaccines.yaml b/superset/examples/configs/charts/Filtering_Vaccines.yaml deleted file mode 100644 index e458c5a009343..0000000000000 --- a/superset/examples/configs/charts/Filtering_Vaccines.yaml +++ /dev/null @@ -1,53 +0,0 @@ -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. 
You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. -slice_name: Filtering Vaccines -viz_type: filter_box -params: - adhoc_filters: [] - datasource: 69__table - date_filter: false - filter_configs: - - asc: true - clearable: true - column: country_name - key: D00hRxPLE - label: Country - multiple: true - searchAllOptions: false - - asc: true - clearable: true - column: product_category - key: jJ7x2cuIc - label: Vaccine Approach - multiple: true - searchAllOptions: false - - asc: true - clearable: true - column: clinical_stage - key: EgGwwAUU6 - label: Clinical Stage - multiple: true - searchAllOptions: false - queryFields: {} - slice_id: 3965 - time_range: No filter - url_params: {} - viz_type: filter_box -cache_timeout: null -uuid: c29381ce-0e99-4cf3-bf0f-5f55d6b94176 -version: 1.0.0 -dataset_uuid: 974b7a1c-22ea-49cb-9214-97b7dbd511e0 diff --git a/superset/examples/configs/charts/Video_Game_Sales_Filter.yaml b/superset/examples/configs/charts/Video_Game_Sales_Filter.yaml deleted file mode 100644 index 6c76d53e8eabb..0000000000000 --- a/superset/examples/configs/charts/Video_Game_Sales_Filter.yaml +++ /dev/null @@ -1,55 +0,0 @@ -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. 
You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. -slice_name: Video Game Sales filter -viz_type: filter_box -params: - adhoc_filters: [] - datasource: 21__table - date_filter: true - filter_configs: - - asc: true - clearable: true - column: platform - key: s3ItH9vhG - label: Platform - multiple: true - searchAllOptions: false - - asc: true - clearable: true - column: genre - key: 202hDeMsG - label: Genre - multiple: true - searchAllOptions: false - - asc: true - clearable: true - column: publisher - key: 5Os6jsJFK - label: Publisher - multiple: true - searchAllOptions: false - granularity_sqla: year - queryFields: {} - time_range: No filter - url_params: - preselect_filters: '{"1389": {"platform": ["PS", "PS2", "PS3", "PS4"], "genre": - null, "__time_range": "No filter"}}' - viz_type: filter_box -cache_timeout: null -uuid: fd9ce7ec-ae08-4f71-93e0-7c26b132b2e6 -version: 1.0.0 -dataset_uuid: 53d47c0c-c03d-47f0-b9ac-81225f808283 diff --git a/superset/examples/configs/dashboards/COVID_Vaccine_Dashboard.yaml b/superset/examples/configs/dashboards/COVID_Vaccine_Dashboard.yaml index 363077aebe43c..1d870880b913b 100644 --- a/superset/examples/configs/dashboards/COVID_Vaccine_Dashboard.yaml +++ b/superset/examples/configs/dashboards/COVID_Vaccine_Dashboard.yaml @@ -18,6 +18,9 @@ dashboard_title: COVID Vaccine Dashboard description: null css: "" slug: null +certified_by: "" +certification_details: "" +published: true uuid: f4065089-110a-41fa-8dd7-9ce98a65e250 position: CHART-63bEuxjDMJ: @@ -25,32 +28,32 @@ position: id: CHART-63bEuxjDMJ meta: chartId: 3961 - height: 72 + height: 60 sliceName: Vaccine Candidates per 
Country sliceNameOverride: Map of Vaccine Candidates uuid: ddc91df6-fb40-4826-bdca-16b85af1c024 - width: 12 + width: 8 parents: - ROOT_ID - TABS-wUKya7eQ0Z - TAB-BCIJF4NvgQ - - ROW-zvw7luvEL + - ROW-xSeNAspgw type: CHART CHART-F-fkth0Dnv: children: [] id: CHART-F-fkth0Dnv meta: chartId: 3960 - height: 60 + height: 82 sliceName: Vaccine Candidates per Country sliceNameOverride: Treemap of Vaccine Candidates per Country uuid: e2f5a8a7-feb0-4f79-bc6b-01fe55b98b3c - width: 8 + width: 4 parents: - ROOT_ID - TABS-wUKya7eQ0Z - TAB-BCIJF4NvgQ - - ROW-xSeNAspgw + - ROW-dieUdkeUw type: CHART CHART-RjD_ygqtwH: children: [] @@ -66,7 +69,7 @@ position: - ROOT_ID - TABS-wUKya7eQ0Z - TAB-BCIJF4NvgQ - - ROW-zvw7luvEL + - ROW-zhOlQLQnB type: CHART CHART-aGfmWtliqA: children: [] @@ -81,17 +84,17 @@ position: - ROOT_ID - TABS-wUKya7eQ0Z - TAB-BCIJF4NvgQ - - ROW-zvw7luvEL + - ROW-zhOlQLQnB type: CHART - CHART-j4hUvP5dDD: + CHART-dCUpAcPsji: children: [] - id: CHART-j4hUvP5dDD + id: CHART-dCUpAcPsji meta: - chartId: 3962 + chartId: 3963 height: 82 - sliceName: Vaccine Candidates per Approach & Stage - sliceNameOverride: Heatmap of Approaches & Clinical Stages - uuid: 0c953c84-0c9a-418d-be9f-2894d2a2cee0 + sliceName: Vaccine Candidates per Country & Stage + sliceNameOverride: Heatmap of Countries & Clinical Stages + uuid: cd111331-d286-4258-9020-c7949a109ed2 width: 4 parents: - ROOT_ID @@ -99,37 +102,37 @@ position: - TAB-BCIJF4NvgQ - ROW-dieUdkeUw type: CHART - CHART-dCUpAcPsji: + CHART-fYo7IyvKZQ: children: [] - id: CHART-dCUpAcPsji + id: CHART-fYo7IyvKZQ meta: - chartId: 3963 - height: 72 + chartId: 3964 + height: 60 sliceName: Vaccine Candidates per Country & Stage - sliceNameOverride: Heatmap of Countries & Clinical Stages - uuid: cd111331-d286-4258-9020-c7949a109ed2 + sliceNameOverride: Sunburst of Country & Clinical Stages + uuid: f69c556f-15fe-4a82-a8bb-69d5b6954123 width: 4 parents: - ROOT_ID - TABS-wUKya7eQ0Z - TAB-BCIJF4NvgQ - - ROW-dieUdkeUw + - ROW-xSeNAspgw type: CHART - 
CHART-eirDduqb1A: + CHART-j4hUvP5dDD: children: [] - id: CHART-eirDduqb1A + id: CHART-j4hUvP5dDD meta: - chartId: 3965 - height: 60 - sliceName: Filtering Vaccines - sliceNameOverride: Filter Box of Vaccines - uuid: c29381ce-0e99-4cf3-bf0f-5f55d6b94176 + chartId: 3962 + height: 82 + sliceName: Vaccine Candidates per Approach & Stage + sliceNameOverride: Heatmap of Approaches & Clinical Stages + uuid: 0c953c84-0c9a-418d-be9f-2894d2a2cee0 width: 4 parents: - ROOT_ID - TABS-wUKya7eQ0Z - TAB-BCIJF4NvgQ - - ROW-xSeNAspgw + - ROW-dieUdkeUw type: CHART DASHBOARD_VERSION_KEY: v2 GRID_ID: @@ -189,46 +192,17 @@ position: - TAB-BCIJF4NvgQ - ROW-zhOlQLQnB type: MARKDOWN - CHART-fYo7IyvKZQ: - children: [] - id: CHART-fYo7IyvKZQ - meta: - chartId: 3964 - height: 72 - sliceName: Vaccine Candidates per Country & Stage - sliceNameOverride: Sunburst of Country & Clinical Stages - uuid: f69c556f-15fe-4a82-a8bb-69d5b6954123 - width: 4 - parents: - - ROOT_ID - - TABS-wUKya7eQ0Z - - TAB-BCIJF4NvgQ - - ROW-dieUdkeUw - type: CHART ROOT_ID: children: - TABS-wUKya7eQ0Z id: ROOT_ID type: ROOT - ROW-zhOlQLQnB: - children: - - MARKDOWN-VjQQ5SFj5v - - CHART-RjD_ygqtwH - - CHART-aGfmWtliqA - id: ROW-zhOlQLQnB - meta: - "0": ROOT_ID - background: BACKGROUND_TRANSPARENT - parents: - - ROOT_ID - - TABS-wUKya7eQ0Z - - TAB-BCIJF4NvgQ - type: ROW - ROW-xSeNAspgw: + ROW-dieUdkeUw: children: - - CHART-eirDduqb1A - CHART-F-fkth0Dnv - id: ROW-xSeNAspgw + - CHART-dCUpAcPsji + - CHART-j4hUvP5dDD + id: ROW-dieUdkeUw meta: "0": ROOT_ID background: BACKGROUND_TRANSPARENT @@ -237,12 +211,11 @@ position: - TABS-wUKya7eQ0Z - TAB-BCIJF4NvgQ type: ROW - ROW-dieUdkeUw: + ROW-xSeNAspgw: children: - - CHART-dCUpAcPsji + - CHART-63bEuxjDMJ - CHART-fYo7IyvKZQ - - CHART-j4hUvP5dDD - id: ROW-dieUdkeUw + id: ROW-xSeNAspgw meta: "0": ROOT_ID background: BACKGROUND_TRANSPARENT @@ -251,10 +224,12 @@ position: - TABS-wUKya7eQ0Z - TAB-BCIJF4NvgQ type: ROW - ROW-zvw7luvEL: + ROW-zhOlQLQnB: children: - - CHART-63bEuxjDMJ - id: 
ROW-zvw7luvEL + - MARKDOWN-VjQQ5SFj5v + - CHART-RjD_ygqtwH + - CHART-aGfmWtliqA + id: ROW-zhOlQLQnB meta: "0": ROOT_ID background: BACKGROUND_TRANSPARENT @@ -267,7 +242,6 @@ position: children: - ROW-zhOlQLQnB - ROW-xSeNAspgw - - ROW-zvw7luvEL - ROW-dieUdkeUw id: TAB-BCIJF4NvgQ meta: @@ -316,18 +290,4 @@ metadata: Unknown: "#EFA1AA" Live attenuated virus: "#FDE380" COUNT(*): "#D1C6BC" - filter_scopes: - "3965": - country_name: - scope: - - ROOT_ID - immune: [] - product_category: - scope: - - ROOT_ID - immune: [] - clinical_stage: - scope: - - ROOT_ID - immune: [] version: 1.0.0 diff --git a/superset/examples/configs/dashboards/FCC_New_Coder_Survey_2018.yaml b/superset/examples/configs/dashboards/FCC_New_Coder_Survey_2018.yaml index 2e97e6b576a39..b1508daff0b01 100644 --- a/superset/examples/configs/dashboards/FCC_New_Coder_Survey_2018.yaml +++ b/superset/examples/configs/dashboards/FCC_New_Coder_Survey_2018.yaml @@ -16,8 +16,11 @@ # under the License. dashboard_title: FCC New Coder Survey 2018 description: null -css: '' +css: "" slug: null +certified_by: "" +certification_details: "" +published: true uuid: 5b12b583-8204-08e9-392c-422209c29787 position: CHART--0GPGmD-pO: @@ -25,17 +28,17 @@ position: id: CHART--0GPGmD-pO meta: chartId: 1361 - height: 48 - sliceName: 'Current Developers: Is this your first development job?' + height: 56 + sliceName: "Current Developers: Is this your first development job?" sliceNameOverride: Is this your first development job? 
uuid: bfe5a8e6-146f-ef59-5e6c-13d519b236a8 width: 2 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-l_9I0aNYZ - - ROW-b7USYEngT + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-l_9I0aNYZ + - ROW-b7USYEngT type: CHART CHART--w_Br1tPP3: children: [] @@ -47,27 +50,27 @@ position: uuid: a6dd2d5a-2cdc-c8ec-f30c-85920f4f8a65 width: 3 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-YT6eNksV- - - ROW-DR80aHJA2c + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-YT6eNksV- + - ROW-DR80aHJA2c type: CHART CHART-0-zzTwBINh: children: [] id: CHART-0-zzTwBINh meta: chartId: 3631 - height: 49 + height: 55 sliceName: Last Year Income Distribution uuid: a2ec5256-94b4-43c4-b8c7-b83f70c5d4df width: 3 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-l_9I0aNYZ - - ROW-b7USYEngT + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-l_9I0aNYZ + - ROW-b7USYEngT type: CHART CHART-37fu7fO6Z0: children: [] @@ -79,11 +82,11 @@ position: uuid: 02f546ae-1bf4-bd26-8bc2-14b9279c8a62 width: 7 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-l_9I0aNYZ - - ROW-kNjtGVFpp + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-l_9I0aNYZ + - ROW-kNjtGVFpp type: CHART CHART-5QwNlSbXYU: children: [] @@ -95,11 +98,11 @@ position: uuid: 097c05c9-2dd2-481d-813d-d6c0c12b4a3d width: 5 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-l_9I0aNYZ - - ROW-kNjtGVFpp + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-l_9I0aNYZ + - ROW-kNjtGVFpp type: CHART CHART-FKuVqq4kaA: children: [] @@ -112,11 +115,11 @@ position: uuid: e6b09c28-98cf-785f-4caf-320fd4fca802 width: 3 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-YT6eNksV- - - ROW-DR80aHJA2c + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-YT6eNksV- + - ROW-DR80aHJA2c type: CHART CHART-JnpdZOhVer: children: [] @@ -124,16 +127,16 @@ position: meta: chartId: 1369 height: 50 - sliceName: "\U0001F393 Highest degree held" + sliceName: Highest degree held uuid: 
9f7d2b9c-6b3a-69f9-f03e-d3a141514639 width: 2 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-YT6eNksV- - - ROW--BIzjz9F0 - - COLUMN-IEKAo_QJlz + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-YT6eNksV- + - ROW--BIzjz9F0 + - COLUMN-IEKAo_QJlz type: CHART CHART-LjfhrUkEef: children: [] @@ -145,11 +148,11 @@ position: uuid: 067c4a1e-ae03-4c0c-8e2a-d2c0f4bf43c3 width: 5 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-l_9I0aNYZ - - ROW-s3l4os7YY + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-l_9I0aNYZ + - ROW-s3l4os7YY type: CHART CHART-Q3pbwsH3id: children: [] @@ -162,27 +165,27 @@ position: uuid: def07750-b5c0-0b69-6228-cb2330916166 width: 3 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-AsMaxdYL_t - - ROW-mOvr_xWm1 + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-AsMaxdYL_t + - ROW-mOvr_xWm1 type: CHART CHART-QVql08s5Bv: children: [] id: CHART-QVql08s5Bv meta: chartId: 3632 - height: 50 + height: 56 sliceName: First Time Developer? uuid: edc75073-8f33-4123-a28d-cd6dfb33cade width: 3 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-l_9I0aNYZ - - ROW-b7USYEngT + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-l_9I0aNYZ + - ROW-b7USYEngT type: CHART CHART-UtSaz4pfV6: children: [] @@ -194,12 +197,12 @@ position: uuid: 5f1ea868-604e-f69d-a241-5daa83ff33be width: 3 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-AsMaxdYL_t - - ROW-UsW-_RPAb - - COLUMN-OJ5spdMmNh + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-AsMaxdYL_t + - ROW-UsW-_RPAb + - COLUMN-OJ5spdMmNh type: CHART CHART-VvFbGxi3X_: children: [] @@ -211,12 +214,12 @@ position: uuid: 03a74c97-52fc-cf87-233c-d4275f8c550c width: 3 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-AsMaxdYL_t - - ROW-UsW-_RPAb - - COLUMN-OJ5spdMmNh + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-AsMaxdYL_t + - ROW-UsW-_RPAb + - COLUMN-OJ5spdMmNh type: CHART CHART-XHncHuS5pZ: children: [] @@ -229,11 +232,11 @@ position: uuid: 
a0e5329f-224e-6fc8-efd2-d37d0f546ee8 width: 2 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-YT6eNksV- - - ROW-DR80aHJA2c + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-YT6eNksV- + - ROW-DR80aHJA2c type: CHART CHART-YSzS5GOOLf: children: [] @@ -245,11 +248,11 @@ position: uuid: 4880e4f4-b701-4be0-86f3-e7e89432e83b width: 3 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-AsMaxdYL_t - - ROW-mOvr_xWm1 + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-AsMaxdYL_t + - ROW-mOvr_xWm1 type: CHART CHART-ZECnzPz8Bi: children: [] @@ -261,44 +264,27 @@ position: uuid: 5596e0f6-78a9-465d-8325-7139c794a06a width: 7 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-l_9I0aNYZ - - ROW-s3l4os7YY + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-l_9I0aNYZ + - ROW-s3l4os7YY type: CHART CHART-aytwlT4GAq: children: [] id: CHART-aytwlT4GAq meta: chartId: 1384 - height: 50 + height: 30 sliceName: Breakdown of Developer Type uuid: b8386be8-f44e-6535-378c-2aa2ba461286 - width: 4 - parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-AsMaxdYL_t - - ROW-y-GwJPgxLr - type: CHART - CHART-d6vjW6rC6V: - children: [] - id: CHART-d6vjW6rC6V - meta: - chartId: 1387 - height: 54 - sliceName: Filter Segments - sliceNameOverride: Filter By - uuid: 6420629a-ce74-2c6b-ef7d-b2e78baa3cfe - width: 5 + width: 6 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-AsMaxdYL_t - - ROW-y-GwJPgxLr + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-AsMaxdYL_t + - ROW-y-GwJPgxLr type: CHART CHART-fLpTSAHpAO: children: [] @@ -310,11 +296,11 @@ position: uuid: 2ba66056-a756-d6a3-aaec-0c243fb7062e width: 9 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-AsMaxdYL_t - - ROW-UsW-_RPAb + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-AsMaxdYL_t + - ROW-UsW-_RPAb type: CHART CHART-lQVSAw0Or3: children: [] @@ -327,11 +313,11 @@ position: uuid: cb8998ab-9f93-4f0f-4e4b-3bfe4b0dea9d width: 4 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - 
TAB-YT6eNksV- - - ROW--BIzjz9F0 + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-YT6eNksV- + - ROW--BIzjz9F0 type: CHART CHART-o-JPAWMZK-: children: [] @@ -343,11 +329,11 @@ position: uuid: 0f6b447c-828c-e71c-87ac-211bc412b214 width: 3 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-AsMaxdYL_t - - ROW-mOvr_xWm1 + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-AsMaxdYL_t + - ROW-mOvr_xWm1 type: CHART CHART-v22McUFMtx: children: [] @@ -360,12 +346,12 @@ position: uuid: 6d0ceb30-2008-d19c-d285-cf77dc764433 width: 4 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-YT6eNksV- - - ROW--BIzjz9F0 - - COLUMN-IEKAo_QJlz + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-YT6eNksV- + - ROW--BIzjz9F0 + - COLUMN-IEKAo_QJlz type: CHART CHART-wxWVtlajRF: children: [] @@ -377,49 +363,49 @@ position: uuid: bff88053-ccc4-92f2-d6f5-de83e950e8cd width: 4 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-YT6eNksV- - - ROW--BIzjz9F0 + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-YT6eNksV- + - ROW--BIzjz9F0 type: CHART COLUMN-IEKAo_QJlz: children: - - CHART-JnpdZOhVer - - CHART-v22McUFMtx + - CHART-JnpdZOhVer + - CHART-v22McUFMtx id: COLUMN-IEKAo_QJlz meta: background: BACKGROUND_TRANSPARENT width: 4 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-YT6eNksV- - - ROW--BIzjz9F0 + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-YT6eNksV- + - ROW--BIzjz9F0 type: COLUMN COLUMN-OJ5spdMmNh: children: - - CHART-VvFbGxi3X_ - - CHART-UtSaz4pfV6 + - CHART-VvFbGxi3X_ + - CHART-UtSaz4pfV6 id: COLUMN-OJ5spdMmNh meta: background: BACKGROUND_TRANSPARENT width: 3 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-AsMaxdYL_t - - ROW-UsW-_RPAb + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-AsMaxdYL_t + - ROW-UsW-_RPAb type: COLUMN DASHBOARD_VERSION_KEY: v2 GRID_ID: children: - - TABS-L-d9eyOE-b + - TABS-L-d9eyOE-b id: GRID_ID parents: - - ROOT_ID + - ROOT_ID type: GRID HEADER_ID: id: HEADER_ID @@ -453,21 +439,21 @@ position: height: 
50 width: 4 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-YT6eNksV- - - ROW-DR80aHJA2c + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-YT6eNksV- + - ROW-DR80aHJA2c type: MARKDOWN MARKDOWN-NQmSPDOtpl: children: [] id: MARKDOWN-NQmSPDOtpl meta: - code: '# Current Developers + code: "# Current Developers - While majority of the students on FCC are Aspiring developers, there''s a - nontrivial minority that''s there to continue leveling up their skills (17% + While majority of the students on FCC are Aspiring developers, there's a + nontrivial minority that's there to continue leveling up their skills (17% of the survey respondents). @@ -480,28 +466,28 @@ position: - The proportion of developers whose current job is their first developer job - - Distribution of last year''s income + - Distribution of last year's income - The geographic distribution of these developers - The overlap between commute time and if their current job is their first developer job - - Potential link between highest degree earned and last year''s income' - height: 50 + - Potential link between highest degree earned and last year's income" + height: 56 width: 4 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-l_9I0aNYZ - - ROW-b7USYEngT + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-l_9I0aNYZ + - ROW-b7USYEngT type: MARKDOWN MARKDOWN-__u6CsUyfh: children: [] id: MARKDOWN-__u6CsUyfh meta: - code: '## FreeCodeCamp New Coder Survey 2018 + code: "## FreeCodeCamp New Coder Survey 2018 Every year, FCC surveys its user base (mostly budding software developers) @@ -513,21 +499,22 @@ position: - [Dataset](https://github.com/freeCodeCamp/2018-new-coder-survey) - - [FCC Blog Post](https://www.freecodecamp.org/news/we-asked-20-000-people-who-they-are-and-how-theyre-learning-to-code-fff5d668969/)' - height: 45 - width: 3 - parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-AsMaxdYL_t - - ROW-y-GwJPgxLr + - [FCC Blog 
Post](https://www.freecodecamp.org/news/we-asked-20-000-people-who-they-are-and-how-theyre-learning-to-code-fff5d668969/)" + height: 30 + width: 6 + parents: + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-AsMaxdYL_t + - ROW-y-GwJPgxLr type: MARKDOWN MARKDOWN-zc2mWxZeox: children: [] id: MARKDOWN-zc2mWxZeox meta: - code: "# Demographics\n\nFreeCodeCamp is a completely-online community of people\ + code: + "# Demographics\n\nFreeCodeCamp is a completely-online community of people\ \ learning to code and consists of aspiring & current developers from all\ \ over the world. That doesn't necessarily mean that access to these types\ \ of opportunities are evenly distributed. \n\nThe following charts can begin\ @@ -537,243 +524,220 @@ position: height: 52 width: 3 parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-AsMaxdYL_t - - ROW-mOvr_xWm1 + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-AsMaxdYL_t + - ROW-mOvr_xWm1 type: MARKDOWN ROOT_ID: children: - - GRID_ID + - GRID_ID id: ROOT_ID type: ROOT ROW--BIzjz9F0: children: - - COLUMN-IEKAo_QJlz - - CHART-lQVSAw0Or3 - - CHART-wxWVtlajRF + - COLUMN-IEKAo_QJlz + - CHART-lQVSAw0Or3 + - CHART-wxWVtlajRF id: ROW--BIzjz9F0 meta: background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-YT6eNksV- + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-YT6eNksV- type: ROW ROW-DR80aHJA2c: children: - - MARKDOWN-BUmyHM2s0x - - CHART-XHncHuS5pZ - - CHART--w_Br1tPP3 - - CHART-FKuVqq4kaA + - MARKDOWN-BUmyHM2s0x + - CHART-XHncHuS5pZ + - CHART--w_Br1tPP3 + - CHART-FKuVqq4kaA id: ROW-DR80aHJA2c meta: background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-YT6eNksV- + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-YT6eNksV- type: ROW ROW-UsW-_RPAb: children: - - COLUMN-OJ5spdMmNh - - CHART-fLpTSAHpAO + - COLUMN-OJ5spdMmNh + - CHART-fLpTSAHpAO id: ROW-UsW-_RPAb meta: background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - GRID_ID - - 
TABS-L-d9eyOE-b - - TAB-AsMaxdYL_t + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-AsMaxdYL_t type: ROW ROW-b7USYEngT: children: - - MARKDOWN-NQmSPDOtpl - - CHART--0GPGmD-pO - - CHART-QVql08s5Bv - - CHART-0-zzTwBINh + - MARKDOWN-NQmSPDOtpl + - CHART--0GPGmD-pO + - CHART-QVql08s5Bv + - CHART-0-zzTwBINh id: ROW-b7USYEngT meta: background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-l_9I0aNYZ + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-l_9I0aNYZ type: ROW ROW-kNjtGVFpp: children: - - CHART-5QwNlSbXYU - - CHART-37fu7fO6Z0 + - CHART-5QwNlSbXYU + - CHART-37fu7fO6Z0 id: ROW-kNjtGVFpp meta: background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-l_9I0aNYZ + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-l_9I0aNYZ type: ROW ROW-mOvr_xWm1: children: - - MARKDOWN-zc2mWxZeox - - CHART-Q3pbwsH3id - - CHART-o-JPAWMZK- - - CHART-YSzS5GOOLf + - MARKDOWN-zc2mWxZeox + - CHART-Q3pbwsH3id + - CHART-o-JPAWMZK- + - CHART-YSzS5GOOLf id: ROW-mOvr_xWm1 meta: background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-AsMaxdYL_t + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-AsMaxdYL_t type: ROW ROW-s3l4os7YY: children: - - CHART-LjfhrUkEef - - CHART-ZECnzPz8Bi + - CHART-LjfhrUkEef + - CHART-ZECnzPz8Bi id: ROW-s3l4os7YY meta: background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-l_9I0aNYZ + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-l_9I0aNYZ type: ROW ROW-y-GwJPgxLr: children: - - MARKDOWN-__u6CsUyfh - - CHART-aytwlT4GAq - - CHART-d6vjW6rC6V + - MARKDOWN-__u6CsUyfh + - CHART-aytwlT4GAq id: ROW-y-GwJPgxLr meta: background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b - - TAB-AsMaxdYL_t + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b + - TAB-AsMaxdYL_t type: ROW TAB-AsMaxdYL_t: children: - - ROW-y-GwJPgxLr - - ROW-mOvr_xWm1 - - ROW-UsW-_RPAb + - ROW-y-GwJPgxLr + - ROW-mOvr_xWm1 + - 
ROW-UsW-_RPAb id: TAB-AsMaxdYL_t meta: text: Overview parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b type: TAB TAB-YT6eNksV-: children: - - ROW-DR80aHJA2c - - ROW--BIzjz9F0 + - ROW-DR80aHJA2c + - ROW--BIzjz9F0 id: TAB-YT6eNksV- meta: text: "\U0001F680 Aspiring Developers" parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b type: TAB TAB-l_9I0aNYZ: children: - - ROW-b7USYEngT - - ROW-kNjtGVFpp - - ROW-s3l4os7YY + - ROW-b7USYEngT + - ROW-kNjtGVFpp + - ROW-s3l4os7YY id: TAB-l_9I0aNYZ meta: text: "\U0001F4BB Current Developers" parents: - - ROOT_ID - - GRID_ID - - TABS-L-d9eyOE-b + - ROOT_ID + - GRID_ID + - TABS-L-d9eyOE-b type: TAB TABS-L-d9eyOE-b: children: - - TAB-AsMaxdYL_t - - TAB-YT6eNksV- - - TAB-l_9I0aNYZ + - TAB-AsMaxdYL_t + - TAB-YT6eNksV- + - TAB-l_9I0aNYZ id: TABS-L-d9eyOE-b meta: {} parents: - - ROOT_ID - - GRID_ID + - ROOT_ID + - GRID_ID type: TABS metadata: timed_refresh_immune_slices: [] expanded_slices: {} refresh_frequency: 0 - default_filters: '{}' + default_filters: "{}" color_scheme: supersetColors - filter_scopes: - '1387': - ethnic_minority: - scope: - - TAB-AsMaxdYL_t - immune: [] - gender: - scope: - - ROOT_ID - immune: [] - developer_type: - scope: - - ROOT_ID - immune: [] - lang_at_home: - scope: - - ROOT_ID - immune: [] - country_live: - scope: - - ROOT_ID - immune: [] label_colors: - '0': '#FCC700' - '1': '#A868B7' - '15': '#3CCCCB' - '30': '#A38F79' - '45': '#8FD3E4' - age: '#1FA8C9' - Yes,: '#1FA8C9' - Female: '#454E7C' - Prefer: '#5AC189' - No,: '#FF7F44' - Male: '#666666' - Prefer not to say: '#E04355' - Ph.D.: '#FCC700' - associate's degree: '#A868B7' - bachelor's degree: '#3CCCCB' - high school diploma or equivalent (GED): '#A38F79' - master's degree (non-professional): '#8FD3E4' - no high school (secondary school): '#A1A6BD' - professional degree (MBA, MD, JD, etc.): '#ACE1C4' - some college credit, no degree: '#FEC0A1' - some high school: 
'#B2B2B2' - trade, technical, or vocational training: '#EFA1AA' - No, not an ethnic minority: '#1FA8C9' - Yes, an ethnic minority: '#454E7C' - <NULL>: '#5AC189' - 'Yes': '#FF7F44' - 'No': '#666666' - last_yr_income: '#E04355' - More: '#A1A6BD' - Less: '#ACE1C4' - I: '#FEC0A1' - expected_earn: '#B2B2B2' - 'Yes: Willing To': '#EFA1AA' - 'No: Not Willing to': '#FDE380' - No Answer: '#D3B3DA' - In an Office (with Other Developers): '#9EE5E5' - No Preference: '#D1C6BC' - From Home: '#1FA8C9' + "0": "#FCC700" + "1": "#A868B7" + "15": "#3CCCCB" + "30": "#A38F79" + "45": "#8FD3E4" + age: "#1FA8C9" + Yes,: "#1FA8C9" + Female: "#454E7C" + Prefer: "#5AC189" + No,: "#FF7F44" + Male: "#666666" + Prefer not to say: "#E04355" + Ph.D.: "#FCC700" + associate's degree: "#A868B7" + bachelor's degree: "#3CCCCB" + high school diploma or equivalent (GED): "#A38F79" + master's degree (non-professional): "#8FD3E4" + no high school (secondary school): "#A1A6BD" + professional degree (MBA, MD, JD, etc.): "#ACE1C4" + some college credit, no degree: "#FEC0A1" + some high school: "#B2B2B2" + trade, technical, or vocational training: "#EFA1AA" + No, not an ethnic minority: "#1FA8C9" + Yes, an ethnic minority: "#454E7C" + <NULL>: "#5AC189" + "Yes": "#FF7F44" + "No": "#666666" + last_yr_income: "#E04355" + More: "#A1A6BD" + Less: "#ACE1C4" + I: "#FEC0A1" + expected_earn: "#B2B2B2" + "Yes: Willing To": "#EFA1AA" + "No: Not Willing to": "#FDE380" + No Answer: "#D3B3DA" + In an Office (with Other Developers): "#9EE5E5" + No Preference: "#D1C6BC" + From Home: "#1FA8C9" version: 1.0.0 diff --git a/superset/examples/configs/dashboards/Sales_Dashboard.yaml b/superset/examples/configs/dashboards/Sales_Dashboard.yaml index 3efea3af2599b..439b763d0cfca 100644 --- a/superset/examples/configs/dashboards/Sales_Dashboard.yaml +++ b/superset/examples/configs/dashboards/Sales_Dashboard.yaml @@ -16,8 +16,11 @@ # under the License. 
dashboard_title: Sales Dashboard description: null -css: '' +css: "" slug: null +certified_by: "" +certification_details: "" +published: true uuid: 04f79081-fb49-7bac-7f14-cc76cd2ad93b position: CHART-1NOOLm5YPs: @@ -31,26 +34,26 @@ position: uuid: c3d643cd-fd6f-4659-a5b7-59402487a8d0 width: 2 parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-d-E0Zc1cTH - - ROW-Tyv02UA_6W - - COLUMN-8Rp54B6ikC + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-d-E0Zc1cTH + - ROW-Tyv02UA_6W + - COLUMN-8Rp54B6ikC type: CHART CHART-AYpv8gFi_q: children: [] id: CHART-AYpv8gFi_q meta: chartId: 2810 - height: 91 + height: 70 sliceName: Number of Deals (for each Combination) uuid: bd20fc69-dd51-46c1-99b5-09e37a434bf1 - width: 3 + width: 6 parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-4fthLQmdX - - ROW-0l1WcDzW3 + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-4fthLQmdX + - ROW-0l1WcDzW3 type: CHART CHART-KKT9BsnUst: children: [] @@ -63,90 +66,74 @@ position: uuid: db9609e4-9b78-4a32-87a7-4d9e19d51cd8 width: 7 parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-d-E0Zc1cTH - - ROW-oAtmu5grZ + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-d-E0Zc1cTH + - ROW-oAtmu5grZ type: CHART CHART-OJ9aWDmn1q: children: [] id: CHART-OJ9aWDmn1q meta: chartId: 2808 - height: 91 + height: 70 sliceName: Proportion of Revenue by Product Line sliceNameOverride: Proportion of Monthly Revenue by Product Line uuid: 08aff161-f60c-4cb3-a225-dc9b1140d2e3 width: 6 parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-4fthLQmdX - - ROW-0l1WcDzW3 + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-4fthLQmdX + - ROW-0l1WcDzW3 type: CHART CHART-YFg-9wHE7s: children: [] id: CHART-YFg-9wHE7s meta: chartId: 2811 - height: 63 + height: 49 sliceName: Seasonality of Revenue (per Product Line) uuid: cf0da099-b3ab-4d94-ab62-cf353ac3c611 width: 6 parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-4fthLQmdX - - ROW-E7MDSGfnm + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-4fthLQmdX + - ROW-E7MDSGfnm type: CHART CHART-_LMKI0D3tj: children: [] id: CHART-_LMKI0D3tj meta: chartId: 2809 - 
height: 62 - sliceName: Revenue by Deal SIze + height: 49 + sliceName: Revenue by Deal Size sliceNameOverride: Monthly Revenue by Deal SIze uuid: f065a533-2e13-42b9-bd19-801a21700dff width: 6 parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-4fthLQmdX - - ROW-E7MDSGfnm + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-4fthLQmdX + - ROW-E7MDSGfnm type: CHART CHART-id4RGv80N-: children: [] id: CHART-id4RGv80N- meta: chartId: 2807 - height: 40 + height: 59 sliceName: Total Items Sold (By Product Line) sliceNameOverride: Total Products Sold (By Product Line) uuid: b8b7ca30-6291-44b0-bc64-ba42e2892b86 width: 2 parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-d-E0Zc1cTH - - ROW-oAtmu5grZ - - COLUMN-G6_2DvG8aK - type: CHART - CHART-iyvXMcqHt9: - children: [] - id: CHART-iyvXMcqHt9 - meta: - chartId: 671 - height: 39 - sliceName: Filter - uuid: a5689df7-98fc-7c51-602c-ebd92dc3ec70 - width: 2 - parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-4fthLQmdX - - ROW-0l1WcDzW3 - - COLUMN-jlNWyWCfTC + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-d-E0Zc1cTH + - ROW-oAtmu5grZ + - COLUMN-G6_2DvG8aK type: CHART CHART-j24u8ve41b: children: [] @@ -159,10 +146,10 @@ position: uuid: 09c497e0-f442-1121-c9e7-671e37750424 width: 3 parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-d-E0Zc1cTH - - ROW-oAtmu5grZ + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-d-E0Zc1cTH + - ROW-oAtmu5grZ type: CHART CHART-lFanAaYKBK: children: [] @@ -174,11 +161,11 @@ position: uuid: 7b12a243-88e0-4dc5-ac33-9a840bb0ac5a width: 3 parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-d-E0Zc1cTH - - ROW-Tyv02UA_6W - - COLUMN-8Rp54B6ikC + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-d-E0Zc1cTH + - ROW-Tyv02UA_6W + - COLUMN-8Rp54B6ikC type: CHART CHART-vomBOiI7U9: children: [] @@ -191,58 +178,44 @@ position: uuid: 692aca26-a526-85db-c94c-411c91cc1077 width: 7 parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-d-E0Zc1cTH - - ROW-Tyv02UA_6W + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-d-E0Zc1cTH + - ROW-Tyv02UA_6W type: CHART COLUMN-8Rp54B6ikC: children: - - 
CHART-lFanAaYKBK - - CHART-1NOOLm5YPs + - CHART-lFanAaYKBK + - CHART-1NOOLm5YPs id: COLUMN-8Rp54B6ikC meta: background: BACKGROUND_TRANSPARENT width: 2 parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-d-E0Zc1cTH - - ROW-Tyv02UA_6W + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-d-E0Zc1cTH + - ROW-Tyv02UA_6W type: COLUMN COLUMN-G6_2DvG8aK: children: - - CHART-id4RGv80N- + - CHART-id4RGv80N- id: COLUMN-G6_2DvG8aK meta: background: BACKGROUND_TRANSPARENT width: 2 parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-d-E0Zc1cTH - - ROW-oAtmu5grZ - type: COLUMN - COLUMN-jlNWyWCfTC: - children: - - MARKDOWN-HrzsMmvGQo - - CHART-iyvXMcqHt9 - id: COLUMN-jlNWyWCfTC - meta: - background: BACKGROUND_TRANSPARENT - width: 3 - parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-4fthLQmdX - - ROW-0l1WcDzW3 + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-d-E0Zc1cTH + - ROW-oAtmu5grZ type: COLUMN DASHBOARD_VERSION_KEY: v2 GRID_ID: children: [] id: GRID_ID parents: - - ROOT_ID + - ROOT_ID type: GRID HEADER_ID: id: HEADER_ID @@ -253,7 +226,8 @@ position: children: [] id: MARKDOWN--AtDSWnapE meta: - code: "# \U0001F697 Vehicle Sales Dashboard \U0001F3CD\n\nThis example dashboard\ + code: + "# \U0001F697 Vehicle Sales Dashboard \U0001F3CD\n\nThis example dashboard\ \ provides insight into the business operations of vehicle seller. The dataset\ \ powering this dashboard can be found [here on Kaggle](https://www.kaggle.com/kyanyoga/sample-sales-data).\n\ \n### Timeline\n\nThe dataset contains data on all orders from the 2003 and\ @@ -265,151 +239,113 @@ position: height: 53 width: 3 parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-d-E0Zc1cTH - - ROW-Tyv02UA_6W - type: MARKDOWN - MARKDOWN-HrzsMmvGQo: - children: [] - id: MARKDOWN-HrzsMmvGQo - meta: - code: "# \U0001F50D Filter Box\n\nDashboard filters are a powerful way to enable\ - \ teams to dive deeper into their business operations data. 
This filter box\ - \ helps focus the charts along the following variables:\n\n- Time Range: Focus\ - \ in on a specific time period (e.g. a holiday or quarter)\n- Product Line:\ - \ Choose 1 or more product lines to see relevant sales data\n- Deal Size:\ - \ Zoom in on small, medium, and / or large sales deals\n\nThe filter box below\ - \ \U0001F447 is configured to only apply to the charts in this tab (**Exploratory**).\ - \ You can customize the charts that this filter box applies to by:\n\n- entering\ - \ Edit mode in this dashboard\n- selecting the `...` in the top right corner\n\ - - selecting the **Set filter mapping** button" - height: 50 - width: 3 - parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-4fthLQmdX - - ROW-0l1WcDzW3 - - COLUMN-jlNWyWCfTC + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-d-E0Zc1cTH + - ROW-Tyv02UA_6W type: MARKDOWN ROOT_ID: children: - - TABS-e5Ruro0cjP + - TABS-e5Ruro0cjP id: ROOT_ID type: ROOT ROW-0l1WcDzW3: children: - - COLUMN-jlNWyWCfTC - - CHART-OJ9aWDmn1q - - CHART-AYpv8gFi_q + - CHART-OJ9aWDmn1q + - CHART-AYpv8gFi_q id: ROW-0l1WcDzW3 meta: background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-4fthLQmdX + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-4fthLQmdX type: ROW ROW-E7MDSGfnm: children: - - CHART-YFg-9wHE7s - - CHART-_LMKI0D3tj + - CHART-YFg-9wHE7s + - CHART-_LMKI0D3tj id: ROW-E7MDSGfnm meta: background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-4fthLQmdX + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-4fthLQmdX type: ROW ROW-Tyv02UA_6W: children: - - COLUMN-8Rp54B6ikC - - CHART-vomBOiI7U9 - - MARKDOWN--AtDSWnapE + - COLUMN-8Rp54B6ikC + - CHART-vomBOiI7U9 + - MARKDOWN--AtDSWnapE id: ROW-Tyv02UA_6W meta: background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-d-E0Zc1cTH + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-d-E0Zc1cTH type: ROW ROW-oAtmu5grZ: children: - - COLUMN-G6_2DvG8aK - - CHART-KKT9BsnUst - - CHART-j24u8ve41b + - COLUMN-G6_2DvG8aK + - 
CHART-KKT9BsnUst + - CHART-j24u8ve41b id: ROW-oAtmu5grZ meta: background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - TABS-e5Ruro0cjP - - TAB-d-E0Zc1cTH + - ROOT_ID + - TABS-e5Ruro0cjP + - TAB-d-E0Zc1cTH type: ROW TAB-4fthLQmdX: children: - - ROW-0l1WcDzW3 - - ROW-E7MDSGfnm + - ROW-0l1WcDzW3 + - ROW-E7MDSGfnm id: TAB-4fthLQmdX meta: text: "\U0001F9ED Exploratory" parents: - - ROOT_ID - - TABS-e5Ruro0cjP + - ROOT_ID + - TABS-e5Ruro0cjP type: TAB TAB-d-E0Zc1cTH: children: - - ROW-Tyv02UA_6W - - ROW-oAtmu5grZ + - ROW-Tyv02UA_6W + - ROW-oAtmu5grZ id: TAB-d-E0Zc1cTH meta: text: "\U0001F3AF Sales Overview" parents: - - ROOT_ID - - TABS-e5Ruro0cjP + - ROOT_ID + - TABS-e5Ruro0cjP type: TAB TABS-e5Ruro0cjP: children: - - TAB-d-E0Zc1cTH - - TAB-4fthLQmdX + - TAB-d-E0Zc1cTH + - TAB-4fthLQmdX id: TABS-e5Ruro0cjP meta: {} parents: - - ROOT_ID + - ROOT_ID type: TABS metadata: timed_refresh_immune_slices: [] expanded_slices: {} refresh_frequency: 0 - default_filters: '{"671": {"__time_range": "No filter"}}' - filter_scopes: - "671": - product_line: - scope: - - TAB-4fthLQmdX - immune: [] - deal_size: - scope: - - ROOT_ID - immune: [] - __time_range: - scope: - - ROOT_ID - immune: [] + default_filters: "{}" color_scheme: supersetColors label_colors: - Medium: '#1FA8C9' - Small: '#454E7C' - Large: '#5AC189' - SUM(SALES): '#1FA8C9' - Classic Cars: '#454E7C' - Vintage Cars: '#5AC189' - Motorcycles: '#FF7F44' - Trucks and Buses: '#666666' - Planes: '#E04355' - Ships: '#FCC700' - Trains: '#A868B7' + Medium: "#1FA8C9" + Small: "#454E7C" + Large: "#5AC189" + SUM(SALES): "#1FA8C9" + Classic Cars: "#454E7C" + Vintage Cars: "#5AC189" + Motorcycles: "#FF7F44" + Trucks and Buses: "#666666" + Planes: "#E04355" + Ships: "#FCC700" + Trains: "#A868B7" version: 1.0.0 diff --git a/superset/examples/configs/dashboards/Video_Game_Sales.yaml b/superset/examples/configs/dashboards/Video_Game_Sales.yaml index 958d32b0696b7..2edaad2d1a8bc 100644 --- 
a/superset/examples/configs/dashboards/Video_Game_Sales.yaml +++ b/superset/examples/configs/dashboards/Video_Game_Sales.yaml @@ -16,39 +16,27 @@ # under the License. dashboard_title: Video Game Sales description: null -css: '' +css: "" slug: null +certified_by: "" +certification_details: "" +published: true uuid: c7bc10f4-6a2d-7569-caae-bbc91864ee11 position: - CHART-1L7NIcXvVN: - children: [] - id: CHART-1L7NIcXvVN - meta: - chartId: 3544 - height: 79 - sliceName: Games per Genre over time - uuid: 0f8976aa-7bb4-40c7-860b-64445a51aaaf - width: 6 - parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-2_QXp8aNq - - ROW-fjg6YQBkH - type: CHART CHART-7mKdnU7OUJ: children: [] id: CHART-7mKdnU7OUJ meta: chartId: 3545 - height: 80 + height: 55 sliceName: Games per Genre uuid: 0499bdec-0837-44f3-ae8a-8c670de81afd - width: 3 + width: 8 parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-2_QXp8aNq - - ROW-yP9SB89PZ + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-2_QXp8aNq + - ROW-yP9SB89PZ type: CHART CHART-8OG3UJX-Tn: children: [] @@ -56,15 +44,15 @@ position: meta: chartId: 661 height: 54 - sliceName: '# of Games That Hit 100k in Sales By Release Year' - sliceNameOverride: 'Top 10 Consoles, by # of Hit Games' + sliceName: "# of Games That Hit 100k in Sales By Release Year" + sliceNameOverride: "Top 10 Consoles, by # of Hit Games" uuid: 2b69887b-23e3-b46d-d38c-8ea11856c555 width: 6 parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-lg-5ymUDgm - - ROW-7kAf1blYU + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-lg-5ymUDgm + - ROW-7kAf1blYU type: CHART CHART-W02beJK7ms: children: [] @@ -77,10 +65,10 @@ position: uuid: d20b7324-3b80-24d4-37e2-3bd583b66713 width: 3 parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-lg-5ymUDgm - - ROW-7kAf1blYU + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-lg-5ymUDgm + - ROW-7kAf1blYU type: CHART CHART-XFag0yZdLk: children: [] @@ -93,10 +81,10 @@ position: uuid: 1810975a-f6d4-07c3-495c-c3b535d01f21 width: 3 parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-lg-5ymUDgm - - 
ROW-7kAf1blYU + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-lg-5ymUDgm + - ROW-7kAf1blYU type: CHART CHART-XRvRfsMsaQ: children: [] @@ -104,14 +92,14 @@ position: meta: chartId: 3546 height: 62 - sliceName: 'Top 10 Games: Proportion of Sales in Markets' + sliceName: "Top 10 Games: Proportion of Sales in Markets" uuid: a40879d5-653a-42fe-9314-bbe88ad26e92 width: 6 parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-lg-5ymUDgm - - ROW-NuR8GFQTO + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-lg-5ymUDgm + - ROW-NuR8GFQTO type: CHART CHART-XVIYTeubZh: children: [] @@ -121,12 +109,12 @@ position: height: 80 sliceName: Games uuid: 2a5e562b-ab37-1b9b-1de3-1be4335c8e83 - width: 5 + width: 6 parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-2_QXp8aNq - - ROW-yP9SB89PZ + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-2_QXp8aNq + - ROW-yP9SB89PZ type: CHART CHART-_sx22yawJO: children: [] @@ -138,78 +126,45 @@ position: uuid: 326fc7e5-b7f1-448e-8a6f-80d0e7ce0b64 width: 6 parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-lg-5ymUDgm - - ROW-NuR8GFQTO + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-lg-5ymUDgm + - ROW-NuR8GFQTO type: CHART CHART-nYns6xr4Ft: children: [] id: CHART-nYns6xr4Ft meta: chartId: 3548 - height: 79 + height: 80 sliceName: Total Sales per Market (Grouped by Genre) uuid: d8bf948e-46fd-4380-9f9c-a950c34bcc92 width: 6 parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-2_QXp8aNq - - ROW-fjg6YQBkH - type: CHART - CHART-uP9GF0z0rT: - children: [] - id: CHART-uP9GF0z0rT - meta: - chartId: 3547 - height: 45 - sliceName: Filter - uuid: fd9ce7ec-ae08-4f71-93e0-7c26b132b2e6 - width: 4 - parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-2_QXp8aNq - - ROW-yP9SB89PZ - - COLUMN-F53B1OSMcz - type: CHART - CHART-wt6ZO8jRXZ: - children: [] - id: CHART-wt6ZO8jRXZ - meta: - chartId: 659 - height: 72 - sliceName: Rise & Fall of Video Game Consoles - sliceNameOverride: Global Sales per Console - uuid: 83b0e2d0-d38b-d980-ed8e-e1c9846361b6 - width: 12 - parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-lg-5ymUDgm - - 
ROW-XT1DsNA_V + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-2_QXp8aNq + - ROW-fjg6YQBkH type: CHART COLUMN-F53B1OSMcz: children: - - MARKDOWN-7K5cBNy7qu - - CHART-uP9GF0z0rT + - MARKDOWN-7K5cBNy7qu id: COLUMN-F53B1OSMcz meta: background: BACKGROUND_TRANSPARENT width: 4 parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-2_QXp8aNq - - ROW-yP9SB89PZ + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-2_QXp8aNq + - ROW-yP9SB89PZ type: COLUMN DASHBOARD_VERSION_KEY: v2 GRID_ID: children: [] id: GRID_ID parents: - - ROOT_ID + - ROOT_ID type: GRID HEADER_ID: id: HEADER_ID @@ -220,224 +175,194 @@ position: children: [] id: MARKDOWN-7K5cBNy7qu meta: - code: "# \U0001F93F Explore Trends\n\nDive into data on popular video games\ + code: + "# \U0001F93F Explore Trends\n\nDive into data on popular video games\ \ using the following dimensions:\n\n- Year\n- Platform\n- Publisher\n- Genre\n\ \nTo use the **Filter Games** box below, select values for each dimension\ \ you want to zoom in on and then click **Apply**. \n\nThe filter criteria\ \ you set in this Filter-box will apply to *all* charts in this tab." - height: 33 + height: 55 width: 4 parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-2_QXp8aNq - - ROW-yP9SB89PZ - - COLUMN-F53B1OSMcz + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-2_QXp8aNq + - ROW-yP9SB89PZ + - COLUMN-F53B1OSMcz type: MARKDOWN MARKDOWN-JOZKOjVc3a: children: [] id: MARKDOWN-JOZKOjVc3a meta: - code: "## \U0001F3AEVideo Game Sales\n\nThis dashboard visualizes sales & platform\ + code: + "## \U0001F3AEVideo Game Sales\n\nThis dashboard visualizes sales & platform\ \ data on video games that sold more than 100k copies. 
The data was last updated\ \ in early 2017.\n\n[Original dataset](https://www.kaggle.com/gregorut/videogamesales)" height: 18 width: 12 parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-lg-5ymUDgm - - ROW-0F99WDC-sz + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-lg-5ymUDgm + - ROW-0F99WDC-sz type: MARKDOWN ROOT_ID: children: - - TABS-97PVJa11D_ + - TABS-97PVJa11D_ id: ROOT_ID type: ROOT ROW-0F99WDC-sz: children: - - MARKDOWN-JOZKOjVc3a + - MARKDOWN-JOZKOjVc3a id: ROW-0F99WDC-sz meta: background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-lg-5ymUDgm + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-lg-5ymUDgm type: ROW ROW-7kAf1blYU: children: - - CHART-W02beJK7ms - - CHART-XFag0yZdLk - - CHART-8OG3UJX-Tn + - CHART-W02beJK7ms + - CHART-XFag0yZdLk + - CHART-8OG3UJX-Tn id: ROW-7kAf1blYU meta: - '0': ROOT_ID + "0": ROOT_ID background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-lg-5ymUDgm + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-lg-5ymUDgm type: ROW ROW-NuR8GFQTO: children: - - CHART-_sx22yawJO - - CHART-XRvRfsMsaQ + - CHART-_sx22yawJO + - CHART-XRvRfsMsaQ id: ROW-NuR8GFQTO meta: - '0': ROOT_ID - '1': TABS-97PVJa11D_ - background: BACKGROUND_TRANSPARENT - parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-lg-5ymUDgm - type: ROW - ROW-XT1DsNA_V: - children: - - CHART-wt6ZO8jRXZ - id: ROW-XT1DsNA_V - meta: + "0": ROOT_ID + "1": TABS-97PVJa11D_ background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-lg-5ymUDgm + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-lg-5ymUDgm type: ROW ROW-fjg6YQBkH: children: - - CHART-1L7NIcXvVN - - CHART-nYns6xr4Ft + - CHART-nYns6xr4Ft + - CHART-XVIYTeubZh id: ROW-fjg6YQBkH meta: background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-2_QXp8aNq + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-2_QXp8aNq type: ROW ROW-yP9SB89PZ: children: - - COLUMN-F53B1OSMcz - - CHART-XVIYTeubZh - - CHART-7mKdnU7OUJ + - COLUMN-F53B1OSMcz + - CHART-7mKdnU7OUJ id: ROW-yP9SB89PZ meta: 
background: BACKGROUND_TRANSPARENT parents: - - ROOT_ID - - TABS-97PVJa11D_ - - TAB-2_QXp8aNq + - ROOT_ID + - TABS-97PVJa11D_ + - TAB-2_QXp8aNq type: ROW TAB-2_QXp8aNq: children: - - ROW-yP9SB89PZ - - ROW-fjg6YQBkH + - ROW-yP9SB89PZ + - ROW-fjg6YQBkH id: TAB-2_QXp8aNq meta: text: "\U0001F93F Explore Trends" parents: - - ROOT_ID - - TABS-97PVJa11D_ + - ROOT_ID + - TABS-97PVJa11D_ type: TAB TAB-lg-5ymUDgm: children: - - ROW-0F99WDC-sz - - ROW-XT1DsNA_V - - ROW-7kAf1blYU - - ROW-NuR8GFQTO + - ROW-0F99WDC-sz + - ROW-7kAf1blYU + - ROW-NuR8GFQTO id: TAB-lg-5ymUDgm meta: text: Overview parents: - - ROOT_ID - - TABS-97PVJa11D_ + - ROOT_ID + - TABS-97PVJa11D_ type: TAB TABS-97PVJa11D_: children: - - TAB-lg-5ymUDgm - - TAB-2_QXp8aNq + - TAB-lg-5ymUDgm + - TAB-2_QXp8aNq id: TABS-97PVJa11D_ meta: {} parents: - - ROOT_ID + - ROOT_ID type: TABS metadata: timed_refresh_immune_slices: [] expanded_slices: {} refresh_frequency: 0 - default_filters: '{"3547": {"platform": ["PS", "PS2", "PS3", "XB", "X360"], "__time_range": - "No filter"}}' + default_filters: "{}" color_scheme: supersetColors - filter_scopes: - "3547": - platform: - scope: - - TAB-2_QXp8aNq - immune: [] - genre: - scope: - - ROOT_ID - immune: [] - publisher: - scope: - - ROOT_ID - immune: [] - __time_range: - scope: - - ROOT_ID - immune: [] label_colors: - '0': '#1FA8C9' - '1': '#454E7C' - '2600': '#666666' - Europe: '#5AC189' - Japan: '#FF7F44' - North America: '#666666' - Other: '#E04355' - PS2: '#FCC700' - X360: '#A868B7' - PS3: '#3CCCCB' - Wii: '#A38F79' - DS: '#8FD3E4' - PS: '#A1A6BD' - GBA: '#ACE1C4' - PSP: '#FEC0A1' - PS4: '#B2B2B2' - PC: '#EFA1AA' - GB: '#FDE380' - XB: '#D3B3DA' - NES: '#9EE5E5' - 3DS: '#D1C6BC' - N64: '#1FA8C9' - SNES: '#454E7C' - GC: '#5AC189' - XOne: '#FF7F44' - WiiU: '#E04355' - PSV: '#FCC700' - SAT: '#A868B7' - GEN: '#3CCCCB' - DC: '#A38F79' - SCD: '#8FD3E4' - NG: '#A1A6BD' - WS: '#ACE1C4' - TG16: '#FEC0A1' - 3DO: '#B2B2B2' - GG: '#EFA1AA' - PCFX: '#FDE380' - Nintendo: '#D3B3DA' - 
Take-Two Interactive: '#9EE5E5' - Microsoft Game Studios: '#D1C6BC' - Action: '#1FA8C9' - Adventure: '#454E7C' - Fighting: '#5AC189' - Misc: '#FF7F44' - Platform: '#666666' - Puzzle: '#E04355' - Racing: '#FCC700' - Role-Playing: '#A868B7' - Shooter: '#3CCCCB' - Simulation: '#A38F79' - Sports: '#8FD3E4' - Strategy: '#A1A6BD' + "0": "#1FA8C9" + "1": "#454E7C" + "2600": "#666666" + Europe: "#5AC189" + Japan: "#FF7F44" + North America: "#666666" + Other: "#E04355" + PS2: "#FCC700" + X360: "#A868B7" + PS3: "#3CCCCB" + Wii: "#A38F79" + DS: "#8FD3E4" + PS: "#A1A6BD" + GBA: "#ACE1C4" + PSP: "#FEC0A1" + PS4: "#B2B2B2" + PC: "#EFA1AA" + GB: "#FDE380" + XB: "#D3B3DA" + NES: "#9EE5E5" + 3DS: "#D1C6BC" + N64: "#1FA8C9" + SNES: "#454E7C" + GC: "#5AC189" + XOne: "#FF7F44" + WiiU: "#E04355" + PSV: "#FCC700" + SAT: "#A868B7" + GEN: "#3CCCCB" + DC: "#A38F79" + SCD: "#8FD3E4" + NG: "#A1A6BD" + WS: "#ACE1C4" + TG16: "#FEC0A1" + 3DO: "#B2B2B2" + GG: "#EFA1AA" + PCFX: "#FDE380" + Nintendo: "#D3B3DA" + Take-Two Interactive: "#9EE5E5" + Microsoft Game Studios: "#D1C6BC" + Action: "#1FA8C9" + Adventure: "#454E7C" + Fighting: "#5AC189" + Misc: "#FF7F44" + Platform: "#666666" + Puzzle: "#E04355" + Racing: "#FCC700" + Role-Playing: "#A868B7" + Shooter: "#3CCCCB" + Simulation: "#A38F79" + Sports: "#8FD3E4" + Strategy: "#A1A6BD" version: 1.0.0 diff --git a/superset/examples/configs/datasets/examples/FCC_2018_Survey.yaml b/superset/examples/configs/datasets/examples/FCC_2018_Survey.yaml index 5bbbe2f74b934..85aeb51eb9cdc 100644 --- a/superset/examples/configs/datasets/examples/FCC_2018_Survey.yaml +++ b/superset/examples/configs/datasets/examples/FCC_2018_Survey.yaml @@ -21,7 +21,7 @@ default_endpoint: null offset: 0 cache_timeout: null schema: null -sql: '' +sql: "" params: null template_params: null filter_select_enabled: true @@ -29,1487 +29,1465 @@ fetch_values_predicate: null extra: null uuid: d95a2865-53ce-1f82-a53d-8e3c89331469 metrics: -- metric_name: count - verbose_name: COUNT(*) - 
metric_type: null - expression: COUNT(*) - description: null - d3format: null - extra: null - warning_text: null + - metric_name: count + verbose_name: COUNT(*) + metric_type: null + expression: COUNT(*) + description: null + d3format: null + extra: null + warning_text: null columns: -- column_name: highest_degree_earned - verbose_name: Highest Degree Earned - is_dttm: false - is_active: null - type: STRING - groupby: true - filterable: true - expression: "CASE \n WHEN school_degree = 'no high school (secondary school)'\ - \ THEN 'A. No high school (secondary school)'\n WHEN school_degree = 'some\ - \ high school' THEN 'B. Some high school'\n WHEN school_degree = 'high school\ - \ diploma or equivalent (GED)' THEN 'C. High school diploma or equivalent (GED)'\ - \n WHEN school_degree = 'associate''s degree' THEN 'D. Associate''s degree'\ - \n WHEN school_degree = 'some college credit, no degree' THEN 'E. Some college\ - \ credit, no degree'\n WHEN school_degree = 'bachelor''s degree' THEN 'F.\ - \ Bachelor''s degree'\n WHEN school_degree = 'trade, technical, or vocational\ - \ training' THEN 'G. Trade, technical, or vocational training'\n WHEN school_degree\ - \ = 'master''s degree (non-professional)' THEN 'H. Master''s degree (non-professional)'\ - \n WHEN school_degree = 'Ph.D.' THEN 'I. Ph.D.'\n WHEN school_degree = '\ - professional degree (MBA, MD, JD, etc.)' THEN 'J. 
Professional degree (MBA,\ - \ MD, JD, etc.)'\nEND" - description: Highest Degree Earned - python_date_format: null -- column_name: job_location_preference - verbose_name: Job Location Preference - is_dttm: false - is_active: null - type: null - groupby: true - filterable: true - expression: "case \nwhen job_lctn_pref is Null then 'No Answer' \nwhen job_lctn_pref\ - \ = 'from home' then 'From Home'\nwhen job_lctn_pref = 'no preference' then 'No\ - \ Preference'\nwhen job_lctn_pref = 'in an office with other developers' then\ - \ 'In an Office (with Other Developers)'\nelse job_lctn_pref\nend " - description: null - python_date_format: null -- column_name: ethnic_minority - verbose_name: Ethnic Minority - is_dttm: null - is_active: null - type: STRING - groupby: true - filterable: true - expression: "CASE \nWHEN is_ethnic_minority = 0 THEN 'No, not an ethnic minority'\ - \ \nWHEN is_ethnic_minority = 1 THEN 'Yes, an ethnic minority' \nELSE 'No Answer'\n\ - END" - description: null - python_date_format: null -- column_name: willing_to_relocate - verbose_name: Willing To Relocate - is_dttm: false - is_active: null - type: STRING - groupby: true - filterable: true - expression: "CASE \nWHEN job_relocate = 0 THEN 'No: Not Willing to' \nWHEN job_relocate\ - \ = 1 THEN 'Yes: Willing To'\nELSE 'No Answer'\nEND" - description: null - python_date_format: null -- column_name: developer_type - verbose_name: Developer Type - is_dttm: false - is_active: null - type: STRING - groupby: true - filterable: true - expression: CASE WHEN is_software_dev = 0 THEN 'Aspiring Developer' WHEN is_software_dev - = 1 THEN 'Currently A Developer' END - description: null - python_date_format: null -- column_name: first_time_developer - verbose_name: First Time Developer - is_dttm: false - is_active: null - type: null - groupby: true - filterable: true - expression: "CASE \nWHEN is_first_dev_job = 0 THEN 'No' \nWHEN is_first_dev_job\ - \ = 1 THEN 'Yes' \nELSE 'No Answer'\nEND" - description: null 
- python_date_format: null -- column_name: gender - verbose_name: null - is_dttm: null - is_active: null - type: STRING - groupby: true - filterable: true - expression: "CASE \nWHEN gender = 'Male' THEN 'Male'\nWHEN gender = 'Female' THEN\ - \ 'Female'\nELSE 'Prefer Not to Say'\nEND" - description: null - python_date_format: null -- column_name: calc_first_time_dev - verbose_name: null - is_dttm: false - is_active: null - type: STRING - groupby: true - filterable: true - expression: CASE WHEN is_first_dev_job = 0 THEN 'No' WHEN is_first_dev_job = 1 THEN - 'Yes' END - description: null - python_date_format: null -- column_name: yt_codingtuts360 - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: is_recv_disab_bnft - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_intr_qa_engn - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: has_high_spd_ntnet - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: is_first_dev_job - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_intr_ux_engn - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: bootcamp_have_loan - verbose_name: null - is_dttm: false - is_active: 
null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_js_jabber - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_intr_datasci - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_intr_dataengn - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_khan_acdm - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: has_finance_depends - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: has_served_military - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_intr_backend - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_intr_teacher - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: months_job_search - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - 
filterable: true - expression: null - description: null - python_date_format: null -- column_name: student_debt_has - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: student_debt_amt - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_intr_gamedev - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_code_wars - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: do_finance_support - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: last_yr_income - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: is_software_dev - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: money_for_learning - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: home_mrtg_has - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - 
python_date_format: null -- column_name: job_intr_mobile - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_intr_infosec - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_intr_fllstck - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_intr_frntend - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_intr_devops - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_intr_projm - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_css_tricks - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_cs_dojo - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: is_ethnic_minority - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_mit_ocw - verbose_name: null 
- is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: is_self_employed - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: home_mrtg_owe - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_engn_truth - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: bootcamp_attend - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_derekbanas - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_learncodeacdm - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_changelog - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_hackerrank - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_devtea - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true 
- filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_sedaily - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_seradio - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_gamejam - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_geekspeak - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_talkpythonme - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_hanselmnts - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_syntaxfm - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_shoptalk - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_mozillahacks - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - 
python_date_format: null -- column_name: podcast_codingblcks - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_codenewbie - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: bootcamp_recommend - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_railsbrdg - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: bootcamp_finished - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_rubyrogues - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_relocate - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: debt_amt - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_codeacdm - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_fcc - 
verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_codepenrd - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_fullstckrd - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_hackthn - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_udacity - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_ltcwm - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_coursera - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_djangogrls - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_startupwknd - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_progthrwdwn - verbose_name: null - is_dttm: false - is_active: null - type: 
DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: expected_earn - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_egghead - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_railsgrls - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: has_children - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_frnthppyhr - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_codingtrain - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_lynda - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_pluralsight - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: hours_learning - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - 
description: null - python_date_format: null -- column_name: yt_simplilearn - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_wkndbtcmp - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_fcc - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_fcc - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_coderdojo - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_nodeschl - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_womenwc - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_confs - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_fcc - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_girldevit - 
verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_meetup - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_workshps - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_frntendmstr - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: num_children - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_udemy - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_edx - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_mdn - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_treehouse - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_computerphile - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: 
true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_funfunfunct - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: rsrc_so - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_googledevs - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_devtips - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_simpleprog - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_lvluptuts - verbose_name: null - is_dttm: false - is_active: null - type: DOUBLE PRECISION - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: time_start - verbose_name: null - is_dttm: true - is_active: null - type: DATETIME - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: time_total_sec - verbose_name: null - is_dttm: false - is_active: null - type: BIGINT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: months_programming - verbose_name: null - is_dttm: false - is_active: null - type: BIGINT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: age - verbose_name: 
null - is_dttm: false - is_active: null - type: BIGINT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: ID - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: reasons_to_code_other - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: lang_at_home - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: when_appl_job - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: reasons_to_code - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: live_city_population - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_lctn_pref - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_intr_other - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: curr_employment_other - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: 
marital_status - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: bootcamp_name - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: podcast_other - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: school_major - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: job_pref - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: country_citizen - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: school_degree - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: codeevnt_other - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: curr_field - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: communite_time - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: 
rsrc_other - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: country_live - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: curr_employment - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: gender_other - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: time_end - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: network_id - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: yt_other - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null -- column_name: gender - verbose_name: null - is_dttm: false - is_active: null - type: TEXT - groupby: true - filterable: true - expression: null - description: null - python_date_format: null + - column_name: highest_degree_earned + verbose_name: Highest Degree Earned + is_dttm: false + is_active: null + type: STRING + groupby: true + filterable: true + expression: + "CASE \n WHEN school_degree = 'no high school (secondary school)'\ + \ THEN 'A. No high school (secondary school)'\n WHEN school_degree = 'some\ + \ high school' THEN 'B. 
Some high school'\n WHEN school_degree = 'high school\ + \ diploma or equivalent (GED)' THEN 'C. High school diploma or equivalent (GED)'\ + \n WHEN school_degree = 'associate''s degree' THEN 'D. Associate''s degree'\ + \n WHEN school_degree = 'some college credit, no degree' THEN 'E. Some college\ + \ credit, no degree'\n WHEN school_degree = 'bachelor''s degree' THEN 'F.\ + \ Bachelor''s degree'\n WHEN school_degree = 'trade, technical, or vocational\ + \ training' THEN 'G. Trade, technical, or vocational training'\n WHEN school_degree\ + \ = 'master''s degree (non-professional)' THEN 'H. Master''s degree (non-professional)'\ + \n WHEN school_degree = 'Ph.D.' THEN 'I. Ph.D.'\n WHEN school_degree = '\ + professional degree (MBA, MD, JD, etc.)' THEN 'J. Professional degree (MBA,\ + \ MD, JD, etc.)'\nEND" + description: Highest Degree Earned + python_date_format: null + - column_name: job_location_preference + verbose_name: Job Location Preference + is_dttm: false + is_active: null + type: null + groupby: true + filterable: true + expression: + "case \nwhen job_lctn_pref is Null then 'No Answer' \nwhen job_lctn_pref\ + \ = 'from home' then 'From Home'\nwhen job_lctn_pref = 'no preference' then 'No\ + \ Preference'\nwhen job_lctn_pref = 'in an office with other developers' then\ + \ 'In an Office (with Other Developers)'\nelse job_lctn_pref\nend " + description: null + python_date_format: null + - column_name: ethnic_minority + verbose_name: Ethnic Minority + is_dttm: null + is_active: null + type: STRING + groupby: true + filterable: true + expression: + "CASE \nWHEN is_ethnic_minority = 0 THEN 'No, not an ethnic minority'\ + \ \nWHEN is_ethnic_minority = 1 THEN 'Yes, an ethnic minority' \nELSE 'No Answer'\n\ + END" + description: null + python_date_format: null + - column_name: willing_to_relocate + verbose_name: Willing To Relocate + is_dttm: false + is_active: null + type: STRING + groupby: true + filterable: true + expression: + "CASE \nWHEN job_relocate = 0 
THEN 'No: Not Willing to' \nWHEN job_relocate\ + \ = 1 THEN 'Yes: Willing To'\nELSE 'No Answer'\nEND" + description: null + python_date_format: null + - column_name: developer_type + verbose_name: Developer Type + is_dttm: false + is_active: null + type: STRING + groupby: true + filterable: true + expression: + CASE WHEN is_software_dev = 0 THEN 'Aspiring Developer' WHEN is_software_dev + = 1 THEN 'Currently A Developer' END + description: null + python_date_format: null + - column_name: first_time_developer + verbose_name: First Time Developer + is_dttm: false + is_active: null + type: null + groupby: true + filterable: true + expression: + "CASE \nWHEN is_first_dev_job = 0 THEN 'No' \nWHEN is_first_dev_job\ + \ = 1 THEN 'Yes' \nELSE 'No Answer'\nEND" + description: null + python_date_format: null + - column_name: gender + verbose_name: null + is_dttm: null + is_active: null + type: STRING + groupby: true + filterable: true + expression: + "CASE \nWHEN gender = 'Male' THEN 'Male'\nWHEN gender = 'Female' THEN\ + \ 'Female'\nELSE 'Prefer Not to Say'\nEND" + description: null + python_date_format: null + - column_name: calc_first_time_dev + verbose_name: null + is_dttm: false + is_active: null + type: STRING + groupby: true + filterable: true + expression: + CASE WHEN is_first_dev_job = 0 THEN 'No' WHEN is_first_dev_job = 1 THEN + 'Yes' END + description: null + python_date_format: null + - column_name: yt_codingtuts360 + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: is_recv_disab_bnft + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_intr_qa_engn + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + 
expression: null + description: null + python_date_format: null + - column_name: has_high_spd_ntnet + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: is_first_dev_job + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_intr_ux_engn + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: bootcamp_have_loan + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_js_jabber + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_intr_datasci + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_intr_dataengn + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_khan_acdm + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: has_finance_depends + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + 
python_date_format: null + - column_name: has_served_military + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_intr_backend + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_intr_teacher + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: months_job_search + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: student_debt_has + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: student_debt_amt + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_intr_gamedev + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_code_wars + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: do_finance_support + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: 
last_yr_income + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: is_software_dev + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: money_for_learning + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: home_mrtg_has + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_intr_mobile + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_intr_infosec + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_intr_fllstck + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_intr_frntend + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_intr_devops + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_intr_projm + verbose_name: null + is_dttm: false + 
is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_css_tricks + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: yt_cs_dojo + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: is_ethnic_minority + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: yt_mit_ocw + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: is_self_employed + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: home_mrtg_owe + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: yt_engn_truth + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: bootcamp_attend + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: yt_derekbanas + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true 
+ expression: null + description: null + python_date_format: null + - column_name: yt_learncodeacdm + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_changelog + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_hackerrank + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_devtea + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_sedaily + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_seradio + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_gamejam + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_geekspeak + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_talkpythonme + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + 
python_date_format: null + - column_name: podcast_hanselmnts + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_syntaxfm + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_shoptalk + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: yt_mozillahacks + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_codingblcks + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_codenewbie + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: bootcamp_recommend + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_railsbrdg + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: bootcamp_finished + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: 
podcast_rubyrogues + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_relocate + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: debt_amt + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_codeacdm + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_fcc + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_codepenrd + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_fullstckrd + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_hackthn + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_udacity + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_ltcwm + verbose_name: null + is_dttm: false + is_active: null + 
type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_coursera + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_djangogrls + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_startupwknd + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_progthrwdwn + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: expected_earn + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_egghead + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_railsgrls + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: has_children + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_frnthppyhr + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: 
true + expression: null + description: null + python_date_format: null + - column_name: yt_codingtrain + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_lynda + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: hours_learning + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: yt_simplilearn + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_wkndbtcmp + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_fcc + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_fcc + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_coderdojo + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_nodeschl + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - 
column_name: codeevnt_womenwc + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_confs + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: yt_fcc + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_girldevit + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_meetup + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_workshps + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_frntendmstr + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: num_children + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_udemy + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_edx + verbose_name: null + is_dttm: false + is_active: 
null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_mdn + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_treehouse + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: yt_computerphile + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: yt_funfunfunct + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_so + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: yt_googledevs + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: yt_devtips + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: yt_simpleprog + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: yt_lvluptuts + verbose_name: null + is_dttm: false + is_active: null + type: DOUBLE PRECISION + groupby: true + filterable: true + expression: null + 
description: null + python_date_format: null + - column_name: time_start + verbose_name: null + is_dttm: true + is_active: null + type: DATETIME + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: time_total_sec + verbose_name: null + is_dttm: false + is_active: null + type: BIGINT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: months_programming + verbose_name: null + is_dttm: false + is_active: null + type: BIGINT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: age + verbose_name: null + is_dttm: false + is_active: null + type: BIGINT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: ID + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: reasons_to_code_other + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: lang_at_home + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: when_appl_job + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: reasons_to_code + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: live_city_population + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + 
expression: null + description: null + python_date_format: null + - column_name: job_lctn_pref + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_intr_other + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: marital_status + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: bootcamp_name + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: podcast_other + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: school_major + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: job_pref + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: country_citizen + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: school_degree + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: codeevnt_other + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + 
filterable: true + expression: null + description: null + python_date_format: null + - column_name: curr_field + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: communite_time + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: rsrc_other + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: country_live + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: gender_other + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: time_end + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: network_id + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: yt_other + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null + - column_name: gender + verbose_name: null + is_dttm: false + is_active: null + type: TEXT + groupby: true + filterable: true + expression: null + description: null + python_date_format: null version: 1.0.0 database_uuid: a2dc77af-e654-49bb-b321-40f6b559a1ee data: 
https://github.com/apache-superset/examples-data/raw/master/datasets/examples/fcc_survey_2018.csv.gz diff --git a/superset/examples/country_map.py b/superset/examples/country_map.py index 4331033ca8369..3caf637584633 100644 --- a/superset/examples/country_map.py +++ b/superset/examples/country_map.py @@ -80,13 +80,13 @@ def load_country_map_data(only_metadata: bool = False, force: bool = False) -> N obj = db.session.query(table).filter_by(table_name=tbl_name).first() if not obj: obj = table(table_name=tbl_name, schema=schema) + db.session.add(obj) obj.main_dttm_col = "dttm" obj.database = database obj.filter_select_enabled = True if not any(col.metric_name == "avg__2004" for col in obj.metrics): col = str(column("2004").compile(db.engine)) obj.metrics.append(SqlMetric(metric_name="avg__2004", expression=f"AVG({col})")) - db.session.merge(obj) db.session.commit() obj.fetch_metadata() tbl = obj diff --git a/superset/examples/css_templates.py b/superset/examples/css_templates.py index 4f3f355895ef9..2f67d2e1faac9 100644 --- a/superset/examples/css_templates.py +++ b/superset/examples/css_templates.py @@ -27,6 +27,7 @@ def load_css_templates() -> None: obj = db.session.query(CssTemplate).filter_by(template_name="Flat").first() if not obj: obj = CssTemplate(template_name="Flat") + db.session.add(obj) css = textwrap.dedent( """\ .navbar { @@ -51,12 +52,12 @@ def load_css_templates() -> None: """ ) obj.css = css - db.session.merge(obj) db.session.commit() obj = db.session.query(CssTemplate).filter_by(template_name="Courier Black").first() if not obj: obj = CssTemplate(template_name="Courier Black") + db.session.add(obj) css = textwrap.dedent( """\ h2 { @@ -96,5 +97,4 @@ def load_css_templates() -> None: """ ) obj.css = css - db.session.merge(obj) db.session.commit() diff --git a/superset/examples/deck.py b/superset/examples/deck.py index fc1e8ba00cd6c..326977054e87a 100644 --- a/superset/examples/deck.py +++ b/superset/examples/deck.py @@ -532,6 +532,7 @@ def 
load_deck_dash() -> None: # pylint: disable=too-many-statements if not dash: dash = Dashboard() + db.session.add(dash) dash.published = True js = POSITION_JSON pos = json.loads(js) @@ -540,5 +541,4 @@ def load_deck_dash() -> None: # pylint: disable=too-many-statements dash.dashboard_title = title dash.slug = slug dash.slices = slices - db.session.merge(dash) db.session.commit() diff --git a/superset/examples/energy.py b/superset/examples/energy.py index 6688e5d08844d..998ee97a30df2 100644 --- a/superset/examples/energy.py +++ b/superset/examples/energy.py @@ -66,6 +66,7 @@ def load_energy( tbl = db.session.query(table).filter_by(table_name=tbl_name).first() if not tbl: tbl = table(table_name=tbl_name, schema=schema) + db.session.add(tbl) tbl.description = "Energy consumption" tbl.database = database tbl.filter_select_enabled = True @@ -76,7 +77,6 @@ def load_energy( SqlMetric(metric_name="sum__value", expression=f"SUM({col})") ) - db.session.merge(tbl) db.session.commit() tbl.fetch_metadata() diff --git a/superset/examples/flights.py b/superset/examples/flights.py index 7c8f9802988bd..c7890cfa18d39 100644 --- a/superset/examples/flights.py +++ b/superset/examples/flights.py @@ -63,10 +63,10 @@ def load_flights(only_metadata: bool = False, force: bool = False) -> None: tbl = db.session.query(table).filter_by(table_name=tbl_name).first() if not tbl: tbl = table(table_name=tbl_name, schema=schema) + db.session.add(tbl) tbl.description = "Random set of flights in the US" tbl.database = database tbl.filter_select_enabled = True - db.session.merge(tbl) db.session.commit() tbl.fetch_metadata() print("Done loading table!") diff --git a/superset/examples/long_lat.py b/superset/examples/long_lat.py index 88b45548f48dc..6f7cc64020d33 100644 --- a/superset/examples/long_lat.py +++ b/superset/examples/long_lat.py @@ -92,10 +92,10 @@ def load_long_lat_data(only_metadata: bool = False, force: bool = False) -> None obj = 
db.session.query(table).filter_by(table_name=tbl_name).first() if not obj: obj = table(table_name=tbl_name, schema=schema) + db.session.add(obj) obj.main_dttm_col = "datetime" obj.database = database obj.filter_select_enabled = True - db.session.merge(obj) db.session.commit() obj.fetch_metadata() tbl = obj diff --git a/superset/examples/misc_dashboard.py b/superset/examples/misc_dashboard.py index 4146ea1bd38f0..aa8d03749550a 100644 --- a/superset/examples/misc_dashboard.py +++ b/superset/examples/misc_dashboard.py @@ -34,40 +34,26 @@ def load_misc_dashboard() -> None: if not dash: dash = Dashboard() + db.session.add(dash) js = textwrap.dedent( """\ { - "CHART-BkeVbh8ANQ": { - "children": [], - "id": "CHART-BkeVbh8ANQ", - "meta": { - "chartId": 4004, - "height": 34, - "sliceName": "Multi Line", - "width": 8 - }, - "type": "CHART" - }, - "CHART-H1HYNzEANX": { - "children": [], - "id": "CHART-H1HYNzEANX", - "meta": { - "chartId": 3940, - "height": 50, - "sliceName": "Energy Sankey", - "width": 6 - }, - "type": "CHART" - }, "CHART-HJOYVMV0E7": { "children": [], "id": "CHART-HJOYVMV0E7", "meta": { "chartId": 3969, - "height": 63, + "height": 69, "sliceName": "Mapbox Long/Lat", - "width": 6 + "uuid": "164efe31-295b-4408-aaa6-2f4bfb58a212", + "width": 4 }, + "parents": [ + "ROOT_ID", + "GRID_ID", + "ROW-S1MK4M4A4X", + "COLUMN-ByUFVf40EQ" + ], "type": "CHART" }, "CHART-S1WYNz4AVX": { @@ -75,32 +61,16 @@ def load_misc_dashboard() -> None: "id": "CHART-S1WYNz4AVX", "meta": { "chartId": 3989, - "height": 25, + "height": 69, "sliceName": "Parallel Coordinates", + "uuid": "e84f7e74-031a-47bb-9f80-ae0694dcca48", "width": 4 }, - "type": "CHART" - }, - "CHART-r19KVMNCE7": { - "children": [], - "id": "CHART-r19KVMNCE7", - "meta": { - "chartId": 3971, - "height": 34, - "sliceName": "Calendar Heatmap multiformat 0", - "width": 4 - }, - "type": "CHART" - }, - "CHART-rJ4K4GV04Q": { - "children": [], - "id": "CHART-rJ4K4GV04Q", - "meta": { - "chartId": 3941, - "height": 63, - 
"sliceName": "Energy Force Layout", - "width": 6 - }, + "parents": [ + "ROOT_ID", + "GRID_ID", + "ROW-SytNzNA4X" + ], "type": "CHART" }, "CHART-rkgF4G4A4X": { @@ -108,54 +78,27 @@ def load_misc_dashboard() -> None: "id": "CHART-rkgF4G4A4X", "meta": { "chartId": 3970, - "height": 25, + "height": 69, "sliceName": "Birth in France by department in 2016", - "width": 8 - }, - "type": "CHART" - }, - "CHART-rywK4GVR4X": { - "children": [], - "id": "CHART-rywK4GVR4X", - "meta": { - "chartId": 3942, - "height": 50, - "sliceName": "Heatmap", - "width": 6 - }, - "type": "CHART" - }, - "COLUMN-ByUFVf40EQ": { - "children": [ - "CHART-rywK4GVR4X", - "CHART-HJOYVMV0E7" - ], - "id": "COLUMN-ByUFVf40EQ", - "meta": { - "background": "BACKGROUND_TRANSPARENT", - "width": 6 + "uuid": "54583ae9-c99a-42b5-a906-7ee2adfe1fb1", + "width": 4 }, - "type": "COLUMN" - }, - "COLUMN-rkmYVGN04Q": { - "children": [ - "CHART-rJ4K4GV04Q", - "CHART-H1HYNzEANX" + "parents": [ + "ROOT_ID", + "GRID_ID", + "ROW-SytNzNA4X" ], - "id": "COLUMN-rkmYVGN04Q", - "meta": { - "background": "BACKGROUND_TRANSPARENT", - "width": 6 - }, - "type": "COLUMN" + "type": "CHART" }, + "DASHBOARD_VERSION_KEY": "v2", "GRID_ID": { "children": [ - "ROW-SytNzNA4X", - "ROW-S1MK4M4A4X", - "ROW-HkFFEzVRVm" + "ROW-SytNzNA4X" ], "id": "GRID_ID", + "parents": [ + "ROOT_ID" + ], "type": "GRID" }, "HEADER_ID": { @@ -172,40 +115,22 @@ def load_misc_dashboard() -> None: "id": "ROOT_ID", "type": "ROOT" }, - "ROW-HkFFEzVRVm": { - "children": [ - "CHART-r19KVMNCE7", - "CHART-BkeVbh8ANQ" - ], - "id": "ROW-HkFFEzVRVm", - "meta": { - "background": "BACKGROUND_TRANSPARENT" - }, - "type": "ROW" - }, - "ROW-S1MK4M4A4X": { - "children": [ - "COLUMN-rkmYVGN04Q", - "COLUMN-ByUFVf40EQ" - ], - "id": "ROW-S1MK4M4A4X", - "meta": { - "background": "BACKGROUND_TRANSPARENT" - }, - "type": "ROW" - }, "ROW-SytNzNA4X": { "children": [ "CHART-rkgF4G4A4X", - "CHART-S1WYNz4AVX" + "CHART-S1WYNz4AVX", + "CHART-HJOYVMV0E7" ], "id": "ROW-SytNzNA4X", "meta": { 
"background": "BACKGROUND_TRANSPARENT" }, + "parents": [ + "ROOT_ID", + "GRID_ID" + ], "type": "ROW" - }, - "DASHBOARD_VERSION_KEY": "v2" + } } """ ) @@ -215,5 +140,4 @@ def load_misc_dashboard() -> None: dash.position_json = json.dumps(pos, indent=4) dash.slug = DASH_SLUG dash.slices = slices - db.session.merge(dash) db.session.commit() diff --git a/superset/examples/multiformat_time_series.py b/superset/examples/multiformat_time_series.py index 6bad2a7ac252b..4c1e79631648e 100644 --- a/superset/examples/multiformat_time_series.py +++ b/superset/examples/multiformat_time_series.py @@ -82,6 +82,7 @@ def load_multiformat_time_series( # pylint: disable=too-many-locals obj = db.session.query(table).filter_by(table_name=tbl_name).first() if not obj: obj = table(table_name=tbl_name, schema=schema) + db.session.add(obj) obj.main_dttm_col = "ds" obj.database = database obj.filter_select_enabled = True @@ -100,7 +101,6 @@ def load_multiformat_time_series( # pylint: disable=too-many-locals col.python_date_format = dttm_and_expr[0] col.database_expression = dttm_and_expr[1] col.is_dttm = True - db.session.merge(obj) db.session.commit() obj.fetch_metadata() tbl = obj diff --git a/superset/examples/paris.py b/superset/examples/paris.py index 1180c428feb21..fa5c77b84d1a6 100644 --- a/superset/examples/paris.py +++ b/superset/examples/paris.py @@ -57,9 +57,9 @@ def load_paris_iris_geojson(only_metadata: bool = False, force: bool = False) -> tbl = db.session.query(table).filter_by(table_name=tbl_name).first() if not tbl: tbl = table(table_name=tbl_name, schema=schema) + db.session.add(tbl) tbl.description = "Map of Paris" tbl.database = database tbl.filter_select_enabled = True - db.session.merge(tbl) db.session.commit() tbl.fetch_metadata() diff --git a/superset/examples/random_time_series.py b/superset/examples/random_time_series.py index 9a296ec2c4713..4a2d10aee9b92 100644 --- a/superset/examples/random_time_series.py +++ b/superset/examples/random_time_series.py @@ -67,10 
+67,10 @@ def load_random_time_series_data( obj = db.session.query(table).filter_by(table_name=tbl_name).first() if not obj: obj = table(table_name=tbl_name, schema=schema) + db.session.add(obj) obj.main_dttm_col = "ds" obj.database = database obj.filter_select_enabled = True - db.session.merge(obj) db.session.commit() obj.fetch_metadata() tbl = obj diff --git a/superset/examples/sf_population_polygons.py b/superset/examples/sf_population_polygons.py index 76c039afb88a3..ba5905f58a924 100644 --- a/superset/examples/sf_population_polygons.py +++ b/superset/examples/sf_population_polygons.py @@ -59,9 +59,9 @@ def load_sf_population_polygons( tbl = db.session.query(table).filter_by(table_name=tbl_name).first() if not tbl: tbl = table(table_name=tbl_name, schema=schema) + db.session.add(tbl) tbl.description = "Population density of San Francisco" tbl.database = database tbl.filter_select_enabled = True - db.session.merge(tbl) db.session.commit() tbl.fetch_metadata() diff --git a/superset/examples/tabbed_dashboard.py b/superset/examples/tabbed_dashboard.py index 58c0ba3e4c063..b05726334565a 100644 --- a/superset/examples/tabbed_dashboard.py +++ b/superset/examples/tabbed_dashboard.py @@ -33,6 +33,7 @@ def load_tabbed_dashboard(_: bool = False) -> None: if not dash: dash = Dashboard() + db.session.add(dash) js = textwrap.dedent( """ @@ -556,6 +557,4 @@ def load_tabbed_dashboard(_: bool = False) -> None: dash.slices = slices dash.dashboard_title = "Tabbed Dashboard" dash.slug = slug - - db.session.merge(dash) db.session.commit() diff --git a/superset/examples/world_bank.py b/superset/examples/world_bank.py index 31d956f5fde82..1541e3e4724a1 100644 --- a/superset/examples/world_bank.py +++ b/superset/examples/world_bank.py @@ -24,14 +24,8 @@ import superset.utils.database from superset import app, db -from superset.connectors.sqla.models import SqlMetric -from superset.models.dashboard import Dashboard -from superset.models.slice import Slice -from superset.utils import 
core as utils -from superset.utils.core import DatasourceType - -from ..connectors.base.models import BaseDatasource -from .helpers import ( +from superset.connectors.sqla.models import BaseDatasource, SqlMetric +from superset.examples.helpers import ( get_example_url, get_examples_folder, get_slice_json, @@ -40,6 +34,10 @@ misc_dash_slices, update_slice_ids, ) +from superset.models.dashboard import Dashboard +from superset.models.slice import Slice +from superset.utils import core as utils +from superset.utils.core import DatasourceType def load_world_bank_health_n_pop( # pylint: disable=too-many-locals, too-many-statements @@ -87,6 +85,7 @@ def load_world_bank_health_n_pop( # pylint: disable=too-many-locals, too-many-s tbl = db.session.query(table).filter_by(table_name=tbl_name).first() if not tbl: tbl = table(table_name=tbl_name, schema=schema) + db.session.add(tbl) tbl.description = utils.readfile( os.path.join(get_examples_folder(), "countries.md") ) @@ -110,7 +109,6 @@ def load_world_bank_health_n_pop( # pylint: disable=too-many-locals, too-many-s SqlMetric(metric_name=metric, expression=f"{aggr_func}({col})") ) - db.session.merge(tbl) db.session.commit() tbl.fetch_metadata() @@ -126,6 +124,7 @@ def load_world_bank_health_n_pop( # pylint: disable=too-many-locals, too-many-s if not dash: dash = Dashboard() + db.session.add(dash) dash.published = True pos = dashboard_positions slices = update_slice_ids(pos) @@ -134,7 +133,6 @@ def load_world_bank_health_n_pop( # pylint: disable=too-many-locals, too-many-s dash.position_json = json.dumps(pos, indent=4) dash.slug = slug dash.slices = slices - db.session.merge(dash) db.session.commit() @@ -169,35 +167,6 @@ def create_slices(tbl: BaseDatasource) -> list[Slice]: } return [ - Slice( - slice_name="Region Filter", - viz_type="filter_box", - datasource_type=DatasourceType.TABLE, - datasource_id=tbl.id, - params=get_slice_json( - defaults, - viz_type="filter_box", - date_filter=False, - filter_configs=[ - { - "asc": 
False, - "clearable": True, - "column": "region", - "key": "2s98dfu", - "metric": "sum__SP_POP_TOTL", - "multiple": False, - }, - { - "asc": False, - "clearable": True, - "key": "li3j2lk", - "column": "country_name", - "metric": "sum__SP_POP_TOTL", - "multiple": True, - }, - ], - ), - ), Slice( slice_name="World's Population", viz_type="big_number", @@ -374,18 +343,12 @@ def create_slices(tbl: BaseDatasource) -> list[Slice]: dashboard_positions = { - "CHART-36bfc934": { - "children": [], - "id": "CHART-36bfc934", - "meta": {"chartId": 40, "height": 25, "sliceName": "Region Filter", "width": 2}, - "type": "CHART", - }, "CHART-37982887": { "children": [], "id": "CHART-37982887", "meta": { "chartId": 41, - "height": 25, + "height": 52, "sliceName": "World's Population", "width": 2, }, @@ -466,7 +429,7 @@ def create_slices(tbl: BaseDatasource) -> list[Slice]: "type": "COLUMN", }, "COLUMN-fe3914b8": { - "children": ["CHART-36bfc934", "CHART-37982887"], + "children": ["CHART-37982887"], "id": "COLUMN-fe3914b8", "meta": {"background": "BACKGROUND_TRANSPARENT", "width": 2}, "type": "COLUMN", diff --git a/superset/explore/api.py b/superset/explore/api.py index ebda161beadb9..faadbe8d9ad79 100644 --- a/superset/explore/api.py +++ b/superset/explore/api.py @@ -19,18 +19,18 @@ from flask import g, request, Response from flask_appbuilder.api import expose, protect, safe -from superset.charts.commands.exceptions import ChartNotFoundError +from superset.commands.chart.exceptions import ChartNotFoundError +from superset.commands.explore.get import GetExploreCommand +from superset.commands.explore.parameters import CommandParameters +from superset.commands.temporary_cache.exceptions import ( + TemporaryCacheAccessDeniedError, + TemporaryCacheResourceNotFoundError, +) from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP -from superset.explore.commands.get import GetExploreCommand -from superset.explore.commands.parameters import CommandParameters from 
superset.explore.exceptions import DatasetAccessDeniedError, WrongEndpointError from superset.explore.permalink.exceptions import ExplorePermalinkGetFailedError from superset.explore.schemas import ExploreContextSchema from superset.extensions import event_logger -from superset.temporary_cache.commands.exceptions import ( - TemporaryCacheAccessDeniedError, - TemporaryCacheResourceNotFoundError, -) from superset.views.base_api import BaseSupersetApi, statsd_metrics logger = logging.getLogger(__name__) diff --git a/superset/explore/form_data/api.py b/superset/explore/form_data/api.py index 36489ca44974d..6c882d92a6fe6 100644 --- a/superset/explore/form_data/api.py +++ b/superset/explore/form_data/api.py @@ -20,18 +20,18 @@ from flask_appbuilder.api import expose, protect, safe from marshmallow import ValidationError -from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP -from superset.explore.form_data.commands.create import CreateFormDataCommand -from superset.explore.form_data.commands.delete import DeleteFormDataCommand -from superset.explore.form_data.commands.get import GetFormDataCommand -from superset.explore.form_data.commands.parameters import CommandParameters -from superset.explore.form_data.commands.update import UpdateFormDataCommand -from superset.explore.form_data.schemas import FormDataPostSchema, FormDataPutSchema -from superset.extensions import event_logger -from superset.temporary_cache.commands.exceptions import ( +from superset.commands.explore.form_data.create import CreateFormDataCommand +from superset.commands.explore.form_data.delete import DeleteFormDataCommand +from superset.commands.explore.form_data.get import GetFormDataCommand +from superset.commands.explore.form_data.parameters import CommandParameters +from superset.commands.explore.form_data.update import UpdateFormDataCommand +from superset.commands.temporary_cache.exceptions import ( TemporaryCacheAccessDeniedError, TemporaryCacheResourceNotFoundError, ) +from 
superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP +from superset.explore.form_data.schemas import FormDataPostSchema, FormDataPutSchema +from superset.extensions import event_logger from superset.views.base_api import BaseSupersetApi, requires_json, statsd_metrics logger = logging.getLogger(__name__) diff --git a/superset/explore/permalink/api.py b/superset/explore/permalink/api.py index b249d4dee2384..bc9bd1cf67a21 100644 --- a/superset/explore/permalink/api.py +++ b/superset/explore/permalink/api.py @@ -20,17 +20,17 @@ from flask_appbuilder.api import expose, protect, safe from marshmallow import ValidationError -from superset.charts.commands.exceptions import ( +from superset.commands.chart.exceptions import ( ChartAccessDeniedError, ChartNotFoundError, ) -from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP -from superset.datasets.commands.exceptions import ( +from superset.commands.dataset.exceptions import ( DatasetAccessDeniedError, DatasetNotFoundError, ) -from superset.explore.permalink.commands.create import CreateExplorePermalinkCommand -from superset.explore.permalink.commands.get import GetExplorePermalinkCommand +from superset.commands.explore.permalink.create import CreateExplorePermalinkCommand +from superset.commands.explore.permalink.get import GetExplorePermalinkCommand +from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP from superset.explore.permalink.exceptions import ExplorePermalinkInvalidStateError from superset.explore.permalink.schemas import ExplorePermalinkStateSchema from superset.extensions import event_logger diff --git a/superset/explore/utils.py b/superset/explore/utils.py index ca73cb39fb481..7d5c0d86be3d1 100644 --- a/superset/explore/utils.py +++ b/superset/explore/utils.py @@ -17,10 +17,14 @@ from typing import Optional from superset import security_manager -from superset.charts.commands.exceptions import ( +from superset.commands.chart.exceptions import ( ChartAccessDeniedError, 
ChartNotFoundError, ) +from superset.commands.dataset.exceptions import ( + DatasetAccessDeniedError, + DatasetNotFoundError, +) from superset.commands.exceptions import ( DatasourceNotFoundValidationError, DatasourceTypeInvalidError, @@ -29,10 +33,6 @@ from superset.daos.chart import ChartDAO from superset.daos.dataset import DatasetDAO from superset.daos.query import QueryDAO -from superset.datasets.commands.exceptions import ( - DatasetAccessDeniedError, - DatasetNotFoundError, -) from superset.utils.core import DatasourceType diff --git a/superset/extensions/metadb.py b/superset/extensions/metadb.py index 5b014b7af6642..bdfe1ae1e7cbb 100644 --- a/superset/extensions/metadb.py +++ b/superset/extensions/metadb.py @@ -38,6 +38,7 @@ from __future__ import annotations import datetime +import decimal import operator import urllib.parse from collections.abc import Iterator @@ -49,7 +50,6 @@ from shillelagh.backends.apsw.dialects.base import APSWDialect from shillelagh.exceptions import ProgrammingError from shillelagh.fields import ( - Blob, Boolean, Date, DateTime, @@ -86,7 +86,7 @@ class SupersetAPSWDialect(APSWDialect): Queries can also join data across different Superset databases. - The dialect is built in top of the shillelagh library, leveraging SQLite to + The dialect is built in top of the Shillelagh library, leveraging SQLite to create virtual tables on-the-fly proxying Superset tables. The `SupersetShillelaghAdapter` adapter is responsible for returning data when a Superset table is accessed. @@ -164,11 +164,32 @@ class Duration(Field[datetime.timedelta, datetime.timedelta]): db_api_type = "DATETIME" +class Decimal(Field[decimal.Decimal, decimal.Decimal]): + """ + Shillelagh field used for representing decimals. + """ + + type = "DECIMAL" + db_api_type = "NUMBER" + + +class FallbackField(Field[Any, str]): + """ + Fallback field for unknown types; converts to string. 
+ """ + + type = "TEXT" + db_api_type = "STRING" + + def parse(self, value: Any) -> str | None: + return value if value is None else str(value) + + # pylint: disable=too-many-instance-attributes class SupersetShillelaghAdapter(Adapter): """ - A shillelagh adapter for Superset tables. + A Shillelagh adapter for Superset tables. Shillelagh adapters are responsible for fetching data from a given resource, allowing it to be represented as a virtual table in SQLite. This one works @@ -190,6 +211,7 @@ class SupersetShillelaghAdapter(Adapter): datetime.datetime: DateTime, datetime.time: Time, datetime.timedelta: Duration, + decimal.Decimal: Decimal, } @staticmethod @@ -268,7 +290,7 @@ def get_field(cls, python_type: Any) -> Field: """ Convert a Python type into a Shillelagh field. """ - class_ = cls.type_map.get(python_type, Blob) + class_ = cls.type_map.get(python_type, FallbackField) return class_(filters=[Equal, Range], order=Order.ANY, exact=True) def _set_columns(self) -> None: diff --git a/superset/extensions/metastore_cache.py b/superset/extensions/metastore_cache.py index b6effdfe91797..435c38ced8e67 100644 --- a/superset/extensions/metastore_cache.py +++ b/superset/extensions/metastore_cache.py @@ -71,7 +71,7 @@ def get_key(self, key: str) -> UUID: @staticmethod def _prune() -> None: # pylint: disable=import-outside-toplevel - from superset.key_value.commands.delete_expired import ( + from superset.commands.key_value.delete_expired import ( DeleteExpiredKeyValueCommand, ) @@ -85,7 +85,7 @@ def _get_expiry(self, timeout: Optional[int]) -> Optional[datetime]: def set(self, key: str, value: Any, timeout: Optional[int] = None) -> bool: # pylint: disable=import-outside-toplevel - from superset.key_value.commands.upsert import UpsertKeyValueCommand + from superset.commands.key_value.upsert import UpsertKeyValueCommand UpsertKeyValueCommand( resource=RESOURCE, @@ -98,7 +98,7 @@ def set(self, key: str, value: Any, timeout: Optional[int] = None) -> bool: def add(self, 
key: str, value: Any, timeout: Optional[int] = None) -> bool: # pylint: disable=import-outside-toplevel - from superset.key_value.commands.create import CreateKeyValueCommand + from superset.commands.key_value.create import CreateKeyValueCommand try: CreateKeyValueCommand( @@ -115,7 +115,7 @@ def add(self, key: str, value: Any, timeout: Optional[int] = None) -> bool: def get(self, key: str) -> Any: # pylint: disable=import-outside-toplevel - from superset.key_value.commands.get import GetKeyValueCommand + from superset.commands.key_value.get import GetKeyValueCommand return GetKeyValueCommand( resource=RESOURCE, @@ -131,6 +131,6 @@ def has(self, key: str) -> bool: def delete(self, key: str) -> Any: # pylint: disable=import-outside-toplevel - from superset.key_value.commands.delete import DeleteKeyValueCommand + from superset.commands.key_value.delete import DeleteKeyValueCommand return DeleteKeyValueCommand(resource=RESOURCE, key=self.get_key(key)).run() diff --git a/superset/jinja_context.py b/superset/jinja_context.py index c159a667ee4ed..3b046b732e30b 100644 --- a/superset/jinja_context.py +++ b/superset/jinja_context.py @@ -17,9 +17,11 @@ """Defines the templating context for SQL Lab""" import json import re +from datetime import datetime from functools import lru_cache, partial from typing import Any, Callable, cast, Optional, TYPE_CHECKING, TypedDict, Union +import dateutil from flask import current_app, g, has_request_context, request from flask_babel import gettext as _ from jinja2 import DebugUndefined @@ -28,8 +30,8 @@ from sqlalchemy.sql.expression import bindparam from sqlalchemy.types import String +from superset.commands.dataset.exceptions import DatasetNotFoundError from superset.constants import LRU_CACHE_MAX_SIZE -from superset.datasets.commands.exceptions import DatasetNotFoundError from superset.exceptions import SupersetTemplateException from superset.extensions import feature_flag_manager from superset.utils.core import ( @@ -486,6 +488,19 @@ 
def process_template(self, sql: str, **kwargs: Any) -> str: class JinjaTemplateProcessor(BaseTemplateProcessor): + def _parse_datetime(self, dttm: str) -> Optional[datetime]: + """ + Try to parse a datetime and default to None in the worst case. + + Since this may have been rendered by different engines, the datetime may + vary slightly in format. We try to make it consistent, and if all else + fails, just return None. + """ + try: + return dateutil.parser.parse(dttm) + except dateutil.parser.ParserError: + return None + def set_context(self, **kwargs: Any) -> None: super().set_context(**kwargs) extra_cache = ExtraCache( @@ -494,6 +509,23 @@ def set_context(self, **kwargs: Any) -> None: removed_filters=self._removed_filters, dialect=self._database.get_dialect(), ) + + from_dttm = ( + self._parse_datetime(dttm) + if (dttm := self._context.get("from_dttm")) + else None + ) + to_dttm = ( + self._parse_datetime(dttm) + if (dttm := self._context.get("to_dttm")) + else None + ) + + dataset_macro_with_context = partial( + dataset_macro, + from_dttm=from_dttm, + to_dttm=to_dttm, + ) self._context.update( { "url_param": partial(safe_proxy, extra_cache.url_param), @@ -502,7 +534,7 @@ def set_context(self, **kwargs: Any) -> None: "cache_key_wrapper": partial(safe_proxy, extra_cache.cache_key_wrapper), "filter_values": partial(safe_proxy, extra_cache.filter_values), "get_filters": partial(safe_proxy, extra_cache.get_filters), - "dataset": partial(safe_proxy, dataset_macro), + "dataset": partial(safe_proxy, dataset_macro_with_context), } ) @@ -638,12 +670,18 @@ def dataset_macro( dataset_id: int, include_metrics: bool = False, columns: Optional[list[str]] = None, + from_dttm: Optional[datetime] = None, + to_dttm: Optional[datetime] = None, ) -> str: """ Given a dataset ID, return the SQL that represents it. The generated SQL includes all columns (including computed) by default. Optionally the user can also request metrics to be included, and columns to group by. 
+ + The from_dttm and to_dttm parameters are populated from filter values in explore + views, and are passed through so that they are available to Jinja templates in + the underlying dataset. """ # pylint: disable=import-outside-toplevel from superset.daos.dataset import DatasetDAO @@ -659,6 +697,8 @@ def dataset_macro( "filter": [], "metrics": metrics if include_metrics else None, "columns": columns, + "from_dttm": from_dttm, + "to_dttm": to_dttm, } sqla_query = dataset.get_query_str_extended(query_obj, mutate=False) sql = sqla_query.sql diff --git a/superset/key_value/shared_entries.py b/superset/key_value/shared_entries.py index 7895b759079ef..130313157a53d 100644 --- a/superset/key_value/shared_entries.py +++ b/superset/key_value/shared_entries.py @@ -28,7 +28,7 @@ def get_shared_value(key: SharedKey) -> Optional[Any]: # pylint: disable=import-outside-toplevel - from superset.key_value.commands.get import GetKeyValueCommand + from superset.commands.key_value.get import GetKeyValueCommand uuid_key = uuid3(NAMESPACE, key) return GetKeyValueCommand(RESOURCE, key=uuid_key, codec=CODEC).run() @@ -36,7 +36,7 @@ def get_shared_value(key: SharedKey) -> Optional[Any]: def set_shared_value(key: SharedKey, value: Any) -> None: # pylint: disable=import-outside-toplevel - from superset.key_value.commands.create import CreateKeyValueCommand + from superset.commands.key_value.create import CreateKeyValueCommand uuid_key = uuid3(NAMESPACE, key) CreateKeyValueCommand( diff --git a/superset/migrations/shared/migrate_viz/base.py b/superset/migrations/shared/migrate_viz/base.py index b9826fee34f3a..f9e1b9d3c92a2 100644 --- a/superset/migrations/shared/migrate_viz/base.py +++ b/superset/migrations/shared/migrate_viz/base.py @@ -123,7 +123,7 @@ def _migrate_temporal_filter(self, rv_data: dict[str, Any]) -> None: ] @classmethod - def upgrade_slice(cls, slc: Slice) -> Slice: + def upgrade_slice(cls, slc: Slice) -> None: clz = cls(slc.params) form_data_bak = copy.deepcopy(clz.data)
@@ -141,10 +141,9 @@ def upgrade_slice(cls, slc: Slice) -> Slice: if "form_data" in (query_context := try_load_json(slc.query_context)): query_context["form_data"] = clz.data slc.query_context = json.dumps(query_context) - return slc @classmethod - def downgrade_slice(cls, slc: Slice) -> Slice: + def downgrade_slice(cls, slc: Slice) -> None: form_data = try_load_json(slc.params) if "viz_type" in (form_data_bak := form_data.get(FORM_DATA_BAK_FIELD_NAME, {})): slc.params = json.dumps(form_data_bak) @@ -153,19 +152,15 @@ def downgrade_slice(cls, slc: Slice) -> Slice: if "form_data" in query_context: query_context["form_data"] = form_data_bak slc.query_context = json.dumps(query_context) - return slc @classmethod def upgrade(cls, session: Session) -> None: slices = session.query(Slice).filter(Slice.viz_type == cls.source_viz_type) for slc in paginated_update( slices, - lambda current, total: print( - f" Updating {current}/{total} charts", end="\r" - ), + lambda current, total: print(f"Upgraded {current}/{total} charts"), ): - new_viz = cls.upgrade_slice(slc) - session.merge(new_viz) + cls.upgrade_slice(slc) @classmethod def downgrade(cls, session: Session) -> None: @@ -177,9 +172,6 @@ def downgrade(cls, session: Session) -> None: ) for slc in paginated_update( slices, - lambda current, total: print( - f" Downgrading {current}/{total} charts", end="\r" - ), + lambda current, total: print(f"Downgraded {current}/{total} charts"), ): - new_viz = cls.downgrade_slice(slc) - session.merge(new_viz) + cls.downgrade_slice(slc) diff --git a/superset/migrations/shared/migrate_viz/processors.py b/superset/migrations/shared/migrate_viz/processors.py index 4ff6b2a93467e..5fbd624aa8bf9 100644 --- a/superset/migrations/shared/migrate_viz/processors.py +++ b/superset/migrations/shared/migrate_viz/processors.py @@ -16,6 +16,8 @@ # under the License. 
from typing import Any +from superset.utils.core import as_list + from .base import MigrateViz @@ -34,40 +36,6 @@ def _pre_action(self) -> None: self.data["metric"] = self.data["metrics"][0] -class MigrateAreaChart(MigrateViz): - """ - Migrate area charts. - - This migration is incomplete, see https://github.com/apache/superset/pull/24703#discussion_r1265222611 - for more details. If you fix this migration, please update the ``migrate_chart`` - function in ``superset/charts/commands/importers/v1/utils.py`` so that it gets - applied in chart imports. - """ - - source_viz_type = "area" - target_viz_type = "echarts_area" - remove_keys = {"contribution", "stacked_style", "x_axis_label"} - - def _pre_action(self) -> None: - if self.data.get("contribution"): - self.data["contributionMode"] = "row" - - if stacked := self.data.get("stacked_style"): - stacked_map = { - "expand": "Expand", - "stack": "Stack", - } - self.data["show_extra_controls"] = True - self.data["stack"] = stacked_map.get(stacked) - - if x_axis := self.data.get("granularity_sqla"): - self.data["x_axis"] = x_axis - - if x_axis_label := self.data.get("x_axis_label"): - self.data["x_axis_title"] = x_axis_label - self.data["x_axis_title_margin"] = 30 - - class MigratePivotTable(MigrateViz): source_viz_type = "pivot_table" target_viz_type = "pivot_table_v2" @@ -131,3 +99,117 @@ class MigrateSunburst(MigrateViz): source_viz_type = "sunburst" target_viz_type = "sunburst_v2" rename_keys = {"groupby": "columns"} + + +class TimeseriesChart(MigrateViz): + has_x_axis_control = True + rename_keys = { + "bottom_margin": "x_axis_title_margin", + "left_margin": "y_axis_title_margin", + "show_controls": "show_extra_controls", + "x_axis_label": "x_axis_title", + "x_axis_format": "x_axis_time_format", + "x_ticks_layout": "xAxisLabelRotation", + "y_axis_label": "y_axis_title", + "y_axis_showminmax": "truncateYAxis", + "y_log_scale": "logAxis", + } + remove_keys = {"contribution", "show_brush", "show_markers"} + + 
def _pre_action(self) -> None: + self.data["contributionMode"] = "row" if self.data.get("contribution") else None + self.data["zoomable"] = self.data.get("show_brush") == "yes" + self.data["markerEnabled"] = self.data.get("show_markers") or False + self.data["y_axis_showminmax"] = True + + bottom_margin = self.data.get("bottom_margin") + if self.data.get("x_axis_label") and ( + not bottom_margin or bottom_margin == "auto" + ): + self.data["bottom_margin"] = 30 + + if (rolling_type := self.data.get("rolling_type")) and rolling_type != "None": + self.data["rolling_type"] = rolling_type + + if time_compare := self.data.get("time_compare"): + self.data["time_compare"] = [ + value + " ago" for value in as_list(time_compare) if value + ] + + comparison_type = self.data.get("comparison_type") or "values" + self.data["comparison_type"] = ( + "difference" if comparison_type == "absolute" else comparison_type + ) + + if x_ticks_layout := self.data.get("x_ticks_layout"): + self.data["x_ticks_layout"] = 45 if x_ticks_layout == "45°" else 0 + + +class MigrateLineChart(TimeseriesChart): + source_viz_type = "line" + target_viz_type = "echarts_timeseries_line" + + def _pre_action(self) -> None: + super()._pre_action() + + self.remove_keys.add("line_interpolation") + + line_interpolation = self.data.get("line_interpolation") + if line_interpolation == "cardinal": + self.target_viz_type = "echarts_timeseries_smooth" + elif line_interpolation == "step-before": + self.target_viz_type = "echarts_timeseries_step" + self.data["seriesType"] = "start" + elif line_interpolation == "step-after": + self.target_viz_type = "echarts_timeseries_step" + self.data["seriesType"] = "end" + + +class MigrateAreaChart(TimeseriesChart): + source_viz_type = "area" + target_viz_type = "echarts_area" + stacked_map = { + "expand": "Expand", + "stack": "Stack", + "stream": "Stream", + } + + def _pre_action(self) -> None: + super()._pre_action() + + self.remove_keys.add("stacked_style") + + self.data["stack"] 
= self.stacked_map.get( + self.data.get("stacked_style") or "stack" + ) + + self.data["opacity"] = 0.7 + + +class MigrateBubbleChart(MigrateViz): + source_viz_type = "bubble" + target_viz_type = "bubble_v2" + rename_keys = { + "bottom_margin": "x_axis_title_margin", + "left_margin": "y_axis_title_margin", + "limit": "row_limit", + "x_axis_format": "xAxisFormat", + "x_log_scale": "logXAxis", + "x_ticks_layout": "xAxisLabelRotation", + "y_axis_showminmax": "truncateYAxis", + "y_log_scale": "logYAxis", + } + remove_keys = {"x_axis_showminmax"} + + def _pre_action(self) -> None: + bottom_margin = self.data.get("bottom_margin") + if self.data.get("x_axis_label") and ( + not bottom_margin or bottom_margin == "auto" + ): + self.data["bottom_margin"] = 30 + + if x_ticks_layout := self.data.get("x_ticks_layout"): + self.data["x_ticks_layout"] = 45 if x_ticks_layout == "45°" else 0 + + # Truncate y-axis by default to preserve layout + self.data["y_axis_showminmax"] = True diff --git a/superset/migrations/shared/security_converge.py b/superset/migrations/shared/security_converge.py index 9b1730a2a1464..42a68acb24369 100644 --- a/superset/migrations/shared/security_converge.py +++ b/superset/migrations/shared/security_converge.py @@ -243,7 +243,6 @@ def migrate_roles( if new_pvm not in role.permissions: logger.info(f"Add {new_pvm} to {role}") role.permissions.append(new_pvm) - session.merge(role) # Delete old permissions _delete_old_permissions(session, pvm_map) diff --git a/superset/migrations/shared/utils.py b/superset/migrations/shared/utils.py index 32e7dc1a3992e..2ae0dfeac158a 100644 --- a/superset/migrations/shared/utils.py +++ b/superset/migrations/shared/utils.py @@ -43,11 +43,9 @@ def table_has_column(table: str, column: str) -> bool: :param column: A column name :returns: True iff the column exists in the table """ - config = op.get_context().config - engine = engine_from_config( - config.get_section(config.config_ini_section), prefix="sqlalchemy." 
- ) - insp = reflection.Inspector.from_engine(engine) + + insp = inspect(op.get_context().bind) + try: return any(col["name"] == column for col in insp.get_columns(table)) except NoSuchTableError: diff --git a/superset/migrations/versions/2016-04-25_08-54_c3a8f8611885_materializing_permission.py b/superset/migrations/versions/2016-04-25_08-54_c3a8f8611885_materializing_permission.py index b92378f092a6e..c3d04e875a822 100644 --- a/superset/migrations/versions/2016-04-25_08-54_c3a8f8611885_materializing_permission.py +++ b/superset/migrations/versions/2016-04-25_08-54_c3a8f8611885_materializing_permission.py @@ -56,7 +56,6 @@ def upgrade(): for slc in session.query(Slice).all(): if slc.datasource: slc.perm = slc.datasource.perm - session.merge(slc) session.commit() db.session.close() diff --git a/superset/migrations/versions/2016-09-07_23-50_33d996bcc382_update_slice_model.py b/superset/migrations/versions/2016-09-07_23-50_33d996bcc382_update_slice_model.py index f4373a3f383ef..8f4542cb3cd9f 100644 --- a/superset/migrations/versions/2016-09-07_23-50_33d996bcc382_update_slice_model.py +++ b/superset/migrations/versions/2016-09-07_23-50_33d996bcc382_update_slice_model.py @@ -56,7 +56,6 @@ def upgrade(): slc.datasource_id = slc.druid_datasource_id if slc.table_id: slc.datasource_id = slc.table_id - session.merge(slc) session.commit() session.close() @@ -69,7 +68,6 @@ def downgrade(): slc.druid_datasource_id = slc.datasource_id if slc.datasource_type == "table": slc.table_id = slc.datasource_id - session.merge(slc) session.commit() session.close() op.drop_column("slices", "datasource_id") diff --git a/superset/migrations/versions/2017-01-24_12-31_db0c65b146bd_update_slice_model_json.py b/superset/migrations/versions/2017-01-24_12-31_db0c65b146bd_update_slice_model_json.py index 1f3dbab6367aa..0bae8cd9a374b 100644 --- a/superset/migrations/versions/2017-01-24_12-31_db0c65b146bd_update_slice_model_json.py +++ 
b/superset/migrations/versions/2017-01-24_12-31_db0c65b146bd_update_slice_model_json.py @@ -57,7 +57,6 @@ def upgrade(): try: d = json.loads(slc.params or "{}") slc.params = json.dumps(d, indent=2, sort_keys=True) - session.merge(slc) session.commit() print(f"Upgraded ({i}/{slice_len}): {slc.slice_name}") except Exception as ex: diff --git a/superset/migrations/versions/2017-02-08_14-16_a99f2f7c195a_rewriting_url_from_shortner_with_new_.py b/superset/migrations/versions/2017-02-08_14-16_a99f2f7c195a_rewriting_url_from_shortner_with_new_.py index 8e97ada3cd69f..8dafb77beedd1 100644 --- a/superset/migrations/versions/2017-02-08_14-16_a99f2f7c195a_rewriting_url_from_shortner_with_new_.py +++ b/superset/migrations/versions/2017-02-08_14-16_a99f2f7c195a_rewriting_url_from_shortner_with_new_.py @@ -80,7 +80,6 @@ def upgrade(): "/".join(split[:-1]) + "/?form_data=" + parse.quote_plus(json.dumps(d)) ) url.url = newurl - session.merge(url) session.commit() print(f"Updating url ({i}/{urls_len})") session.close() diff --git a/superset/migrations/versions/2017-12-08_08-19_67a6ac9b727b_update_spatial_params.py b/superset/migrations/versions/2017-12-08_08-19_67a6ac9b727b_update_spatial_params.py index 6073e8b84c915..81bbb47914961 100644 --- a/superset/migrations/versions/2017-12-08_08-19_67a6ac9b727b_update_spatial_params.py +++ b/superset/migrations/versions/2017-12-08_08-19_67a6ac9b727b_update_spatial_params.py @@ -58,7 +58,6 @@ def upgrade(): del params["latitude"] del params["longitude"] slc.params = json.dumps(params) - session.merge(slc) session.commit() session.close() diff --git a/superset/migrations/versions/2017-12-17_11-06_21e88bc06c02_annotation_migration.py b/superset/migrations/versions/2017-12-17_11-06_21e88bc06c02_annotation_migration.py index 4b1b807a6fddc..785e282397a1f 100644 --- a/superset/migrations/versions/2017-12-17_11-06_21e88bc06c02_annotation_migration.py +++ b/superset/migrations/versions/2017-12-17_11-06_21e88bc06c02_annotation_migration.py @@ -69,7 
+69,6 @@ def upgrade(): ) params["annotation_layers"] = new_layers slc.params = json.dumps(params) - session.merge(slc) session.commit() session.close() @@ -86,6 +85,5 @@ def downgrade(): if layers: params["annotation_layers"] = [layer["value"] for layer in layers] slc.params = json.dumps(params) - session.merge(slc) session.commit() session.close() diff --git a/superset/migrations/versions/2018-02-13_08-07_e866bd2d4976_smaller_grid.py b/superset/migrations/versions/2018-02-13_08-07_e866bd2d4976_smaller_grid.py index bf6276d702c7b..6241ab2a3985a 100644 --- a/superset/migrations/versions/2018-02-13_08-07_e866bd2d4976_smaller_grid.py +++ b/superset/migrations/versions/2018-02-13_08-07_e866bd2d4976_smaller_grid.py @@ -62,7 +62,6 @@ def upgrade(): pos["v"] = 1 dashboard.position_json = json.dumps(positions, indent=2) - session.merge(dashboard) session.commit() session.close() @@ -85,6 +84,5 @@ def downgrade(): pos["v"] = 0 dashboard.position_json = json.dumps(positions, indent=2) - session.merge(dashboard) session.commit() pass diff --git a/superset/migrations/versions/2018-04-10_11-19_bf706ae5eb46_cal_heatmap_metric_to_metrics.py b/superset/migrations/versions/2018-04-10_11-19_bf706ae5eb46_cal_heatmap_metric_to_metrics.py index 49b19b9c696fd..2aa703cfec76a 100644 --- a/superset/migrations/versions/2018-04-10_11-19_bf706ae5eb46_cal_heatmap_metric_to_metrics.py +++ b/superset/migrations/versions/2018-04-10_11-19_bf706ae5eb46_cal_heatmap_metric_to_metrics.py @@ -59,7 +59,6 @@ def upgrade(): params["metrics"] = [params.get("metric")] del params["metric"] slc.params = json.dumps(params, indent=2, sort_keys=True) - session.merge(slc) session.commit() print(f"Upgraded ({i}/{slice_len}): {slc.slice_name}") except Exception as ex: diff --git a/superset/migrations/versions/2018-07-22_11-59_bebcf3fed1fe_convert_dashboard_v1_positions.py b/superset/migrations/versions/2018-07-22_11-59_bebcf3fed1fe_convert_dashboard_v1_positions.py index 620e2c5008e62..3dc0bcc4557b2 100644 --- 
a/superset/migrations/versions/2018-07-22_11-59_bebcf3fed1fe_convert_dashboard_v1_positions.py +++ b/superset/migrations/versions/2018-07-22_11-59_bebcf3fed1fe_convert_dashboard_v1_positions.py @@ -647,7 +647,6 @@ def upgrade(): sorted_by_key = collections.OrderedDict(sorted(v2_layout.items())) dashboard.position_json = json.dumps(sorted_by_key, indent=2) - session.merge(dashboard) session.commit() else: print(f"Skip converted dash_id: {dashboard.id}") diff --git a/superset/migrations/versions/2018-07-26_11-10_c82ee8a39623_add_implicit_tags.py b/superset/migrations/versions/2018-07-26_11-10_c82ee8a39623_add_implicit_tags.py index 0179ba7d0348e..c6a66d6b531aa 100644 --- a/superset/migrations/versions/2018-07-26_11-10_c82ee8a39623_add_implicit_tags.py +++ b/superset/migrations/versions/2018-07-26_11-10_c82ee8a39623_add_implicit_tags.py @@ -33,7 +33,7 @@ from sqlalchemy import Column, DateTime, Enum, ForeignKey, Integer, String from sqlalchemy.ext.declarative import declarative_base, declared_attr -from superset.tags.models import ObjectTypes, TagTypes +from superset.tags.models import ObjectType, TagType from superset.utils.core import get_user_id Base = declarative_base() @@ -77,7 +77,7 @@ class Tag(Base, AuditMixinNullable): id = Column(Integer, primary_key=True) name = Column(String(250), unique=True) - type = Column(Enum(TagTypes)) + type = Column(Enum(TagType)) class TaggedObject(Base, AuditMixinNullable): @@ -86,7 +86,7 @@ class TaggedObject(Base, AuditMixinNullable): id = Column(Integer, primary_key=True) tag_id = Column(Integer, ForeignKey("tag.id")) object_id = Column(Integer) - object_type = Column(Enum(ObjectTypes)) + object_type = Column(Enum(ObjectType)) class User(Base): diff --git a/superset/migrations/versions/2018-08-01_11-47_7fcdcde0761c_.py b/superset/migrations/versions/2018-08-01_11-47_7fcdcde0761c_.py index 02021799e96ec..111cea4506bf5 100644 --- a/superset/migrations/versions/2018-08-01_11-47_7fcdcde0761c_.py +++ 
b/superset/migrations/versions/2018-08-01_11-47_7fcdcde0761c_.py @@ -76,7 +76,6 @@ def upgrade(): dashboard.id, len(original_text), len(text) ) ) - session.merge(dashboard) session.commit() diff --git a/superset/migrations/versions/2019-04-09_16-27_80aa3f04bc82_add_parent_ids_in_dashboard_layout.py b/superset/migrations/versions/2019-04-09_16-27_80aa3f04bc82_add_parent_ids_in_dashboard_layout.py index c6361009ee4fa..47c8a6cbcc99a 100644 --- a/superset/migrations/versions/2019-04-09_16-27_80aa3f04bc82_add_parent_ids_in_dashboard_layout.py +++ b/superset/migrations/versions/2019-04-09_16-27_80aa3f04bc82_add_parent_ids_in_dashboard_layout.py @@ -80,7 +80,6 @@ def upgrade(): dashboard.position_json = json.dumps( layout, indent=None, separators=(",", ":"), sort_keys=True ) - session.merge(dashboard) except Exception as ex: logging.exception(ex) @@ -110,7 +109,6 @@ def downgrade(): dashboard.position_json = json.dumps( layout, indent=None, separators=(",", ":"), sort_keys=True ) - session.merge(dashboard) except Exception as ex: logging.exception(ex) diff --git a/superset/migrations/versions/2019-05-06_14-30_afc69274c25a_alter_sql_column_data_type_in_query_mysql_table.py b/superset/migrations/versions/2019-05-06_14-30_afc69274c25a_alter_sql_column_data_type_in_query_mysql_table.py index 5d3a76798df81..9fb9acae7a90e 100644 --- a/superset/migrations/versions/2019-05-06_14-30_afc69274c25a_alter_sql_column_data_type_in_query_mysql_table.py +++ b/superset/migrations/versions/2019-05-06_14-30_afc69274c25a_alter_sql_column_data_type_in_query_mysql_table.py @@ -24,7 +24,7 @@ """ import sqlalchemy as sa from alembic import op -from sqlalchemy.databases import mysql +from sqlalchemy.dialects import mysql from sqlalchemy.dialects.mysql.base import MySQLDialect # revision identifiers, used by Alembic. 
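The `table_has_column` change in `superset/migrations/shared/utils.py` earlier in this diff replaces a hand-built engine (constructed from the Alembic config via `engine_from_config`) with SQLAlchemy's `inspect()` on the migration context's existing bind. A minimal standalone sketch of that pattern, using an in-memory SQLite engine purely for illustration (the table and column names are hypothetical):

```python
from sqlalchemy import Column, Integer, MetaData, String, Table, create_engine, inspect
from sqlalchemy.exc import NoSuchTableError


def table_has_column(bind, table: str, column: str) -> bool:
    """Return True iff the column exists on the table, mirroring the migration helper."""
    insp = inspect(bind)
    try:
        return any(col["name"] == column for col in insp.get_columns(table))
    except NoSuchTableError:
        return False


# Illustrative stand-in for the Alembic bind (op.get_context().bind).
engine = create_engine("sqlite://")
metadata = MetaData()
Table(
    "tables",
    metadata,
    Column("id", Integer, primary_key=True),
    Column("always_filter_main_dttm", String),
)
metadata.create_all(engine)

print(table_has_column(engine, "tables", "always_filter_main_dttm"))  # True
print(table_has_column(engine, "tables", "missing_column"))           # False
print(table_has_column(engine, "no_such_table", "id"))                # False
```

Reusing the live bind keeps the check inside the same connection/transaction Alembic is already using, which matters for dialects where DDL is transactional.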
diff --git a/superset/migrations/versions/2019-12-03_13-50_89115a40e8ea_change_table_schema_description_to_long_.py b/superset/migrations/versions/2019-12-03_13-50_89115a40e8ea_change_table_schema_description_to_long_.py index a1d37ab40f331..7679baa5aa337 100644 --- a/superset/migrations/versions/2019-12-03_13-50_89115a40e8ea_change_table_schema_description_to_long_.py +++ b/superset/migrations/versions/2019-12-03_13-50_89115a40e8ea_change_table_schema_description_to_long_.py @@ -28,7 +28,7 @@ import sqlalchemy as sa from alembic import op -from sqlalchemy.databases import mysql +from sqlalchemy.dialects import mysql from sqlalchemy.dialects.mysql.base import MySQLDialect diff --git a/superset/migrations/versions/2020-02-07_14-13_3325d4caccc8_dashboard_scoped_filters.py b/superset/migrations/versions/2020-02-07_14-13_3325d4caccc8_dashboard_scoped_filters.py index 5aa38fd13a48d..ec02a8ca8414e 100644 --- a/superset/migrations/versions/2020-02-07_14-13_3325d4caccc8_dashboard_scoped_filters.py +++ b/superset/migrations/versions/2020-02-07_14-13_3325d4caccc8_dashboard_scoped_filters.py @@ -99,8 +99,6 @@ def upgrade(): ) else: dashboard.json_metadata = None - - session.merge(dashboard) except Exception as ex: logging.exception(f"dashboard {dashboard.id} has error: {ex}") diff --git a/superset/migrations/versions/2020-08-12_00-24_978245563a02_migrate_iframe_to_dash_markdown.py b/superset/migrations/versions/2020-08-12_00-24_978245563a02_migrate_iframe_to_dash_markdown.py index 4202de45609fd..70f1fcc07c690 100644 --- a/superset/migrations/versions/2020-08-12_00-24_978245563a02_migrate_iframe_to_dash_markdown.py +++ b/superset/migrations/versions/2020-08-12_00-24_978245563a02_migrate_iframe_to_dash_markdown.py @@ -163,7 +163,6 @@ def upgrade(): separators=(",", ":"), sort_keys=True, ) - session.merge(dashboard) # remove iframe, separator and markup charts slices_to_remove = ( diff --git 
a/superset/migrations/versions/2020-09-28_17-57_b56500de1855_add_uuid_column_to_import_mixin.py b/superset/migrations/versions/2020-09-28_17-57_b56500de1855_add_uuid_column_to_import_mixin.py index 9ff117b1e2a3c..574ca1536a34c 100644 --- a/superset/migrations/versions/2020-09-28_17-57_b56500de1855_add_uuid_column_to_import_mixin.py +++ b/superset/migrations/versions/2020-09-28_17-57_b56500de1855_add_uuid_column_to_import_mixin.py @@ -96,7 +96,6 @@ def update_position_json(dashboard, session, uuid_map): del object_["meta"]["uuid"] dashboard.position_json = json.dumps(layout, indent=4) - session.merge(dashboard) def update_dashboards(session, uuid_map): diff --git a/superset/migrations/versions/2021-02-14_11-46_1412ec1e5a7b_legacy_force_directed_to_echart.py b/superset/migrations/versions/2021-02-14_11-46_1412ec1e5a7b_legacy_force_directed_to_echart.py index 4407c1f8b7d41..24a81270d1e37 100644 --- a/superset/migrations/versions/2021-02-14_11-46_1412ec1e5a7b_legacy_force_directed_to_echart.py +++ b/superset/migrations/versions/2021-02-14_11-46_1412ec1e5a7b_legacy_force_directed_to_echart.py @@ -70,7 +70,6 @@ def upgrade(): slc.params = json.dumps(params) slc.viz_type = "graph_chart" - session.merge(slc) session.commit() session.close() @@ -100,6 +99,5 @@ def downgrade(): slc.params = json.dumps(params) slc.viz_type = "directed_force" - session.merge(slc) session.commit() session.close() diff --git a/superset/migrations/versions/2022-08-16_15-23_6d3c6f9d665d_fix_table_chart_conditional_formatting_.py b/superset/migrations/versions/2022-08-16_15-23_6d3c6f9d665d_fix_table_chart_conditional_formatting_.py index 30caf7efa111b..8d9f07093542b 100644 --- a/superset/migrations/versions/2022-08-16_15-23_6d3c6f9d665d_fix_table_chart_conditional_formatting_.py +++ b/superset/migrations/versions/2022-08-16_15-23_6d3c6f9d665d_fix_table_chart_conditional_formatting_.py @@ -72,7 +72,6 @@ def upgrade(): new_conditional_formatting.append(formatter) params["conditional_formatting"] = 
new_conditional_formatting slc.params = json.dumps(params) - session.merge(slc) session.commit() session.close() diff --git a/superset/migrations/versions/2023-09-06_13-18_317970b4400c_added_time_secondary_column_to_.py b/superset/migrations/versions/2023-09-06_13-18_317970b4400c_added_time_secondary_column_to_.py index 859a6fe5903ea..4972a869110ad 100755 --- a/superset/migrations/versions/2023-09-06_13-18_317970b4400c_added_time_secondary_column_to_.py +++ b/superset/migrations/versions/2023-09-06_13-18_317970b4400c_added_time_secondary_column_to_.py @@ -32,7 +32,7 @@ from sqlalchemy.orm import Session from superset import db -from superset.migrations.shared.utils import paginated_update +from superset.migrations.shared.utils import paginated_update, table_has_column Base = declarative_base() @@ -45,23 +45,25 @@ class SqlaTable(Base): def upgrade(): - op.add_column( - "tables", - sa.Column( - "always_filter_main_dttm", - sa.Boolean(), - nullable=True, - default=False, - server_default=sa.false(), - ), - ) + if not table_has_column("tables", "always_filter_main_dttm"): + op.add_column( + "tables", + sa.Column( + "always_filter_main_dttm", + sa.Boolean(), + nullable=True, + default=False, + server_default=sa.false(), + ), + ) - bind = op.get_bind() - session = db.Session(bind=bind) + bind = op.get_bind() + session = db.Session(bind=bind) - for table in paginated_update(session.query(SqlaTable)): - table.always_filter_main_dttm = False + for table in paginated_update(session.query(SqlaTable)): + table.always_filter_main_dttm = False def downgrade(): - op.drop_column("tables", "always_filter_main_dttm") + if table_has_column("tables", "always_filter_main_dttm"): + op.drop_column("tables", "always_filter_main_dttm") diff --git a/superset/temporary_cache/commands/__init__.py b/superset/migrations/versions/2023-12-01_12-03_b7851ee5522f_replay_317970b4400c.py similarity index 60% rename from superset/temporary_cache/commands/__init__.py rename to 
superset/migrations/versions/2023-12-01_12-03_b7851ee5522f_replay_317970b4400c.py index 13a83393a9124..b4286736f04d5 100644 --- a/superset/temporary_cache/commands/__init__.py +++ b/superset/migrations/versions/2023-12-01_12-03_b7851ee5522f_replay_317970b4400c.py @@ -14,3 +14,31 @@ # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. +"""replay 317970b4400c + +Revision ID: b7851ee5522f +Revises: 4b85906e5b91 +Create Date: 2023-12-01 12:03:27.538945 + +""" + +# revision identifiers, used by Alembic. +revision = "b7851ee5522f" +down_revision = "4b85906e5b91" + +from importlib import import_module + +import sqlalchemy as sa +from alembic import op + +module = import_module( + "superset.migrations.versions.2023-09-06_13-18_317970b4400c_added_time_secondary_column_to_" +) + + +def upgrade(): + module.upgrade() + + +def downgrade(): + module.downgrade() diff --git a/superset/models/core.py b/superset/models/core.py index f6e4b972b48db..eece661ec5143 100755 --- a/superset/models/core.py +++ b/superset/models/core.py @@ -59,8 +59,8 @@ from sqlalchemy.sql import ColumnElement, expression, Select from superset import app, db_engine_specs +from superset.commands.database.exceptions import DatabaseInvalidError from superset.constants import LRU_CACHE_MAX_SIZE, PASSWORD_MASK -from superset.databases.commands.exceptions import DatabaseInvalidError from superset.databases.utils import make_url_safe from superset.db_engine_specs.base import MetricType, TimeGrain from superset.extensions import ( @@ -185,6 +185,7 @@ class Database( "is_managed_externally", "external_url", "encrypted_extra", + "impersonate_user", ] export_children = ["tables"] @@ -236,6 +237,11 @@ def disable_data_preview(self) -> bool: # this will prevent any 'trash value' strings from going through return self.get_extra().get("disable_data_preview", False) is True + @property + def schema_options(self) -> dict[str, Any]: + 
"""Additional schema display config for engines with complex schemas""" + return self.get_extra().get("schema_options", {}) + @property def data(self) -> dict[str, Any]: return { @@ -247,6 +253,7 @@ def data(self) -> dict[str, Any]: "allows_cost_estimate": self.allows_cost_estimate, "allows_virtual_table_explore": self.allows_virtual_table_explore, "explore_database_id": self.explore_database_id, + "schema_options": self.schema_options, "parameters": self.parameters, "disable_data_preview": self.disable_data_preview, "parameters_schema": self.parameters_schema, @@ -837,7 +844,9 @@ def get_columns( self, table_name: str, schema: str | None = None ) -> list[ResultSetColumnType]: with self.get_inspector_with_context() as inspector: - return self.db_engine_spec.get_columns(inspector, table_name, schema) + return self.db_engine_spec.get_columns( + inspector, table_name, schema, self.schema_options + ) def get_metrics( self, @@ -965,7 +974,7 @@ def make_sqla_column_compatible( """ label_expected = label or sqla_col.name # add quotes to tables - if self.db_engine_spec.allows_alias_in_select: + if self.db_engine_spec.get_allows_alias_in_select(self): label = self.db_engine_spec.make_label_compatible(label_expected) sqla_col = sqla_col.label(label) sqla_col.key = label_expected diff --git a/superset/models/dashboard.py b/superset/models/dashboard.py index 18c94aa179cee..919c832ab5b4f 100644 --- a/superset/models/dashboard.py +++ b/superset/models/dashboard.py @@ -47,8 +47,12 @@ from sqlalchemy.sql.elements import BinaryExpression from superset import app, db, is_feature_enabled, security_manager -from superset.connectors.base.models import BaseDatasource -from superset.connectors.sqla.models import SqlaTable, SqlMetric, TableColumn +from superset.connectors.sqla.models import ( + BaseDatasource, + SqlaTable, + SqlMetric, + TableColumn, +) from superset.daos.datasource import DatasourceDAO from superset.extensions import cache_manager from superset.models.filter_set import 
FilterSet diff --git a/superset/models/helpers.py b/superset/models/helpers.py index 83ec9ba37c4fa..3e88bec44f9a3 100644 --- a/superset/models/helpers.py +++ b/superset/models/helpers.py @@ -68,7 +68,12 @@ ) from superset.extensions import feature_flag_manager from superset.jinja_context import BaseTemplateProcessor -from superset.sql_parse import has_table_query, insert_rls, ParsedQuery, sanitize_clause +from superset.sql_parse import ( + has_table_query, + insert_rls_in_predicate, + ParsedQuery, + sanitize_clause, +) from superset.superset_typing import ( AdhocMetric, Column as ColumnTyping, @@ -128,7 +133,7 @@ def validate_adhoc_subquery( level=ErrorLevel.ERROR, ) ) - statement = insert_rls(statement, database_id, default_schema) + statement = insert_rls_in_predicate(statement, database_id, default_schema) statements.append(statement) return ";\n".join(str(statement) for statement in statements) @@ -700,10 +705,7 @@ class ExploreMixin: # pylint: disable=too-many-public-methods "MIN": sa.func.MIN, "MAX": sa.func.MAX, } - - @property - def fetch_value_predicate(self) -> str: - return "fix this!" 
+ fetch_values_predicate = None @property def type(self) -> str: @@ -765,7 +767,7 @@ def db_engine_spec(self) -> builtins.type["BaseEngineSpec"]: raise NotImplementedError() @property - def database(self) -> builtins.type["Database"]: + def database(self) -> "Database": raise NotImplementedError() @property @@ -780,17 +782,20 @@ def sql(self) -> str: def columns(self) -> list[Any]: raise NotImplementedError() - def get_fetch_values_predicate( - self, template_processor: Optional[BaseTemplateProcessor] = None - ) -> TextClause: - raise NotImplementedError() - def get_extra_cache_keys(self, query_obj: dict[str, Any]) -> list[Hashable]: raise NotImplementedError() def get_template_processor(self, **kwargs: Any) -> BaseTemplateProcessor: raise NotImplementedError() + def get_fetch_values_predicate( + self, + template_processor: Optional[ # pylint: disable=unused-argument + BaseTemplateProcessor + ] = None, + ) -> TextClause: + return self.fetch_values_predicate + def get_sqla_row_level_filters( self, template_processor: BaseTemplateProcessor, @@ -865,7 +870,7 @@ def make_sqla_column_compatible( label_expected = label or sqla_col.name db_engine_spec = self.db_engine_spec # add quotes to tables - if db_engine_spec.allows_alias_in_select: + if db_engine_spec.get_allows_alias_in_select(self.database): label = db_engine_spec.make_label_compatible(label_expected) sqla_col = sqla_col.label(label) sqla_col.key = label_expected @@ -900,7 +905,7 @@ def get_query_str_extended( self, query_obj: QueryObjectDict, mutate: bool = True ) -> QueryStringExtended: sqlaq = self.get_sqla_query(**query_obj) - sql = self.database.compile_sqla_query(sqlaq.sqla_query) # type: ignore + sql = self.database.compile_sqla_query(sqlaq.sqla_query) sql = self._apply_cte(sql, sqlaq.cte) sql = sqlparse.format(sql, reindent=True) if mutate: @@ -939,7 +944,7 @@ def _normalize_prequery_result_type( value = value.item() column_ = columns_by_name[dimension] - db_extra: dict[str, Any] = 
self.database.get_extra() # type: ignore + db_extra: dict[str, Any] = self.database.get_extra() if isinstance(column_, dict): if ( @@ -1024,9 +1029,7 @@ def assign_column_label(df: pd.DataFrame) -> Optional[pd.DataFrame]: return df try: - df = self.database.get_df( - sql, self.schema, mutator=assign_column_label # type: ignore - ) + df = self.database.get_df(sql, self.schema, mutator=assign_column_label) except Exception as ex: # pylint: disable=broad-except df = pd.DataFrame() status = QueryStatus.FAILED @@ -1337,37 +1340,46 @@ def get_time_filter( # pylint: disable=too-many-arguments ) return and_(*l) - def values_for_column(self, column_name: str, limit: int = 10000) -> list[Any]: - """Runs query against sqla to retrieve some - sample values for the given column. - """ - cols = {} - for col in self.columns: - if isinstance(col, dict): - cols[col.get("column_name")] = col - else: - cols[col.column_name] = col - - target_col = cols[column_name] - tp = None # todo(hughhhh): add back self.get_template_processor() + def values_for_column( + self, column_name: str, limit: int = 10000, denormalize_column: bool = False + ) -> list[Any]: + # denormalize column name before querying for values + # unless disabled in the dataset configuration + db_dialect = self.database.get_dialect() + column_name_ = ( + self.database.db_engine_spec.denormalize_name(db_dialect, column_name) + if denormalize_column + else column_name + ) + cols = {col.column_name: col for col in self.columns} + target_col = cols[column_name_] + tp = self.get_template_processor() tbl, cte = self.get_from_clause(tp) - if isinstance(target_col, dict): - sql_column = sa.column(target_col.get("name")) - else: - sql_column = target_col - - qry = sa.select([sql_column]).select_from(tbl).distinct() + qry = ( + sa.select( + # The alias (label) here is important because some dialects will + # automatically add a random alias to the projection because of the + # call to DISTINCT; others will uppercase the column 
names. This + # gives us a deterministic column name in the dataframe. + [target_col.get_sqla_col(template_processor=tp).label("column_values")] + ) + .select_from(tbl) + .distinct() + ) if limit: qry = qry.limit(limit) - with self.database.get_sqla_engine_with_context() as engine: # type: ignore + if self.fetch_values_predicate: + qry = qry.where(self.get_fetch_values_predicate(template_processor=tp)) + + with self.database.get_sqla_engine_with_context() as engine: sql = qry.compile(engine, compile_kwargs={"literal_binds": True}) sql = self._apply_cte(sql, cte) sql = self.mutate_query_from_config(sql) df = pd.read_sql_query(sql=sql, con=engine) - return df[column_name].to_list() + return df["column_values"].to_list() def get_timestamp_expression( self, @@ -1939,7 +1951,7 @@ def get_sqla_query( # pylint: disable=too-many-arguments,too-many-locals,too-ma ) having_clause_and += [self.text(having)] - if apply_fetch_values_predicate and self.fetch_values_predicate: # type: ignore + if apply_fetch_values_predicate and self.fetch_values_predicate: qry = qry.where( self.get_fetch_values_predicate(template_processor=template_processor) ) @@ -1958,7 +1970,7 @@ def get_sqla_query( # pylint: disable=too-many-arguments,too-many-locals,too-ma col = col.element if ( - db_engine_spec.allows_alias_in_select + db_engine_spec.get_allows_alias_in_select(self.database) and db_engine_spec.allows_hidden_cc_in_orderby and col.name in [select_col.name for select_col in select_exprs] ): diff --git a/superset/models/slice.py b/superset/models/slice.py index 248f4ee947e7d..b41bb72a85496 100644 --- a/superset/models/slice.py +++ b/superset/models/slice.py @@ -51,7 +51,7 @@ if TYPE_CHECKING: from superset.common.query_context import QueryContext from superset.common.query_context_factory import QueryContextFactory - from superset.connectors.base.models import BaseDatasource + from superset.connectors.sqla.models import BaseDatasource metadata = Model.metadata # pylint: disable=no-member 
slice_user = Table( diff --git a/superset/queries/saved_queries/api.py b/superset/queries/saved_queries/api.py index 69e1a6191bca5..ce283dd6d6797 100644 --- a/superset/queries/saved_queries/api.py +++ b/superset/queries/saved_queries/api.py @@ -32,19 +32,17 @@ NoValidFilesFoundError, ) from superset.commands.importers.v1.utils import get_contents_from_bundle +from superset.commands.query.delete import DeleteSavedQueryCommand +from superset.commands.query.exceptions import ( + SavedQueryDeleteFailedError, + SavedQueryNotFoundError, +) +from superset.commands.query.export import ExportSavedQueriesCommand +from superset.commands.query.importers.dispatcher import ImportSavedQueriesCommand from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod from superset.databases.filters import DatabaseFilter from superset.extensions import event_logger from superset.models.sql_lab import SavedQuery -from superset.queries.saved_queries.commands.delete import DeleteSavedQueryCommand -from superset.queries.saved_queries.commands.exceptions import ( - SavedQueryDeleteFailedError, - SavedQueryNotFoundError, -) -from superset.queries.saved_queries.commands.export import ExportSavedQueriesCommand -from superset.queries.saved_queries.commands.importers.dispatcher import ( - ImportSavedQueriesCommand, -) from superset.queries.saved_queries.filters import ( SavedQueryAllTextFilter, SavedQueryFavoriteFilter, @@ -84,7 +82,11 @@ class SavedQueryRestApi(BaseSupersetModelRestApi): base_filters = [["id", SavedQueryFilter, lambda: []]] show_columns = [ + "changed_on", "changed_on_delta_humanized", + "changed_by.first_name", + "changed_by.id", + "changed_by.last_name", "created_by.first_name", "created_by.id", "created_by.last_name", @@ -99,7 +101,11 @@ class SavedQueryRestApi(BaseSupersetModelRestApi): "template_parameters", ] list_columns = [ + "changed_on", "changed_on_delta_humanized", + "changed_by.first_name", + "changed_by.id", + "changed_by.last_name", "created_on", 
"created_by.first_name", "created_by.id", @@ -142,7 +148,7 @@ class SavedQueryRestApi(BaseSupersetModelRestApi): "last_run_delta_humanized", ] - search_columns = ["id", "database", "label", "schema", "created_by"] + search_columns = ["id", "database", "label", "schema", "created_by", "changed_by"] if is_feature_enabled("TAGGING_SYSTEM"): search_columns += ["tags"] search_filters = { @@ -163,7 +169,7 @@ class SavedQueryRestApi(BaseSupersetModelRestApi): "database": "database_name", } base_related_field_filters = {"database": [["id", DatabaseFilter, lambda: []]]} - allowed_rel_fields = {"database"} + allowed_rel_fields = {"database", "changed_by", "created_by"} allowed_distinct_fields = {"schema"} def pre_add(self, item: SavedQuery) -> None: diff --git a/superset/reports/api.py b/superset/reports/api.py index 3116aef3b825f..8238213fefd72 100644 --- a/superset/reports/api.py +++ b/superset/reports/api.py @@ -26,13 +26,9 @@ from superset import is_feature_enabled from superset.charts.filters import ChartFilter -from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod -from superset.dashboards.filters import DashboardAccessFilter -from superset.databases.filters import DatabaseFilter -from superset.extensions import event_logger -from superset.reports.commands.create import CreateReportScheduleCommand -from superset.reports.commands.delete import DeleteReportScheduleCommand -from superset.reports.commands.exceptions import ( +from superset.commands.report.create import CreateReportScheduleCommand +from superset.commands.report.delete import DeleteReportScheduleCommand +from superset.commands.report.exceptions import ( ReportScheduleCreateFailedError, ReportScheduleDeleteFailedError, ReportScheduleForbiddenError, @@ -40,7 +36,11 @@ ReportScheduleNotFoundError, ReportScheduleUpdateFailedError, ) -from superset.reports.commands.update import UpdateReportScheduleCommand +from superset.commands.report.update import UpdateReportScheduleCommand +from 
superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod +from superset.dashboards.filters import DashboardAccessFilter +from superset.databases.filters import DatabaseFilter +from superset.extensions import event_logger from superset.reports.filters import ReportScheduleAllTextFilter, ReportScheduleFilter from superset.reports.models import ReportSchedule from superset.reports.schemas import ( @@ -198,6 +198,7 @@ def ensure_alert_reports_enabled(self) -> Optional[Response]: search_columns = [ "name", "active", + "changed_by", "created_by", "owners", "type", @@ -207,7 +208,14 @@ def ensure_alert_reports_enabled(self) -> Optional[Response]: "chart_id", ] search_filters = {"name": [ReportScheduleAllTextFilter]} - allowed_rel_fields = {"owners", "chart", "dashboard", "database", "created_by"} + allowed_rel_fields = { + "owners", + "chart", + "dashboard", + "database", + "created_by", + "changed_by", + } base_related_field_filters = { "chart": [["id", ChartFilter, lambda: []]], diff --git a/superset/reports/notifications/slack.py b/superset/reports/notifications/slack.py index a769622b57640..fbae398bc50d6 100644 --- a/superset/reports/notifications/slack.py +++ b/superset/reports/notifications/slack.py @@ -44,6 +44,7 @@ NotificationParamException, NotificationUnprocessableException, ) +from superset.utils.core import get_email_address_list from superset.utils.decorators import statsd_gauge logger = logging.getLogger(__name__) @@ -60,7 +61,15 @@ class SlackNotification(BaseNotification): # pylint: disable=too-few-public-met type = ReportRecipientType.SLACK def _get_channel(self) -> str: - return json.loads(self._recipient.recipient_config_json)["target"] + """ + Get the recipient's channel(s). + Note Slack SDK uses "channel" to refer to one or more + channels. Multiple channels are demarcated by a comma. 
+ :returns: The comma-separated list of channel(s) + """ + recipient_str = json.loads(self._recipient.recipient_config_json)["target"] + + return ",".join(get_email_address_list(recipient_str)) def _message_template(self, table: str = "") -> str: return __( diff --git a/superset/result_set.py b/superset/result_set.py index 82832eb8ea4ac..5483271035031 100644 --- a/superset/result_set.py +++ b/superset/result_set.py @@ -29,6 +29,7 @@ from superset.db_engine_specs import BaseEngineSpec from superset.superset_typing import DbapiDescription, DbapiResult, ResultSetColumnType from superset.utils import core as utils +from superset.utils.core import GenericDataType logger = logging.getLogger(__name__) @@ -222,6 +223,18 @@ def is_temporal(self, db_type_str: Optional[str]) -> bool: return False return column_spec.is_dttm + def type_generic( + self, db_type_str: Optional[str] + ) -> Optional[utils.GenericDataType]: + column_spec = self.db_engine_spec.get_column_spec(db_type_str) + if column_spec is None: + return None + + if column_spec.is_dttm: + return GenericDataType.TEMPORAL + + return column_spec.generic_type + def data_type(self, col_name: str, pa_dtype: pa.DataType) -> Optional[str]: """Given a pyarrow data type, returns a generic database type""" if set_type := self._type_dict.get(col_name): @@ -255,7 +268,8 @@ def columns(self) -> list[ResultSetColumnType]: "column_name": col.name, "name": col.name, "type": db_type_str, - "is_dttm": self.is_temporal(db_type_str), + "type_generic": self.type_generic(db_type_str), + "is_dttm": self.is_temporal(db_type_str) or False, } columns.append(column) return columns diff --git a/superset/row_level_security/api.py b/superset/row_level_security/api.py index 0a823f74d627d..fc505e724ffa3 100644 --- a/superset/row_level_security/api.py +++ b/superset/row_level_security/api.py @@ -28,14 +28,14 @@ DatasourceNotFoundValidationError, RolesNotFoundValidationError, ) +from superset.commands.security.create import CreateRLSRuleCommand +from
superset.commands.security.delete import DeleteRLSRuleCommand +from superset.commands.security.exceptions import RLSRuleNotFoundError +from superset.commands.security.update import UpdateRLSRuleCommand from superset.connectors.sqla.models import RowLevelSecurityFilter from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod from superset.daos.exceptions import DAOCreateFailedError, DAOUpdateFailedError from superset.extensions import event_logger -from superset.row_level_security.commands.create import CreateRLSRuleCommand -from superset.row_level_security.commands.delete import DeleteRLSRuleCommand -from superset.row_level_security.commands.exceptions import RLSRuleNotFoundError -from superset.row_level_security.commands.update import UpdateRLSRuleCommand from superset.row_level_security.schemas import ( get_delete_ids_schema, openapi_spec_methods_override, @@ -77,6 +77,9 @@ class RLSRestApi(BaseSupersetModelRestApi): "roles.name", "clause", "changed_on_delta_humanized", + "changed_by.first_name", + "changed_by.last_name", + "changed_by.id", "group_key", ] order_columns = [ @@ -115,6 +118,8 @@ class RLSRestApi(BaseSupersetModelRestApi): "roles", "group_key", "clause", + "created_by", + "changed_by", ) edit_columns = add_columns @@ -123,7 +128,7 @@ class RLSRestApi(BaseSupersetModelRestApi): add_model_schema = RLSPostSchema() edit_model_schema = RLSPutSchema() - allowed_rel_fields = {"tables", "roles"} + allowed_rel_fields = {"tables", "roles", "created_by", "changed_by"} base_related_field_filters = { "tables": [["id", DatasourceFilter, lambda: []]], "roles": [["id", BaseFilterRelatedRoles, lambda: []]], diff --git a/superset/row_level_security/schemas.py b/superset/row_level_security/schemas.py index 6c8249b875a05..f02767ec1334d 100644 --- a/superset/row_level_security/schemas.py +++ b/superset/row_level_security/schemas.py @@ -20,6 +20,7 @@ from marshmallow.validate import Length, OneOf from superset.connectors.sqla.models import 
RowLevelSecurityFilter +from superset.dashboards.schemas import UserSchema from superset.utils.core import RowLevelSecurityFilterType id_description = "Unique id of RLS filter" @@ -81,6 +82,7 @@ class RLSListSchema(Schema): ) group_key = fields.String(metadata={"description": "group_key_description"}) description = fields.String(metadata={"description": "description_description"}) + changed_by = fields.Nested(UserSchema(exclude=["username"])) class RLSShowSchema(Schema): diff --git a/superset/security/api.py b/superset/security/api.py index b4a306975976a..acafc3257028b 100644 --- a/superset/security/api.py +++ b/superset/security/api.py @@ -24,7 +24,7 @@ from flask_wtf.csrf import generate_csrf from marshmallow import EXCLUDE, fields, post_load, Schema, ValidationError -from superset.embedded_dashboard.commands.exceptions import ( +from superset.commands.dashboard.embedded.exceptions import ( EmbeddedDashboardNotFoundError, ) from superset.extensions import event_logger diff --git a/superset/security/manager.py b/superset/security/manager.py index a14d6ecfc4e5a..a2bb67464845b 100644 --- a/superset/security/manager.py +++ b/superset/security/manager.py @@ -79,8 +79,11 @@ if TYPE_CHECKING: from superset.common.query_context import QueryContext - from superset.connectors.base.models import BaseDatasource - from superset.connectors.sqla.models import RowLevelSecurityFilter, SqlaTable + from superset.connectors.sqla.models import ( + BaseDatasource, + RowLevelSecurityFilter, + SqlaTable, + ) from superset.models.core import Database from superset.models.dashboard import Dashboard from superset.models.sql_lab import Query @@ -878,7 +881,6 @@ def copy_role( ): role_from_permissions.append(permission_view) role_to.permissions = role_from_permissions - self.get_session.merge(role_to) self.get_session.commit() def set_role( @@ -900,7 +902,6 @@ def set_role( permission_view for permission_view in pvms if pvm_check(permission_view) ] role.permissions = role_pvms -
self.get_session.merge(role) self.get_session.commit() def _is_admin_only(self, pvm: PermissionView) -> bool: @@ -2154,10 +2155,10 @@ def _get_guest_token_jwt_audience() -> str: @staticmethod def validate_guest_token_resources(resources: GuestTokenResources) -> None: # pylint: disable=import-outside-toplevel - from superset.daos.dashboard import EmbeddedDashboardDAO - from superset.embedded_dashboard.commands.exceptions import ( + from superset.commands.dashboard.embedded.exceptions import ( EmbeddedDashboardNotFoundError, ) + from superset.daos.dashboard import EmbeddedDashboardDAO from superset.models.dashboard import Dashboard for resource in resources: diff --git a/superset/sql_lab.py b/superset/sql_lab.py index 4d71e23d88cee..efbef6560a366 100644 --- a/superset/sql_lab.py +++ b/superset/sql_lab.py @@ -48,7 +48,12 @@ from superset.models.core import Database from superset.models.sql_lab import Query from superset.result_set import SupersetResultSet -from superset.sql_parse import CtasMethod, insert_rls, ParsedQuery +from superset.sql_parse import ( + CtasMethod, + insert_rls_as_subquery, + insert_rls_in_predicate, + ParsedQuery, +) from superset.sqllab.limiting_factor import LimitingFactor from superset.sqllab.utils import write_ipc_buffer from superset.utils.celery import session_scope @@ -191,7 +196,7 @@ def get_sql_results( # pylint: disable=too-many-arguments return handle_query_error(ex, query, session) -def execute_sql_statement( # pylint: disable=too-many-arguments +def execute_sql_statement( # pylint: disable=too-many-arguments, too-many-locals sql_statement: str, query: Query, session: Session, @@ -205,6 +210,16 @@ def execute_sql_statement( # pylint: disable=too-many-arguments parsed_query = ParsedQuery(sql_statement) if is_feature_enabled("RLS_IN_SQLLAB"): + # There are two ways to insert RLS: either replacing the table with a subquery + # that has the RLS, or appending the RLS to the ``WHERE`` clause. 
The former is + # safer, but not supported in all databases. + insert_rls = ( + insert_rls_as_subquery + if database.db_engine_spec.allows_subqueries + and database.db_engine_spec.allows_alias_in_select + else insert_rls_in_predicate + ) + # Insert any applicable RLS predicates parsed_query = ParsedQuery( str( diff --git a/superset/sql_parse.py b/superset/sql_parse.py index d75551bef0767..cecd673276976 100644 --- a/superset/sql_parse.py +++ b/superset/sql_parse.py @@ -44,6 +44,7 @@ Punctuation, String, Whitespace, + Wildcard, ) from sqlparse.utils import imt @@ -660,18 +661,116 @@ def get_rls_for_table( return None rls = sqlparse.parse(predicate)[0] - add_table_name(rls, str(dataset)) + add_table_name(rls, table.table) return rls -def insert_rls( +def insert_rls_as_subquery( token_list: TokenList, database_id: int, default_schema: Optional[str], ) -> TokenList: """ Update a statement in place, applying any associated RLS predicates. + + The RLS predicate is applied as a subquery replacing the original table: + + before: SELECT * FROM some_table WHERE 1=1 + after: SELECT * FROM ( + SELECT * FROM some_table WHERE some_table.id=42 + ) AS some_table + WHERE 1=1 + + This method is safer than ``insert_rls_in_predicate``, but doesn't work in all + databases.
+ """ + rls: Optional[TokenList] = None + state = InsertRLSState.SCANNING + for token in token_list.tokens: + # Recurse into child token list + if isinstance(token, TokenList): + i = token_list.tokens.index(token) + token_list.tokens[i] = insert_rls_as_subquery( + token, + database_id, + default_schema, + ) + + # Found a source keyword (FROM/JOIN) + if imt(token, m=[(Keyword, "FROM"), (Keyword, "JOIN")]): + state = InsertRLSState.SEEN_SOURCE + + # Found identifier/keyword after FROM/JOIN, test for table + elif state == InsertRLSState.SEEN_SOURCE and ( + isinstance(token, Identifier) or token.ttype == Keyword + ): + rls = get_rls_for_table(token, database_id, default_schema) + if rls: + # replace table with subquery + subquery_alias = ( + token.tokens[-1].value + if isinstance(token, Identifier) + else token.value + ) + i = token_list.tokens.index(token) + + # strip alias from table name + if isinstance(token, Identifier) and token.has_alias(): + whitespace_index = token.token_next_by(t=Whitespace)[0] + token.tokens = token.tokens[:whitespace_index] + + token_list.tokens[i] = Identifier( + [ + Parenthesis( + [ + Token(Punctuation, "("), + Token(DML, "SELECT"), + Token(Whitespace, " "), + Token(Wildcard, "*"), + Token(Whitespace, " "), + Token(Keyword, "FROM"), + Token(Whitespace, " "), + token, + Token(Whitespace, " "), + Where( + [ + Token(Keyword, "WHERE"), + Token(Whitespace, " "), + rls, + ] + ), + Token(Punctuation, ")"), + ] + ), + Token(Whitespace, " "), + Token(Keyword, "AS"), + Token(Whitespace, " "), + Identifier([Token(Name, subquery_alias)]), + ] + ) + state = InsertRLSState.SCANNING + + # Found nothing, leaving source + elif state == InsertRLSState.SEEN_SOURCE and token.ttype != Whitespace: + state = InsertRLSState.SCANNING + + return token_list + + +def insert_rls_in_predicate( + token_list: TokenList, + database_id: int, + default_schema: Optional[str], +) -> TokenList: + """ + Update a statement inplace applying any associated RLS predicates. 
+ + The RLS predicate is ``AND``ed to any existing predicates: + + before: SELECT * FROM some_table WHERE 1=1 + after: SELECT * FROM some_table WHERE ( 1=1) AND some_table.id=42 + """ rls: Optional[TokenList] = None state = InsertRLSState.SCANNING @@ -679,7 +778,11 @@ def insert_rls( # Recurse into child token list if isinstance(token, TokenList): i = token_list.tokens.index(token) - token_list.tokens[i] = insert_rls(token, database_id, default_schema) + token_list.tokens[i] = insert_rls_in_predicate( + token, + database_id, + default_schema, + ) # Found a source keyword (FROM/JOIN) if imt(token, m=[(Keyword, "FROM"), (Keyword, "JOIN")]): diff --git a/superset/sqllab/api.py b/superset/sqllab/api.py index 16070b52cc4a0..6be378a9b5117 100644 --- a/superset/sqllab/api.py +++ b/superset/sqllab/api.py @@ -27,6 +27,10 @@ from marshmallow import ValidationError from superset import app, is_feature_enabled +from superset.commands.sql_lab.estimate import QueryEstimationCommand +from superset.commands.sql_lab.execute import CommandResult, ExecuteSqlCommand +from superset.commands.sql_lab.export import SqlResultExportCommand +from superset.commands.sql_lab.results import SqlExecutionResultsCommand from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP from superset.daos.database import DatabaseDAO from superset.daos.query import QueryDAO @@ -35,10 +39,6 @@ from superset.models.sql_lab import Query from superset.sql_lab import get_sql_results from superset.sqllab.command_status import SqlJsonExecutionStatus -from superset.sqllab.commands.estimate import QueryEstimationCommand -from superset.sqllab.commands.execute import CommandResult, ExecuteSqlCommand -from superset.sqllab.commands.export import SqlResultExportCommand -from superset.sqllab.commands.results import SqlExecutionResultsCommand from superset.sqllab.exceptions import ( QueryIsForbiddenToAccessException, SqlLabException, diff --git a/superset/sqllab/query_render.py b/superset/sqllab/query_render.py index 
4fb64c8ce2d76..f4c1c26c6eb4e 100644 --- a/superset/sqllab/query_render.py +++ b/superset/sqllab/query_render.py @@ -24,9 +24,9 @@ from jinja2.meta import find_undeclared_variables from superset import is_feature_enabled +from superset.commands.sql_lab.execute import SqlQueryRender from superset.errors import SupersetErrorType from superset.sql_parse import ParsedQuery -from superset.sqllab.commands.execute import SqlQueryRender from superset.sqllab.exceptions import SqlLabException from superset.utils import core as utils diff --git a/superset/sqllab/validators.py b/superset/sqllab/validators.py index 5bc8a622531c2..b79789da4ccfe 100644 --- a/superset/sqllab/validators.py +++ b/superset/sqllab/validators.py @@ -20,7 +20,7 @@ from typing import TYPE_CHECKING from superset import security_manager -from superset.sqllab.commands.execute import CanAccessQueryValidator +from superset.commands.sql_lab.execute import CanAccessQueryValidator if TYPE_CHECKING: from superset.models.sql_lab import Query diff --git a/superset/superset_typing.py b/superset/superset_typing.py index 953683b5dcd01..c71dcea3f1a2d 100644 --- a/superset/superset_typing.py +++ b/superset/superset_typing.py @@ -84,6 +84,8 @@ class ResultSetColumnType(TypedDict): scale: NotRequired[Any] max_length: NotRequired[Any] + query_as: NotRequired[Any] + CacheConfig = dict[str, Any] DbapiDescriptionRow = tuple[ diff --git a/superset/tags/api.py b/superset/tags/api.py index e9842f5a6a658..c0df921e3ebf1 100644 --- a/superset/tags/api.py +++ b/superset/tags/api.py @@ -22,16 +22,12 @@ from flask_appbuilder.models.sqla.interface import SQLAInterface from marshmallow import ValidationError -from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod -from superset.daos.tag import TagDAO -from superset.exceptions import MissingUserContextException -from superset.extensions import event_logger -from superset.tags.commands.create import ( +from superset.commands.tag.create import ( 
CreateCustomTagCommand, CreateCustomTagWithRelationshipsCommand, ) -from superset.tags.commands.delete import DeleteTaggedObjectCommand, DeleteTagsCommand -from superset.tags.commands.exceptions import ( +from superset.commands.tag.delete import DeleteTaggedObjectCommand, DeleteTagsCommand +from superset.commands.tag.exceptions import ( TagDeleteFailedError, TaggedObjectDeleteFailedError, TaggedObjectNotFoundError, @@ -39,8 +35,12 @@ TagNotFoundError, TagUpdateFailedError, ) -from superset.tags.commands.update import UpdateTagCommand -from superset.tags.models import ObjectTypes, Tag +from superset.commands.tag.update import UpdateTagCommand +from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod +from superset.daos.tag import TagDAO +from superset.exceptions import MissingUserContextException +from superset.extensions import event_logger +from superset.tags.models import ObjectType, Tag from superset.tags.schemas import ( delete_tags_schema, openapi_spec_methods_override, @@ -117,7 +117,7 @@ class TagRestApi(BaseSupersetModelRestApi): related_field_filters = { "created_by": RelatedFieldFilter("first_name", FilterRelatedOwners), } - allowed_rel_fields = {"created_by"} + allowed_rel_fields = {"created_by", "changed_by"} add_model_schema = TagPostSchema() edit_model_schema = TagPutSchema() @@ -364,7 +364,7 @@ def put(self, pk: int) -> Response: action=lambda self, *args, **kwargs: f"{self.__class__.__name__}.add_objects", log_to_statsd=False, ) - def add_objects(self, object_type: ObjectTypes, object_id: int) -> Response: + def add_objects(self, object_type: ObjectType, object_id: int) -> Response: """Add tags to an object. Create new tags if they do not already exist. 
--- post: @@ -429,7 +429,7 @@ def add_objects(self, object_type: ObjectTypes, object_id: int) -> Response: log_to_statsd=True, ) def delete_object( - self, object_type: ObjectTypes, object_id: int, tag: str + self, object_type: ObjectType, object_id: int, tag: str ) -> Response: """Delete a tagged object. --- @@ -584,12 +584,21 @@ def get_objects(self) -> Response: 500: $ref: '#/components/responses/500' """ + tag_ids = [ + tag_id for tag_id in request.args.get("tagIds", "").split(",") if tag_id + ] tags = [tag for tag in request.args.get("tags", "").split(",") if tag] # filter types types = [type_ for type_ in request.args.get("types", "").split(",") if type_] try: - tagged_objects = TagDAO.get_tagged_objects_for_tags(tags, types) + if tag_ids: + # prioritize using ids for lookups vs. names; mainly using this + # for backward compatibility + tagged_objects = TagDAO.get_tagged_objects_by_tag_id(tag_ids, types) + else: + tagged_objects = TagDAO.get_tagged_objects_for_tags(tags, types) + result = [ self.object_entity_response_schema.dump(tagged_object) for tagged_object in tagged_objects @@ -609,11 +618,11 @@ def get_objects(self) -> Response: log_to_statsd=False, ) def favorite_status(self, **kwargs: Any) -> Response: - """Favorite Stars for Dashboards + """Favorite Stars for Tags --- get: description: >- - Check favorited dashboards for current user + Get favorited tags for current user parameters: - in: query name: q diff --git a/superset/tags/models.py b/superset/tags/models.py index 7825f283bfc95..bae4417507bd4 100644 --- a/superset/tags/models.py +++ b/superset/tags/models.py @@ -19,10 +19,11 @@ import enum from typing import TYPE_CHECKING +from flask import escape from flask_appbuilder import Model -from sqlalchemy import Column, Enum, ForeignKey, Integer, String, Table, Text +from sqlalchemy import Column, Enum, ForeignKey, Integer, orm, String, Table, Text from sqlalchemy.engine.base import Connection -from sqlalchemy.orm import relationship, Session,
sessionmaker
+from sqlalchemy.orm import relationship, sessionmaker
 from sqlalchemy.orm.mapper import Mapper
 
 from superset import security_manager
@@ -35,7 +36,7 @@ from superset.models.slice import Slice
 from superset.models.sql_lab import Query
 
-Session = sessionmaker(autoflush=False)
+Session = sessionmaker()
 
 user_favorite_tag_table = Table(
     "user_favorite_tag",
@@ -45,8 +46,7 @@
 )
 
 
-class TagTypes(enum.Enum):
-
+class TagType(enum.Enum):
     """
     Types for tags.
@@ -65,8 +65,7 @@ class TagTypes(enum.Enum):
     favorited_by = 4
 
 
-class ObjectTypes(enum.Enum):
-
+class ObjectType(enum.Enum):
     """Object types."""
 
     # pylint: disable=invalid-name
@@ -83,7 +82,7 @@ class Tag(Model, AuditMixinNullable):
     __tablename__ = "tag"
     id = Column(Integer, primary_key=True)
     name = Column(String(250), unique=True)
-    type = Column(Enum(TagTypes))
+    type = Column(Enum(TagType))
     description = Column(Text)
 
     objects = relationship(
@@ -108,27 +107,27 @@ class TaggedObject(Model, AuditMixinNullable):
         ForeignKey("slices.id"),
         ForeignKey("saved_query.id"),
     )
-    object_type = Column(Enum(ObjectTypes))
+    object_type = Column(Enum(ObjectType))
 
     tag = relationship("Tag", back_populates="objects", overlaps="tags")
 
 
-def get_tag(name: str, session: Session, type_: TagTypes) -> Tag:
+def get_tag(name: str, session: orm.Session, type_: TagType) -> Tag:
     tag_name = name.strip()
     tag = session.query(Tag).filter_by(name=tag_name, type=type_).one_or_none()
     if tag is None:
-        tag = Tag(name=tag_name, type=type_)
+        tag = Tag(name=escape(tag_name), type=type_)
         session.add(tag)
         session.commit()
     return tag
 
 
-def get_object_type(class_name: str) -> ObjectTypes:
+def get_object_type(class_name: str) -> ObjectType:
     mapping = {
-        "slice": ObjectTypes.chart,
-        "dashboard": ObjectTypes.dashboard,
-        "query": ObjectTypes.query,
-        "dataset": ObjectTypes.dataset,
+        "slice": ObjectType.chart,
+        "dashboard": ObjectType.dashboard,
+        "query": ObjectType.query,
+        "dataset": ObjectType.dataset,
     }
     try:
         return mapping[class_name.lower()]
@@ -150,12 +149,12 @@ def get_owners_ids(
     @classmethod
     def _add_owners(
         cls,
-        session: Session,
+        session: orm.Session,
         target: Dashboard | FavStar | Slice | Query | SqlaTable,
     ) -> None:
         for owner_id in cls.get_owners_ids(target):
             name = f"owner:{owner_id}"
-            tag = get_tag(name, session, TagTypes.owner)
+            tag = get_tag(name, session, TagType.owner)
             tagged_object = TaggedObject(
                 tag_id=tag.id, object_id=target.id, object_type=cls.object_type
             )
@@ -168,21 +167,17 @@ def after_insert(
         connection: Connection,
         target: Dashboard | FavStar | Slice | Query | SqlaTable,
     ) -> None:
-        session = Session(bind=connection)
-
-        try:
+        with Session(bind=connection) as session:
             # add `owner:` tags
             cls._add_owners(session, target)
 
             # add `type:` tags
-            tag = get_tag(f"type:{cls.object_type}", session, TagTypes.type)
+            tag = get_tag(f"type:{cls.object_type}", session, TagType.type)
             tagged_object = TaggedObject(
                 tag_id=tag.id, object_id=target.id, object_type=cls.object_type
             )
             session.add(tagged_object)
             session.commit()
-        finally:
-            session.close()
 
     @classmethod
     def after_update(
@@ -191,9 +186,7 @@ def after_update(
         connection: Connection,
         target: Dashboard | FavStar | Slice | Query | SqlaTable,
     ) -> None:
-        session = Session(bind=connection)
-
-        try:
+        with Session(bind=connection) as session:
             # delete current `owner:` tags
             query = (
                 session.query(TaggedObject.id)
@@ -201,7 +194,7 @@
                 .filter(
                     TaggedObject.object_type == cls.object_type,
                     TaggedObject.object_id == target.id,
-                    Tag.type == TagTypes.owner,
+                    Tag.type == TagType.owner,
                 )
             )
             ids = [row[0] for row in query]
@@ -212,8 +205,6 @@
             # add `owner:` tags
             cls._add_owners(session, target)
             session.commit()
-        finally:
-            session.close()
 
     @classmethod
     def after_delete(
@@ -222,9 +213,7 @@
         connection: Connection,
         target: Dashboard | FavStar | Slice | Query | SqlaTable,
     ) -> None:
-        session = Session(bind=connection)
-
-        try:
+        with Session(bind=connection) as session:
             # delete row from `tagged_objects`
             session.query(TaggedObject).filter(
                 TaggedObject.object_type == cls.object_type,
@@ -232,8 +221,6 @@
             ).delete()
             session.commit()
-        finally:
-            session.close()
 
 
 class ChartUpdater(ObjectUpdater):
@@ -273,10 +260,9 @@ class FavStarUpdater:
     def after_insert(
         cls, _mapper: Mapper, connection: Connection, target: FavStar
     ) -> None:
-        session = Session(bind=connection)
-        try:
+        with Session(bind=connection) as session:
             name = f"favorited_by:{target.user_id}"
-            tag = get_tag(name, session, TagTypes.favorited_by)
+            tag = get_tag(name, session, TagType.favorited_by)
             tagged_object = TaggedObject(
                 tag_id=tag.id,
                 object_id=target.obj_id,
@@ -284,22 +270,19 @@
             )
             session.add(tagged_object)
             session.commit()
-        finally:
-            session.close()
 
     @classmethod
     def after_delete(
         cls, _mapper: Mapper, connection: Connection, target: FavStar
     ) -> None:
-        session = Session(bind=connection)
-        try:
+        with Session(bind=connection) as session:
             name = f"favorited_by:{target.user_id}"
             query = (
                 session.query(TaggedObject.id)
                 .join(Tag)
                 .filter(
                     TaggedObject.object_id == target.obj_id,
-                    Tag.type == TagTypes.favorited_by,
+                    Tag.type == TagType.favorited_by,
                     Tag.name == name,
                 )
             )
@@ -309,5 +292,3 @@
             )
             session.commit()
-        finally:
-            session.close()
diff --git a/superset/tasks/async_queries.py b/superset/tasks/async_queries.py
index 609af3bc8ed3d..61970ca1f3801 100644
--- a/superset/tasks/async_queries.py
+++ b/superset/tasks/async_queries.py
@@ -64,7 +64,7 @@ def load_chart_data_into_cache(
     form_data: dict[str, Any],
 ) -> None:
     # pylint: disable=import-outside-toplevel
-    from superset.charts.data.commands.get_data_command import ChartDataCommand
+    from superset.commands.chart.data.get_data_command import ChartDataCommand
 
     user = (
         security_manager.get_user_by_id(job_metadata.get("user_id"))
diff --git a/superset/tasks/scheduler.py b/superset/tasks/scheduler.py
index f3cc270b86347..7b1350a07d3af 100644
--- a/superset/tasks/scheduler.py
+++ b/superset/tasks/scheduler.py
@@ -22,11 +22,11 @@
 from superset import app, is_feature_enabled
 from superset.commands.exceptions import CommandException
+from superset.commands.report.exceptions import ReportScheduleUnexpectedError
+from superset.commands.report.execute import AsyncExecuteReportScheduleCommand
+from superset.commands.report.log_prune import AsyncPruneReportScheduleLogCommand
 from superset.daos.report import ReportScheduleDAO
 from superset.extensions import celery_app
-from superset.reports.commands.exceptions import ReportScheduleUnexpectedError
-from superset.reports.commands.execute import AsyncExecuteReportScheduleCommand
-from superset.reports.commands.log_prune import AsyncPruneReportScheduleLogCommand
 from superset.stats_logger import BaseStatsLogger
 from superset.tasks.cron_util import cron_schedule_window
 from superset.utils.celery import session_scope
diff --git a/superset/templates/superset/basic.html b/superset/templates/superset/basic.html
index 62bd6c81fc216..992fdea6b9385 100644
--- a/superset/templates/superset/basic.html
+++ b/superset/templates/superset/basic.html
@@ -1,74 +1,81 @@
-{#
-  Licensed to the Apache Software Foundation (ASF) under one
-  or more contributor license agreements. See the NOTICE file
-  distributed with this work for additional information
-  regarding copyright ownership. The ASF licenses this file
-  to you under the Apache License, Version 2.0 (the
-  "License"); you may not use this file except in compliance
-  with the License. You may obtain a copy of the License at
-
-    http://www.apache.org/licenses/LICENSE-2.0
-
-  Unless required by applicable law or agreed to in writing,
-  software distributed under the License is distributed on an
-  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-  KIND, either express or implied. See the License for the
-  specific language governing permissions and limitations
-  under the License.
-#}
+{# Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements. See the NOTICE file distributed with this work
+for additional information regarding copyright ownership. The ASF licenses this
+file to you under the Apache License, Version 2.0 (the "License"); you may not
+use this file except in compliance with the License. You may obtain a copy of
+the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by
+applicable law or agreed to in writing, software distributed under the License
+is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied. See the License for the specific language
+governing permissions and limitations under the License. #}
 <!DOCTYPE html>
-{% import 'appbuilder/general/lib.html' as lib %}
-{% from 'superset/partials/asset_bundle.html' import css_bundle, js_bundle with context %}
-
-{% set favicons = appbuilder.app.config['FAVICONS'] %}
+{% import 'appbuilder/general/lib.html' as lib %} {% from
+'superset/partials/asset_bundle.html' import css_bundle, js_bundle with context
+%} {% set favicons = appbuilder.app.config['FAVICONS'] %}
 <html>
   <head>
     <title>
-      {% block title %}
-        {% if title %}
-          {{ title }}
-        {% elif appbuilder and appbuilder.app_name %}
-          {{ appbuilder.app_name }}
-        {% endif %}
-      {% endblock %}
+      {% block title %} {% if title %} {{ title }} {% elif appbuilder and
+      appbuilder.app_name %} {{ appbuilder.app_name }} {% endif %} {% endblock
+      %}
    </title>
-    {% block head_meta %}{% endblock %}
-    {% block head_css %}
-      {% for favicon in favicons %}
-
-      {% endfor %}
-
-
-
-
-
-
-
-
-
-      {{ css_bundle("theme") }}
+    {% block head_meta %}{% endblock %} {% block head_css %} {% for favicon in
+    favicons %} {%
+    endfor %}
+
+
+
+
+
+
+
+
-    {% if entry %}
-      {{ css_bundle(entry) }}
-    {% endif %}
-
-    {% endblock %}
-
-    {{ js_bundle("theme") }}
+    {{ css_bundle("theme") }} {% if entry %} {{ css_bundle(entry) }} {% endif %}
+    {% endblock %} {{ js_bundle("theme") }}
+    />
-
-    {% block navbar %}
-      {% if not standalone_mode %}
-        {% include 'appbuilder/navbar.html' %}
-      {% endif %}
-    {% endblock %}
-
-    {% block body %}
-
-
-
+
+    {% block navbar %} {% if not standalone_mode %} {% include
+    'appbuilder/navbar.html' %} {% endif %} {% endblock %} {% block body %}
+
+
+
     {% endblock %}
-
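Note on the recurring change in `superset/tags/models.py` above: every hook swaps manual `try`/`finally` session cleanup for SQLAlchemy's context-manager protocol (`with Session(bind=connection) as session:`), available since SQLAlchemy 1.4. A minimal, self-contained sketch of the two patterns, using a throwaway in-memory SQLite engine instead of Superset's connection (the setup here is illustrative only, not Superset code):

```python
from sqlalchemy import create_engine, text
from sqlalchemy.orm import sessionmaker

engine = create_engine("sqlite://")  # in-memory database for illustration
Session = sessionmaker()

# Old pattern (removed by the diff): manual cleanup via try/finally.
session = Session(bind=engine)
try:
    session.execute(text("SELECT 1"))
    session.commit()
finally:
    session.close()

# New pattern (added by the diff): Session is itself a context manager
# in SQLAlchemy 1.4+, so it is closed on exit even if the body raises.
with Session(bind=engine) as session:
    result = session.execute(text("SELECT 1")).scalar()

print(result)  # 1
```

The context-manager form removes the repeated `finally: session.close()` boilerplate that the diff deletes in `after_insert`, `after_update`, and `after_delete`.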