Compare commits


81 Commits

Author SHA1 Message Date
guyu
9e0e128244 fix 2025-03-10 18:10:27 +08:00
Peter Lee
34e932556b Merge branch 'master' into redis-lock 2025-03-08 11:25:44 +08:00
Tsuneo Yoshioka
594e2f24ef Upgrade plotly.js to version 2 to fix the UI crashing issue (#7359)
* Upgrade plotly.js to version 2

* Fix styling error reported by styled
2025-03-05 14:30:28 +00:00
github-actions[bot]
3275a9e459 Snapshot: 25.03.0-dev 2025-03-01 00:35:44 +00:00
Shunki
3bad8c8e8c TiDB: Exclude INFORMATION_SCHEMA (#7352)
Co-authored-by: snickerjp <snickerjp@gmail.com>
2025-02-28 11:09:46 +09:00
Tsuneo Yoshioka
d0af4499d6 Sanitize NaN, Infinite, -Infinite causing error when saving as PostgreSQL JSON #7339 (2nd try) (#7348)
* Sanitize NaN, Infinite, -Infinite causing error when saving as PostgreSQL JSON #7339 (2nd try)

* Move JSON sanitize to the top of json_dumps

* Fix comment
2025-02-27 01:40:43 -08:00
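The sanitization this commit describes can be sketched as follows. This is an illustrative stand-in, not the actual Redash helper; `sanitize` and the sample `row` are hypothetical names. The point is that `float("nan")` and the infinities are valid Python floats but invalid JSON for a PostgreSQL `json`/`jsonb` column, so they must be replaced before serialization:

```python
import json
import math

def sanitize(value):
    # Recursively replace NaN, Infinity, and -Infinity with None so the
    # result serializes to JSON that PostgreSQL will accept.
    if isinstance(value, float) and not math.isfinite(value):
        return None
    if isinstance(value, dict):
        return {k: sanitize(v) for k, v in value.items()}
    if isinstance(value, list):
        return [sanitize(v) for v in value]
    return value

row = {"ratio": float("nan"), "growth": float("inf"), "count": 3}
print(json.dumps(sanitize(row)))  # {"ratio": null, "growth": null, "count": 3}
```

Doing this once at the top of the `json_dumps` path (as the commit's second bullet says) covers every caller instead of patching each save site.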
Ran Benita
4357ea56ae Fix UnboundLocalError when checking alerts for query (#7346)
This fixes the following exception:

```
UnboundLocalError: local variable 'value_is_number' referenced before assignment
  File "rq/worker.py", line 1431, in perform_job
    rv = job.perform()
  File "rq/job.py", line 1280, in perform
    self._result = self._execute()
  File "rq/job.py", line 1317, in _execute
    result = self.func(*self.args, **self.kwargs)
  File "redash/tasks/alerts.py", line 36, in check_alerts_for_query
    new_state = alert.evaluate()
  File "redash/models/__init__.py", line 1002, in evaluate
    new_state = next_state(op, value, threshold)
  File "redash/models/__init__.py", line 928, in next_state
    elif not value_is_number and op not in [OPERATORS.get("!="), OPERATORS.get("=="), OPERATORS.get("equals")]:
```
2025-02-25 09:15:20 -05:00
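The bug class behind this traceback is a local name bound on only one branch. A minimal reproduction, with hypothetical function names (not the actual `next_state` implementation):

```python
def next_state_buggy(value):
    if isinstance(value, (int, float)):
        value_is_number = True
    # When value is e.g. the string "N/A", value_is_number was never
    # assigned, so reading it raises UnboundLocalError.
    return value_is_number

def next_state_fixed(value):
    # Bind the flag unconditionally before any branch reads it.
    value_is_number = isinstance(value, (int, float))
    return value_is_number

print(next_state_fixed(42))  # True
try:
    next_state_buggy("N/A")
except UnboundLocalError as exc:
    print("buggy path:", exc)
```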
Tsuneo Yoshioka
5df5ca87a2 add NULLS LAST option for Query order (#7341) 2025-02-25 10:58:48 +08:00
Tsuneo Yoshioka
8387fe6fcb Fix the issue that chart(scatter, line, bubble...) having same x-value have wrong y-value (#7330) 2025-02-18 20:04:12 +00:00
snickerjp
e95de2ee4c Update oracledb package to version 2.5.1 and adjust Python version compatibility (#7316) 2025-02-18 23:00:09 +09:00
Lee2532
71902e5933 FIX : redash docker image TAG (#7280)
Co-authored-by: snickerjp <snickerjp@gmail.com>
2025-02-15 01:38:23 +09:00
Tsuneo Yoshioka
53eab14cef Make autocomplete always available (#7326) 2025-02-13 15:25:39 -05:00
Eric Radman
925bb91d8e Use absolute path for image resources (#7322)
When MULTI_ORG is enabled, 'static/' resolves to '<org>/static/'
2025-02-12 08:37:40 -05:00
Peter Lee
a50ea05b19 Merge branch 'master' into redis-lock 2025-02-08 12:18:16 +08:00
Tsuneo Yoshioka
ec2ca6f986 BigQuery: show column type on Schema Browser (#7257) 2025-02-05 18:25:39 +00:00
Matt Nelson
96ea0194e8 Fix errors in webex alert destination. Add formatting support for QUERY_RESULT_TABLE. (#7296)
* prevent text values in payload being detected as 'set' on send.
Webex send ERROR:: Object of type set is not JSON serializable

Signed-off-by: Matt Nelson <metheos@gmail.com>

* add support for formatted QUERY_RESULT_TABLE in webex card

Signed-off-by: Matt Nelson <metheos@gmail.com>

* don't try to send to blank destinations

Signed-off-by: Matt Nelson <metheos@gmail.com>

* fix handling of the encoded QUERY_RESULTS_TABLE text

Signed-off-by: Matt Nelson <metheos@gmail.com>

* re-sort imports for ruff

Signed-off-by: Matt Nelson <metheos@gmail.com>

* change formatter to black

Signed-off-by: Matt Nelson <metheos@gmail.com>

* Add additional tests for Webex notification handling

ensure blank entries are handled for room IDs and person emails.
ensure that the API is not called when no valid destinations are provided.
ensure proper attachment formatting for alerts containing 2D arrays.

Signed-off-by: Matt Nelson <metheos@gmail.com>

* Add test for Webex notification with 1D array handling

This commit introduces a new test case to verify that the Webex
notification function correctly handles a 1D array input in the alert body.
The test ensures that the expected payload is constructed properly and that
the requests.post method is called with the correct parameters.

Signed-off-by: Matt Nelson <metheos@gmail.com>

---------

Signed-off-by: Matt Nelson <metheos@gmail.com>
2025-02-04 11:05:13 +00:00
github-actions[bot]
2776992101 Snapshot: 25.02.0-dev 2025-02-01 00:33:52 +00:00
Arik Fraimovich
5cfa6bc217 Update ci.yml to match latest master 2025-01-31 10:29:54 +02:00
Arik Fraimovich
85f001982e GitHub Actions Workflow updates (#7298)
* Split out secrets requiring workflows

* Update target

* Update Cypress run command
2025-01-31 10:20:04 +02:00
guyu
06c9a2b21a fix 2025-01-24 11:48:29 +08:00
guyu
f841b217e8 format 2025-01-24 11:46:17 +08:00
guyu
af496fe5e3 for same query_text refresh just execution once 2025-01-24 11:41:18 +08:00
Motoi Washida
d03a2c4096 Fix error in rehash DB migration with Elasticsearch queries (#7292)
Fixes #7272
2025-01-22 21:19:59 -05:00
SeongTae Jeong
8c5890482a Use ARM64 runners instead of virtualization for ARM64 image builds (#7291) 2025-01-19 16:00:19 +10:00
Ezra Odio
10ce280a96 Default to not allow HTML content in tables (#7064)
Co-authored-by: Ezra Odio <eodio@starfishstorage.com>
2025-01-15 10:09:24 -05:00
dependabot[bot]
0dd7ac3d2e Bump virtualenv from 20.25.0 to 20.26.6 (#7276) 2025-01-14 01:45:58 +00:00
github-actions[bot]
4ee53a9445 Snapshot: 25.01.0-dev 2025-01-01 00:35:12 +00:00
SeongTae Jeong
c08292d90e Use Codecov token (#7265) 2024-12-30 21:06:09 +00:00
SeongTae Jeong
3142131cdd Bump actions/upload-artifact from v3 to v4 (#7266)
Related: https://github.blog/changelog/2024-04-16-deprecation-notice-v3-of-the-artifact-actions/
2024-12-30 15:31:03 -05:00
Daisuke Taniwaki
530c1a0734 Support result reuse in Athena data sources (#7202)
* Support result reuse

* Update pyathena to 2.25.2

* Separate options

* Regenerate the Poetry lock file

---------

Co-authored-by: SeongTae Jeong <seongtaejg@gmail.com>
2024-12-28 05:50:16 +09:00
dependabot[bot]
52dc1769a1 Bump jinja2 from 3.1.4 to 3.1.5 (#7262)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-27 13:55:21 +10:00
Eric Radman
b9583c0b48 Create workflow trigger for publishing release image (#7259)
Co-authored-by: Justin Clift <justin@postgresql.org>
2024-12-27 12:19:32 +10:00
Arik Fraimovich
89d7f54e90 Handle the case when query runner configuration is an empty dict. (#7258) 2024-12-24 09:42:39 -05:00
Tsuneo Yoshioka
d884da2b0b BigQuery: add date, datetime type mapping (#7252) 2024-12-18 14:24:45 +02:00
dependabot[bot]
f7d485082c Bump nanoid from 3.3.6 to 3.3.8 (#7249)
Bumps [nanoid](https://github.com/ai/nanoid) from 3.3.6 to 3.3.8.
- [Release notes](https://github.com/ai/nanoid/releases)
- [Changelog](https://github.com/ai/nanoid/blob/main/CHANGELOG.md)
- [Commits](https://github.com/ai/nanoid/compare/3.3.6...3.3.8)

---
updated-dependencies:
- dependency-name: nanoid
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-13 17:57:05 +09:00
Eric Radman
130ab1fe1a Update to paramiko-3.4.1 (#7240)
Solves the deprecation warning for TripleDES
Related: https://github.com/paramiko/paramiko/issues/2419
2024-12-07 11:23:45 +09:00
github-actions[bot]
2ff83679fe Snapshot: 24.12.0-dev 2024-12-01 00:40:40 +00:00
Eric Radman
de49b73855 Replace ptvsd with debugpy to match modern VS Code (#7234) 2024-11-27 08:19:05 +10:00
thiagogds
c12e68f5d1 Only evaluate the next state if there's a value (#7222)
I've experienced this on my Redash in production. I'm not sure what can cause the value to exist but be None. I guess it depends on the SQL query.

I followed the same idea of returning self.UNKNOWN_STATE for cases where we can't know what's happening.
2024-11-26 12:57:34 -05:00
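The guard this commit describes can be sketched like so; `evaluate_alert` and the state constants here are hypothetical stand-ins for the model code, shown only to illustrate the None-first check:

```python
UNKNOWN_STATE = "unknown"
TRIGGERED_STATE = "triggered"
OK_STATE = "ok"

def evaluate_alert(value, threshold):
    # A row can exist while its value is None (depending on the SQL
    # query), so only compare when a value is actually present.
    if value is None:
        return UNKNOWN_STATE
    return TRIGGERED_STATE if value > threshold else OK_STATE

print(evaluate_alert(None, 10))  # unknown
print(evaluate_alert(42, 10))    # triggered
```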
Eric Radman
baa9bbd505 Use head.sha for restyled checkout (#7227) 2024-11-22 10:34:16 +10:00
Arik Fraimovich
349cd5d031 Bring back version check & beacon reporting (#7211)
Co-authored-by: Restyled.io <commits@restyled.io>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-11-06 01:21:03 +00:00
github-actions[bot]
49277d27f8 Snapshot: 24.11.0-dev 2024-11-01 00:35:04 +00:00
Yeger
2aae5705c9 don't crash when there is no data (#7208)
* don't crash when there is no data

* Add test
2024-10-31 08:49:57 +00:00
dependabot[bot]
38d0579660 Bump elliptic from 6.5.7 to 6.6.0 (#7214)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-31 02:41:33 +00:00
Ezra Odio
673ba769c7 Fix issue with scheduled queries (#7111)
Co-authored-by: Ezra Odio <eodio@starfishstorage.com>
Co-authored-by: Arik Fraimovich <arik@arikfr.com>
2024-10-29 10:36:05 +00:00
Eric Radman
b922730482 Docker build: use heredoc for multi-line actions (#7210) 2024-10-29 10:23:15 +10:00
Arik Fraimovich
ba973eb1fe Fixes #6767: correctly rehash queries in a migration (#7184) 2024-10-25 01:00:29 +00:00
dependabot[bot]
d8dde6c544 Bump cryptography from 42.0.8 to 43.0.1 (#7205)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-25 00:27:23 +00:00
dependabot[bot]
d359a716a7 Bump http-proxy-middleware from 2.0.6 to 2.0.7 (#7204)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-24 23:56:09 +00:00
dependabot[bot]
ba4293912b Bump snowflake-connector-python from 3.12.0 to 3.12.3 (#7203)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-24 23:23:28 +00:00
thiagogds
ee359120ee Use correct redis connection (#7077) 2024-10-24 17:54:09 +10:00
thiagogds
04a25f4327 Fix RQ wrongly moving jobs to FailedJobRegistry (#7186)
Something changed in python-rq, and with the old code a job that ran for longer than 2 minutes would automatically be marked as failed, even though it kept running.

This causes a problem in the UI, because it looks as if the job stopped when it actually didn't.
2024-10-17 13:30:02 -04:00
Eric Radman
7c22756e66 Move restyled to a github action (#7191) 2024-10-16 09:45:25 +03:00
dependabot[bot]
a03668f5b2 Bump restrictedpython from 6.2 to 7.3 (#7181)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-01 13:22:44 +10:00
github-actions[bot]
e4a841a0c5 Snapshot: 24.10.0-dev 2024-10-01 00:34:37 +00:00
Zach Liu
38dc31a49b Get rid of the strange looking 0 following "Running..." and "runtime" (#7099)
* Snapshot: 24.08.0-dev

* no more Running...0 or runtime0

* also missing a space

* Restyled by prettier

* check if data_scanned is defined

otherwise we could get "Data Scanned ?" if it's not supported
by some data sources

---------

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: Restyled.io <commits@restyled.io>
2024-09-19 13:25:05 +03:00
Justin Clift
c42b15125c Automatically remove orphans when running make up (#7164) 2024-09-17 05:11:51 +00:00
dependabot[bot]
590d39bc8d Bump dompurify from 2.0.17 to 2.5.4 in /viz-lib (#7163)
Bumps [dompurify](https://github.com/cure53/DOMPurify) from 2.0.17 to 2.5.4.
- [Release notes](https://github.com/cure53/DOMPurify/releases)
- [Commits](https://github.com/cure53/DOMPurify/compare/2.0.17...2.5.4)

---
updated-dependencies:
- dependency-name: dompurify
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-17 13:54:32 +10:00
Justin Clift
79bbb248bb Update make up to automatically initialise the db (#7161)
It does this by (very quickly) checking if the organization table
is present, running `make create_database` if not.
2024-09-14 16:29:04 +08:00
Zach Liu
5cf0b7b038 Better error msg for token validation (#7159)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-09-14 12:03:20 +10:00
Justin Clift
fb1a056561 Add REDASH_HOST to the docker compose file (#7157)
This ensures emails generated in the development environment have the port number included in their urls.
2024-09-12 10:06:52 +00:00
dependabot[bot]
75e1ce4c9c Bump body-parser from 1.20.1 to 1.20.3 (#7156)
Bumps [body-parser](https://github.com/expressjs/body-parser) from 1.20.1 to 1.20.3.
- [Release notes](https://github.com/expressjs/body-parser/releases)
- [Changelog](https://github.com/expressjs/body-parser/blob/master/HISTORY.md)
- [Commits](https://github.com/expressjs/body-parser/compare/1.20.1...1.20.3)

---
updated-dependencies:
- dependency-name: body-parser
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-12 15:33:59 +10:00
dependabot[bot]
d6c6e3bb7a Bump express from 4.19.2 to 4.21.0 (#7155)
Bumps [express](https://github.com/expressjs/express) from 4.19.2 to 4.21.0.
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/4.21.0/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.19.2...4.21.0)

---
updated-dependencies:
- dependency-name: express
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-12 14:46:25 +10:00
dependabot[bot]
821c1a9488 Bump path-to-regexp from 3.2.0 to 3.3.0 (#7154)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-12 12:42:27 +10:00
Zach Liu
76eeea1f64 Make schema refresh timeout configurable via env var (#7114)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-09-09 01:40:35 +00:00
Justin Clift
2ab07f9fc3 Remove left over compose.base.yaml file (#7142) 2024-09-06 17:47:56 +10:00
Justin Clift
a85b9d7801 Update pymssql to fix some problems with macOS ARM64 (2.3.1) (#7140)
Related: https://github.com/pymssql/pymssql/blob/master/ChangeLog.rst
2024-09-04 17:27:02 +09:00
github-actions[bot]
3330815081 Snapshot: 24.09.0-dev 2024-09-01 00:35:07 +00:00
dependabot[bot]
c25c65bc04 Bump webpack from 5.88.2 to 5.94.0 in /viz-lib (#7135)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-31 13:48:45 +10:00
Eric Radman
79a4c4c9c9 Revert "Adding ability to fix table columns in place (#7019)" (#7131) 2024-08-26 22:57:47 +10:00
Justin Clift
58a7438cc8 Bump python-rapidjson to 1.20 (#7126) 2024-08-20 08:35:54 +00:00
Zach Liu
c073c1e154 Fix mismatched poetry version (#7122)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-08-16 16:01:53 +10:00
Justin Clift
159a329e26 Bump elliptic to version 6.5.7 to fix a Dependabot warning (#7120) 2024-08-14 14:11:38 +10:00
Ezra Odio
9de135c0bd Add option to choose color scheme for charts (#7062) 2024-08-08 13:08:49 -04:00
Zach Liu
285c2b6e56 Add data type to athena query runner (#7112)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-08-07 03:36:58 +00:00
dependabot[bot]
b1fe2d4162 Bump sentry-sdk from 1.28.1 to 2.8.0 (#7069)
The Dependabot alert for sentry-sdk says that the security fix has
been backported to the 1.x series as well, in version 1.45.1.

So, let's use that, as it should be more compatible than jumping to
a new major series version.
2024-08-06 10:05:21 +10:00
Zach Liu
a4f92a8fb5 Add data type to redshift query runner (#7109)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-08-06 08:43:13 +10:00
Ezra Odio
51ef625a30 Fix alert evaluation logic and issue with calculating min and max of columns without numbers (#7103) 2024-08-05 15:20:26 +00:00
Masayuki Takahashi
a2611b89a3 Fix a display order bug in MongoDB Query Runner (#7106) 2024-08-04 04:22:59 +10:00
SeongTae Jeong
a531597016 Add the option to take new custom version for Snapshot (#7096) 2024-08-02 06:08:16 +00:00
Justin Clift
e59c02f497 Bump bootstrap to 3.4.1
Related:
- https://blog.getbootstrap.com/2018/12/13/bootstrap-3-4-0/
- https://blog.getbootstrap.com/2019/02/13/bootstrap-4-3-1-and-3-4-1/
2024-08-02 13:37:17 +09:00
99 changed files with 3497 additions and 3812 deletions

```diff
@@ -3,7 +3,7 @@ on:
   push:
     branches:
       - master
-  pull_request_target:
+  pull_request:
     branches:
       - master
 env:
@@ -60,15 +60,17 @@ jobs:
           mkdir -p /tmp/test-results/unit-tests
           docker cp tests:/app/coverage.xml ./coverage.xml
           docker cp tests:/app/junit.xml /tmp/test-results/unit-tests/results.xml
-      - name: Upload coverage reports to Codecov
-        uses: codecov/codecov-action@v3
+      # - name: Upload coverage reports to Codecov
+      #   uses: codecov/codecov-action@v3
+      #   with:
+      #     token: ${{ secrets.CODECOV_TOKEN }}
       - name: Store Test Results
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         with:
-          name: test-results
+          name: backend-test-results
           path: /tmp/test-results
       - name: Store Coverage Results
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         with:
           name: coverage
           path: coverage.xml
@@ -94,9 +96,9 @@ jobs:
       - name: Run Lint
         run: yarn lint:ci
       - name: Store Test Results
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         with:
-          name: test-results
+          name: frontend-test-results
           path: /tmp/test-results
   frontend-unit-tests:
@@ -132,9 +134,9 @@
       COMPOSE_PROJECT_NAME: cypress
       CYPRESS_INSTALL_BINARY: 0
       PUPPETEER_SKIP_CHROMIUM_DOWNLOAD: 1
-      PERCY_TOKEN: ${{ secrets.PERCY_TOKEN }}
-      CYPRESS_PROJECT_ID: ${{ secrets.CYPRESS_PROJECT_ID }}
-      CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}
+      # PERCY_TOKEN: ${{ secrets.PERCY_TOKEN }}
+      # CYPRESS_PROJECT_ID: ${{ secrets.CYPRESS_PROJECT_ID }}
+      # CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}
     steps:
       - if: github.event.pull_request.mergeable == 'false'
         name: Exit if PR is not mergeable
@@ -169,7 +171,7 @@ jobs:
       - name: Copy Code Coverage Results
         run: docker cp cypress:/usr/src/app/coverage ./coverage || true
       - name: Store Coverage Results
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         with:
           name: coverage
           path: coverage
```

```diff
@@ -1,11 +1,24 @@
 name: Periodic Snapshot
-# 10 minutes after midnight on the first of every month
 on:
   schedule:
-    - cron: '10 0 1 * *'
+    - cron: '10 0 1 * *' # 10 minutes after midnight on the first of every month
+  workflow_dispatch:
+    inputs:
+      bump:
+        description: 'Bump the last digit of the version'
+        required: false
+        type: boolean
+      version:
+        description: 'Specific version to set'
+        required: false
+        default: ''
+env:
+  GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
 permissions:
+  actions: write
   contents: write
 jobs:
@@ -14,17 +27,59 @@ jobs:
     steps:
       - uses: actions/checkout@v4
         with:
-          ssh-key: ${{secrets.ACTION_PUSH_KEY}}
+          ssh-key: ${{ secrets.ACTION_PUSH_KEY }}
       - run: |
+          # https://api.github.com/users/github-actions[bot]
           git config user.name 'github-actions[bot]'
           git config user.email '41898282+github-actions[bot]@users.noreply.github.com'
-          TAG_NAME="$(date +%y.%m).0-dev"
+          # Function to bump the version
+          bump_version() {
+            local version="$1"
+            local IFS=.
+            read -r major minor patch <<< "$version"
+            patch=$((patch + 1))
+            echo "$major.$minor.$patch-dev"
+          }
+          # Determine the new version tag
+          if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
+            BUMP_INPUT="${{ github.event.inputs.bump }}"
+            SPECIFIC_VERSION="${{ github.event.inputs.version }}"
+            # Check if both bump and specific version are provided
+            if [ "$BUMP_INPUT" = "true" ] && [ -n "$SPECIFIC_VERSION" ]; then
+              echo "::error::Error: Cannot specify both bump and specific version."
+              exit 1
+            fi
+            if [ -n "$SPECIFIC_VERSION" ]; then
+              TAG_NAME="$SPECIFIC_VERSION-dev"
+            elif [ "$BUMP_INPUT" = "true" ]; then
+              CURRENT_VERSION=$(grep '"version":' package.json | awk -F\" '{print $4}')
+              TAG_NAME=$(bump_version "$CURRENT_VERSION")
+            else
+              echo "No version bump or specific version provided for manual dispatch."
+              exit 1
+            fi
+          else
+            TAG_NAME="$(date +%y.%m).0-dev"
+          fi
+          echo "New version tag: $TAG_NAME"
+          # Update version in files
           gawk -i inplace -F: -v q=\" -v tag=${TAG_NAME} '/^  "version": / { print $1 FS, q tag q ","; next} { print }' package.json
           gawk -i inplace -F= -v q=\" -v tag=${TAG_NAME} '/^__version__ =/ { print $1 FS, q tag q; next} { print }' redash/__init__.py
           gawk -i inplace -F= -v q=\" -v tag=${TAG_NAME} '/^version =/ { print $1 FS, q tag q; next} { print }' pyproject.toml
           git add package.json redash/__init__.py pyproject.toml
           git commit -m "Snapshot: ${TAG_NAME}"
           git tag ${TAG_NAME}
           git push --atomic origin master refs/tags/${TAG_NAME}
+          # Run the 'preview-image' workflow if run this workflow manually
+          # For more information, please see the: https://docs.github.com/en/actions/security-guides/automatic-token-authentication
+          if [ "$BUMP_INPUT" = "true" ] || [ -n "$SPECIFIC_VERSION" ]; then
+            gh workflow run preview-image.yml --ref $TAG_NAME
+          fi
```

```diff
@@ -3,6 +3,16 @@ on:
   push:
     tags:
       - '*-dev'
+  workflow_dispatch:
+    inputs:
+      dockerRepository:
+        description: 'Docker repository'
+        required: true
+        default: 'preview'
+        type: choice
+        options:
+          - preview
+          - redash
 env:
   NODE_VERSION: 18
@@ -29,7 +39,20 @@ jobs:
           fi
   build-docker-image:
-    runs-on: ubuntu-22.04
+    runs-on: ${{ matrix.os }}
+    strategy:
+      fail-fast: false
+      matrix:
+        arch:
+          - amd64
+          - arm64
+        include:
+          - arch: amd64
+            os: ubuntu-22.04
+          - arch: arm64
+            os: ubuntu-22.04-arm
+    outputs:
+      VERSION_TAG: ${{ steps.version.outputs.VERSION_TAG }}
     needs:
       - build-skip-check
     if: needs.build-skip-check.outputs.skip == 'false'
@@ -44,11 +67,6 @@ jobs:
           node-version: ${{ env.NODE_VERSION }}
           cache: 'yarn'
-      - name: Set up QEMU
-        uses: docker/setup-qemu-action@v3
-        with:
-          platforms: arm64
       - name: Set up Docker Buildx
         uses: docker/setup-buildx-action@v3
@@ -59,6 +77,8 @@ jobs:
           password: ${{ secrets.DOCKER_PASS }}
       - name: Install Dependencies
+        env:
+          PUPPETEER_SKIP_CHROMIUM_DOWNLOAD: true
         run: |
           npm install --global --force yarn@1.22.22
           yarn cache clean && yarn --frozen-lockfile --network-concurrency 1
@@ -71,23 +91,92 @@ jobs:
           VERSION_TAG=$(jq -r .version package.json)
           echo "VERSION_TAG=$VERSION_TAG" >> "$GITHUB_OUTPUT"
-      # TODO: We can use GitHub Actions's matrix option to reduce the build time.
       - name: Build and push preview image to Docker Hub
+        id: build-preview
         uses: docker/build-push-action@v4
+        if: ${{ github.event.inputs.dockerRepository == 'preview' || !github.event.workflow_run }}
         with:
-          push: true
           tags: |
-            redash/redash:preview
-            redash/preview:${{ steps.version.outputs.VERSION_TAG }}
+            ${{ vars.DOCKER_USER }}/redash
+            ${{ vars.DOCKER_USER }}/preview
           context: .
           build-args: |
             test_all_deps=true
-          cache-from: type=gha,scope=multi-platform
-          cache-to: type=gha,mode=max,scope=multi-platform
-          platforms: linux/amd64,linux/arm64
+          outputs: type=image,push-by-digest=true,push=true
+          cache-from: type=gha,scope=${{ matrix.arch }}
+          cache-to: type=gha,mode=max,scope=${{ matrix.arch }}
+        env:
+          DOCKER_CONTENT_TRUST: true
+      - name: Build and push release image to Docker Hub
+        id: build-release
+        uses: docker/build-push-action@v4
+        if: ${{ github.event.inputs.dockerRepository == 'redash' }}
+        with:
+          tags: |
+            ${{ vars.DOCKER_USER }}/redash:${{ steps.version.outputs.VERSION_TAG }}
+          context: .
+          build-args: |
+            test_all_deps=true
+          outputs: type=image,push-by-digest=true,push=true
+          cache-from: type=gha,scope=${{ matrix.arch }}
+          cache-to: type=gha,mode=max,scope=${{ matrix.arch }}
         env:
           DOCKER_CONTENT_TRUST: true
       - name: "Failure: output container logs to console"
         if: failure()
         run: docker compose logs
+      - name: Export digest
+        run: |
+          mkdir -p ${{ runner.temp }}/digests
+          if [[ "${{ github.event.inputs.dockerRepository }}" == 'preview' || !github.event.workflow_run ]]; then
+            digest="${{ steps.build-preview.outputs.digest}}"
+          else
+            digest="${{ steps.build-release.outputs.digest}}"
+          fi
+          touch "${{ runner.temp }}/digests/${digest#sha256:}"
+      - name: Upload digest
+        uses: actions/upload-artifact@v4
+        with:
+          name: digests-${{ matrix.arch }}
+          path: ${{ runner.temp }}/digests/*
+          if-no-files-found: error
+  merge-docker-image:
+    runs-on: ubuntu-22.04
+    needs: build-docker-image
+    steps:
+      - name: Set up Docker Buildx
+        uses: docker/setup-buildx-action@v3
+      - name: Login to DockerHub
+        uses: docker/login-action@v3
+        with:
+          username: ${{ vars.DOCKER_USER }}
+          password: ${{ secrets.DOCKER_PASS }}
+      - name: Download digests
+        uses: actions/download-artifact@v4
+        with:
+          path: ${{ runner.temp }}/digests
+          pattern: digests-*
+          merge-multiple: true
+      - name: Create and push manifest for the preview image
+        if: ${{ github.event.inputs.dockerRepository == 'preview' || !github.event.workflow_run }}
+        working-directory: ${{ runner.temp }}/digests
+        run: |
+          docker buildx imagetools create -t ${{ vars.DOCKER_USER }}/redash:preview \
+            $(printf '${{ vars.DOCKER_USER }}/redash:preview@sha256:%s ' *)
+          docker buildx imagetools create -t ${{ vars.DOCKER_USER }}/preview:${{ needs.build-docker-image.outputs.VERSION_TAG }} \
+            $(printf '${{ vars.DOCKER_USER }}/preview:${{ needs.build-docker-image.outputs.VERSION_TAG }}@sha256:%s ' *)
+      - name: Create and push manifest for the release image
+        if: ${{ github.event.inputs.dockerRepository == 'redash' }}
+        working-directory: ${{ runner.temp }}/digests
+        run: |
+          docker buildx imagetools create -t ${{ vars.DOCKER_USER }}/redash:${{ needs.build-docker-image.outputs.VERSION_TAG }} \
+            $(printf '${{ vars.DOCKER_USER }}/redash:${{ needs.build-docker-image.outputs.VERSION_TAG }}@sha256:%s ' *)
```

.github/workflows/restyled.yml (new file, 36 lines)
```diff
@@ -0,0 +1,36 @@
+name: Restyled
+
+on:
+  pull_request:
+
+concurrency:
+  group: ${{ github.workflow }}-${{ github.ref }}
+  cancel-in-progress: true
+
+jobs:
+  restyled:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+        with:
+          ref: ${{ github.event.pull_request.head.sha }}
+
+      - uses: restyled-io/actions/setup@v4
+      - id: restyler
+        uses: restyled-io/actions/run@v4
+        with:
+          fail-on-differences: true
+
+      - if: |
+          !cancelled() &&
+          steps.restyler.outputs.success == 'true' &&
+          github.event.pull_request.head.repo.full_name == github.repository
+        uses: peter-evans/create-pull-request@v6
+        with:
+          base: ${{ steps.restyler.outputs.restyled-base }}
+          branch: ${{ steps.restyler.outputs.restyled-head }}
+          title: ${{ steps.restyler.outputs.restyled-title }}
+          body: ${{ steps.restyler.outputs.restyled-body }}
+          labels: "restyled"
+          reviewers: ${{ github.event.pull_request.user.login }}
+          delete-branch: true
```

```diff
@@ -27,7 +27,15 @@ RUN if [ "x$skip_frontend_build" = "x" ] ; then yarn --frozen-lockfile --network
 COPY --chown=redash client /frontend/client
 COPY --chown=redash webpack.config.js /frontend/
-RUN if [ "x$skip_frontend_build" = "x" ] ; then yarn build; else mkdir -p /frontend/client/dist && touch /frontend/client/dist/multi_org.html && touch /frontend/client/dist/index.html; fi
+RUN <<EOF
+  if [ "x$skip_frontend_build" = "x" ]; then
+    yarn build
+  else
+    mkdir -p /frontend/client/dist
+    touch /frontend/client/dist/multi_org.html
+    touch /frontend/client/dist/index.html
+  fi
+EOF
 
 FROM python:3.10-slim-bookworm
@@ -67,24 +75,27 @@ RUN apt-get update && \
 ARG TARGETPLATFORM
 ARG databricks_odbc_driver_url=https://databricks-bi-artifacts.s3.us-east-2.amazonaws.com/simbaspark-drivers/odbc/2.6.26/SimbaSparkODBC-2.6.26.1045-Debian-64bit.zip
-RUN if [ "$TARGETPLATFORM" = "linux/amd64" ]; then \
-      curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor -o /usr/share/keyrings/microsoft-prod.gpg \
-      && curl https://packages.microsoft.com/config/debian/12/prod.list > /etc/apt/sources.list.d/mssql-release.list \
-      && apt-get update \
-      && ACCEPT_EULA=Y apt-get install -y --no-install-recommends msodbcsql18 \
-      && apt-get clean \
-      && rm -rf /var/lib/apt/lists/* \
-      && curl "$databricks_odbc_driver_url" --location --output /tmp/simba_odbc.zip \
-      && chmod 600 /tmp/simba_odbc.zip \
-      && unzip /tmp/simba_odbc.zip -d /tmp/simba \
-      && dpkg -i /tmp/simba/*.deb \
-      && printf "[Simba]\nDriver = /opt/simba/spark/lib/64/libsparkodbc_sb64.so" >> /etc/odbcinst.ini \
-      && rm /tmp/simba_odbc.zip \
-      && rm -rf /tmp/simba; fi
+RUN <<EOF
+  if [ "$TARGETPLATFORM" = "linux/amd64" ]; then
+    curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor -o /usr/share/keyrings/microsoft-prod.gpg
+    curl https://packages.microsoft.com/config/debian/12/prod.list > /etc/apt/sources.list.d/mssql-release.list
+    apt-get update
+    ACCEPT_EULA=Y apt-get install -y --no-install-recommends msodbcsql18
+    apt-get clean
+    rm -rf /var/lib/apt/lists/*
+    curl "$databricks_odbc_driver_url" --location --output /tmp/simba_odbc.zip
+    chmod 600 /tmp/simba_odbc.zip
+    unzip /tmp/simba_odbc.zip -d /tmp/simba
+    dpkg -i /tmp/simba/*.deb
+    printf "[Simba]\nDriver = /opt/simba/spark/lib/64/libsparkodbc_sb64.so" >> /etc/odbcinst.ini
+    rm /tmp/simba_odbc.zip
+    rm -rf /tmp/simba
+  fi
+EOF
 
 WORKDIR /app
 
-ENV POETRY_VERSION=1.6.1
+ENV POETRY_VERSION=1.8.3
 ENV POETRY_HOME=/etc/poetry
 ENV POETRY_VIRTUALENVS_CREATE=false
 RUN curl -sSL https://install.python-poetry.org | python3 -
```

```diff
@@ -4,7 +4,11 @@ compose_build: .env
 	COMPOSE_DOCKER_CLI_BUILD=1 DOCKER_BUILDKIT=1 docker compose build
 
 up:
-	COMPOSE_DOCKER_CLI_BUILD=1 DOCKER_BUILDKIT=1 docker compose up -d --build
+	docker compose up -d redis postgres --remove-orphans
+	docker compose exec -u postgres postgres psql postgres --csv \
+		-1tqc "SELECT table_name FROM information_schema.tables WHERE table_name = 'organizations'" 2> /dev/null \
+		| grep -q "organizations" || make create_database
+	COMPOSE_DOCKER_CLI_BUILD=1 DOCKER_BUILDKIT=1 docker compose up -d --build --remove-orphans
 
 test_db:
 	@for i in `seq 1 5`; do \
@@ -30,7 +34,7 @@ clean:
 clean-all: clean
 	docker image rm --force \
-		redash/redash:10.1.0.b50633 redis:7-alpine maildev/maildev:latest \
+		redash/redash:latest redis:7-alpine maildev/maildev:latest \
 		pgautoupgrade/pgautoupgrade:15-alpine3.8 pgautoupgrade/pgautoupgrade:latest
 
 down:
```


@@ -67,7 +67,7 @@ help() {
   echo ""
   echo "shell -- open shell"
   echo "dev_server -- start Flask development server with debugger and auto reload"
-  echo "debug -- start Flask development server with remote debugger via ptvsd"
+  echo "debug -- start Flask development server with remote debugger via debugpy"
   echo "create_db -- create database tables"
   echo "manage -- CLI to manage redash"
   echo "tests -- run tests"


@@ -1,5 +1,6 @@
 import React from "react";
-import { clientConfig } from "@/services/auth";
+import Link from "@/components/Link";
+import { clientConfig, currentUser } from "@/services/auth";
 import frontendVersion from "@/version.json";
 export default function VersionInfo() {
@@ -9,6 +10,15 @@ export default function VersionInfo() {
       Version: {clientConfig.version}
       {frontendVersion !== clientConfig.version && ` (${frontendVersion.substring(0, 8)})`}
     </div>
+    {clientConfig.newVersionAvailable && currentUser.hasPermission("super_admin") && (
+      <div className="m-t-10">
+        {/* eslint-disable react/jsx-no-target-blank */}
+        <Link href="https://version.redash.io/" className="update-available" target="_blank" rel="noopener">
+          Update Available <i className="fa fa-external-link m-l-5" aria-hidden="true" />
+          <span className="sr-only">(opens in a new tab)</span>
+        </Link>
+      </div>
+    )}
   </React.Fragment>
 );
 }


@@ -0,0 +1,79 @@
import React, { useState } from "react";
import Card from "antd/lib/card";
import Button from "antd/lib/button";
import Typography from "antd/lib/typography";
import { clientConfig } from "@/services/auth";
import Link from "@/components/Link";
import HelpTrigger from "@/components/HelpTrigger";
import DynamicComponent from "@/components/DynamicComponent";
import OrgSettings from "@/services/organizationSettings";

const Text = Typography.Text;

function BeaconConsent() {
  const [hide, setHide] = useState(false);

  if (!clientConfig.showBeaconConsentMessage || hide) {
    return null;
  }

  const hideConsentCard = () => {
    clientConfig.showBeaconConsentMessage = false;
    setHide(true);
  };

  const confirmConsent = (confirm) => {
    let message = "🙏 Thank you.";

    if (!confirm) {
      message = "Settings Saved.";
    }

    OrgSettings.save({ beacon_consent: confirm }, message)
      // .then(() => {
      //   // const settings = get(response, 'settings');
      //   // this.setState({ settings, formValues: { ...settings } });
      // })
      .finally(hideConsentCard);
  };

  return (
    <DynamicComponent name="BeaconConsent">
      <div className="m-t-10 tiled">
        <Card
          title={
            <>
              Would you be ok with sharing anonymous usage data with the Redash team?{" "}
              <HelpTrigger type="USAGE_DATA_SHARING" />
            </>
          }
          bordered={false}
        >
          <Text>Help Redash improve by automatically sending anonymous usage data:</Text>
          <div className="m-t-5">
            <ul>
              <li> Number of users, queries, dashboards, alerts, widgets and visualizations.</li>
              <li> Types of data sources, alert destinations and visualizations.</li>
            </ul>
          </div>
          <Text>All data is aggregated and will never include any sensitive or private data.</Text>
          <div className="m-t-5">
            <Button type="primary" className="m-r-5" onClick={() => confirmConsent(true)}>
              Yes
            </Button>
            <Button type="default" onClick={() => confirmConsent(false)}>
              No
            </Button>
          </div>
          <div className="m-t-15">
            <Text type="secondary">
              You can change this setting anytime from the <Link href="settings/general">Settings</Link> page.
            </Text>
          </div>
        </Card>
      </div>
    </DynamicComponent>
  );
}

export default BeaconConsent;


@@ -23,6 +23,7 @@ export const TYPES = mapValues(
   VALUE_SOURCE_OPTIONS: ["/user-guide/querying/query-parameters#Value-Source-Options", "Guide: Value Source Options"],
   SHARE_DASHBOARD: ["/user-guide/dashboards/sharing-dashboards", "Guide: Sharing and Embedding Dashboards"],
   AUTHENTICATION_OPTIONS: ["/user-guide/users/authentication-options", "Guide: Authentication Options"],
+  USAGE_DATA_SHARING: ["/open-source/admin-guide/usage-data", "Help: Anonymous Usage Data Sharing"],
   DS_ATHENA: ["/data-sources/amazon-athena-setup", "Guide: Help Setting up Amazon Athena"],
   DS_BIGQUERY: ["/data-sources/bigquery-setup", "Guide: Help Setting up BigQuery"],
   DS_URL: ["/data-sources/querying-urls", "Guide: Help Setting up URL"],
@@ -100,7 +101,7 @@ export function helpTriggerWithTypes(types, allowedDomains = [], drawerClassName
     clearTimeout(this.iframeLoadingTimeout);
   }
-  loadIframe = url => {
+  loadIframe = (url) => {
     clearTimeout(this.iframeLoadingTimeout);
     this.setState({ loading: true, error: false });
@@ -115,8 +116,8 @@ export function helpTriggerWithTypes(types, allowedDomains = [], drawerClassName
     clearTimeout(this.iframeLoadingTimeout);
   };
-  onPostMessageReceived = event => {
-    if (!some(allowedDomains, domain => startsWith(event.origin, domain))) {
+  onPostMessageReceived = (event) => {
+    if (!some(allowedDomains, (domain) => startsWith(event.origin, domain))) {
       return;
     }
@@ -133,7 +134,7 @@ export function helpTriggerWithTypes(types, allowedDomains = [], drawerClassName
     return helpTriggerType ? helpTriggerType[0] : this.props.href;
   };
-  openDrawer = e => {
+  openDrawer = (e) => {
     // keep "open in new tab" behavior
     if (!e.shiftKey && !e.ctrlKey && !e.metaKey) {
       e.preventDefault();
@@ -143,7 +144,7 @@ export function helpTriggerWithTypes(types, allowedDomains = [], drawerClassName
     }
   };
-  closeDrawer = event => {
+  closeDrawer = (event) => {
     if (event) {
       event.preventDefault();
     }
@@ -160,7 +161,7 @@ export function helpTriggerWithTypes(types, allowedDomains = [], drawerClassName
     const tooltip = get(types, `${this.props.type}[1]`, this.props.title);
     const className = cx("help-trigger", this.props.className);
     const url = this.state.currentUrl;
-    const isAllowedDomain = some(allowedDomains, domain => startsWith(url || targetUrl, domain));
+    const isAllowedDomain = some(allowedDomains, (domain) => startsWith(url || targetUrl, domain));
     const shouldRenderAsLink = this.props.renderAsLink || !isAllowedDomain;
     return (
@@ -179,13 +180,15 @@ export function helpTriggerWithTypes(types, allowedDomains = [], drawerClassName
             )}
           </>
         ) : null
-        }>
+        }
+      >
         <Link
           href={url || this.getUrl()}
           className={className}
           rel="noopener noreferrer"
           target="_blank"
-          onClick={shouldRenderAsLink ? () => {} : this.openDrawer}>
+          onClick={shouldRenderAsLink ? () => {} : this.openDrawer}
+        >
           {this.props.children}
         </Link>
       </Tooltip>
@@ -196,7 +199,8 @@ export function helpTriggerWithTypes(types, allowedDomains = [], drawerClassName
         visible={this.state.visible}
         className={cx("help-drawer", drawerClassName)}
         destroyOnClose
-        width={400}>
+        width={400}
+      >
         <div className="drawer-wrapper">
           <div className="drawer-menu">
             {url && (


@@ -69,7 +69,7 @@ UserPreviewCard.defaultProps = {
 // DataSourcePreviewCard
 export function DataSourcePreviewCard({ dataSource, withLink, children, ...props }) {
-  const imageUrl = `static/images/db-logos/${dataSource.type}.png`;
+  const imageUrl = `/static/images/db-logos/${dataSource.type}.png`;
   const title = withLink ? <Link href={"data_sources/" + dataSource.id}>{dataSource.name}</Link> : dataSource.name;
   return (
     <PreviewCard {...props} imageUrl={imageUrl} title={title}>


@@ -96,7 +96,7 @@ function EmptyState({
   }, []);
   // Show if `onboardingMode=false` or any requested step not completed
-  const shouldShow = !onboardingMode || some(keys(isAvailable), step => isAvailable[step] && !isCompleted[step]);
+  const shouldShow = !onboardingMode || some(keys(isAvailable), (step) => isAvailable[step] && !isCompleted[step]);
   if (!shouldShow) {
     return null;
@@ -181,7 +181,7 @@ function EmptyState({
   ];
   const stepsItems = getStepsItems ? getStepsItems(defaultStepsItems) : defaultStepsItems;
-  const imageSource = illustrationPath ? illustrationPath : "static/images/illustrations/" + illustration + ".svg";
+  const imageSource = illustrationPath ? illustrationPath : "/static/images/illustrations/" + illustration + ".svg";
   return (
     <div className="empty-state-wrapper">
@@ -196,7 +196,7 @@ function EmptyState({
       </div>
       <div className="empty-state__steps">
         <h4>Let&apos;s get started</h4>
-        <ol>{stepsItems.map(item => item.node)}</ol>
+        <ol>{stepsItems.map((item) => item.node)}</ol>
         {helpMessage}
       </div>
     </div>


@@ -67,7 +67,9 @@ export default function Criteria({ columnNames, resultValues, alertOptions, onCh
   <small className="alert-criteria-hint">
     Max column value is{" "}
     <code className="p-0">
-      {toString(Math.max(...resultValues.map((o) => o[alertOptions.column]))) || "unknown"}
+      {toString(
+        Math.max(...resultValues.map((o) => Number(o[alertOptions.column])).filter((value) => !isNaN(value)))
+      ) || "unknown"}
     </code>
   </small>
 );
@@ -76,7 +78,9 @@ export default function Criteria({ columnNames, resultValues, alertOptions, onCh
   <small className="alert-criteria-hint">
     Min column value is{" "}
     <code className="p-0">
-      {toString(Math.min(...resultValues.map((o) => o[alertOptions.column]))) || "unknown"}
+      {toString(
+        Math.min(...resultValues.map((o) => Number(o[alertOptions.column])).filter((value) => !isNaN(value)))
+      ) || "unknown"}
     </code>
   </small>
 );
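The two hunks above guard the alert-criteria hint against non-numeric result values. A minimal sketch of why the unfiltered `Math.max` breaks (the `rows` array here is invented for illustration):

```javascript
// Hypothetical result rows mixing numeric strings and text, as a query column can.
const rows = [{ value: "12" }, { value: "oops" }, { value: "3" }];

// Old behavior: any non-numeric entry coerces to NaN and poisons Math.max.
const naive = Math.max(...rows.map((o) => o.value));

// Patched behavior: coerce with Number() and drop NaN before comparing.
const filtered = Math.max(...rows.map((o) => Number(o.value)).filter((v) => !isNaN(v)));

console.log(naive, filtered); // NaN 12
```

The same filter applies symmetrically to the `Math.min` hint.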


@@ -6,6 +6,7 @@ import Link from "@/components/Link";
 import routeWithUserSession from "@/components/ApplicationArea/routeWithUserSession";
 import EmptyState, { EmptyStateHelpMessage } from "@/components/empty-state/EmptyState";
 import DynamicComponent from "@/components/DynamicComponent";
+import BeaconConsent from "@/components/BeaconConsent";
 import PlainButton from "@/components/PlainButton";
 import { axios } from "@/services/axios";
@@ -30,7 +31,8 @@ function DeprecatedEmbedFeatureAlert() {
       <Link
         href="https://discuss.redash.io/t/support-for-parameters-in-embedded-visualizations/3337"
         target="_blank"
-        rel="noopener noreferrer">
+        rel="noopener noreferrer"
+      >
         Read more
       </Link>
       .
@@ -42,7 +44,7 @@ function DeprecatedEmbedFeatureAlert() {
 function EmailNotVerifiedAlert() {
   const verifyEmail = () => {
-    axios.post("verification_email/").then(data => {
+    axios.post("verification_email/").then((data) => {
       notification.success(data.message);
     });
   };
@@ -88,6 +90,7 @@ export default function Home() {
         </DynamicComponent>
         <DynamicComponent name="HomeExtra" />
         <DashboardAndQueryFavoritesList />
+        <BeaconConsent />
       </div>
     </div>
   );
@@ -98,6 +101,6 @@ export default function Home() {
   routeWithUserSession({
     path: "/",
     title: "Redash",
-    render: pageProps => <Home {...pageProps} />,
+    render: (pageProps) => <Home {...pageProps} />,
   })
 );


@@ -9,6 +9,7 @@ import QueryControlDropdown from "@/components/EditVisualizationButton/QueryCont
 import EditVisualizationButton from "@/components/EditVisualizationButton";
 import useQueryResultData from "@/lib/useQueryResultData";
 import { durationHumanize, pluralize, prettySize } from "@/lib/utils";
+import { isUndefined } from "lodash";
 import "./QueryExecutionMetadata.less";
@@ -51,7 +52,8 @@ export default function QueryExecutionMetadata({
               "Result truncated to " +
               queryResultData.rows.length +
               " rows. Databricks may truncate query results that are unstably large."
-            }>
+            }
+          >
             <WarningTwoTone twoToneColor="#FF9800" />
           </Tooltip>
         </span>
@@ -67,10 +69,9 @@ export default function QueryExecutionMetadata({
         )}
         {isQueryExecuting && <span>Running&hellip;</span>}
       </span>
-      {queryResultData.metadata.data_scanned && (
+      {!isUndefined(queryResultData.metadata.data_scanned) && !isQueryExecuting && (
         <span className="m-l-5">
-          Data Scanned
-          <strong>{prettySize(queryResultData.metadata.data_scanned)}</strong>
+          Data Scanned <strong>{prettySize(queryResultData.metadata.data_scanned)}</strong>
         </span>
       )}
     </span>


@@ -2,7 +2,7 @@ import PropTypes from "prop-types";
 import React from "react";
 export function QuerySourceTypeIcon(props) {
-  return <img src={`static/images/db-logos/${props.type}.png`} width="20" alt={props.alt} />;
+  return <img src={`/static/images/db-logos/${props.type}.png`} width="20" alt={props.alt} />;
 }
 QuerySourceTypeIcon.propTypes = {


@@ -18,7 +18,7 @@ function EmptyState({ title, message, refreshButton }) {
   <div className="query-results-empty-state">
     <div className="empty-state-content">
       <div>
-        <img src="static/images/illustrations/no-query-results.svg" alt="No Query Results Illustration" />
+        <img src="/static/images/illustrations/no-query-results.svg" alt="No Query Results Illustration" />
       </div>
       <h3>{title}</h3>
       <div className="m-b-20">{message}</div>
@@ -40,7 +40,7 @@ EmptyState.defaultProps = {
 function TabWithDeleteButton({ visualizationName, canDelete, onDelete, ...props }) {
   const handleDelete = useCallback(
-    e => {
+    (e) => {
       e.stopPropagation();
       Modal.confirm({
         title: "Delete Visualization",
@@ -111,7 +111,8 @@ export default function QueryVisualizationTabs({
         className="add-visualization-button"
         data-test="NewVisualization"
         type="link"
-        onClick={() => onAddVisualization()}>
+        onClick={() => onAddVisualization()}
+      >
         <i className="fa fa-plus" aria-hidden="true" />
         <span className="m-l-5 hidden-xs">Add Visualization</span>
       </Button>
@@ -119,7 +120,7 @@ export default function QueryVisualizationTabs({
   }
   const orderedVisualizations = useMemo(() => orderBy(visualizations, ["id"]), [visualizations]);
-  const isFirstVisualization = useCallback(visId => visId === orderedVisualizations[0].id, [orderedVisualizations]);
+  const isFirstVisualization = useCallback((visId) => visId === orderedVisualizations[0].id, [orderedVisualizations]);
   const isMobile = useMedia({ maxWidth: 768 });
   const [filters, setFilters] = useState([]);
@@ -132,9 +133,10 @@ export default function QueryVisualizationTabs({
       data-test="QueryPageVisualizationTabs"
       animated={false}
       tabBarGutter={0}
-      onChange={activeKey => onChangeTab(+activeKey)}
-      destroyInactiveTabPane>
-      {orderedVisualizations.map(visualization => (
+      onChange={(activeKey) => onChangeTab(+activeKey)}
+      destroyInactiveTabPane
+    >
+      {orderedVisualizations.map((visualization) => (
         <TabPane
           key={`${visualization.id}`}
           tab={
@@ -144,7 +146,8 @@ export default function QueryVisualizationTabs({
             visualizationName={visualization.name}
             onDelete={() => onDeleteVisualization(visualization.id)}
           />
-          }>
+          }
+        >
           {queryResult ? (
             <VisualizationRenderer
               visualization={visualization}


@@ -1,16 +1,11 @@
 import { useCallback, useMemo, useState } from "react";
-import { reduce } from "lodash";
 import localOptions from "@/lib/localOptions";
-function calculateTokensCount(schema) {
-  return reduce(schema, (totalLength, table) => totalLength + table.columns.length, 0);
-}
 export default function useAutocompleteFlags(schema) {
-  const isAvailable = useMemo(() => calculateTokensCount(schema) <= 5000, [schema]);
+  const isAvailable = true;
   const [isEnabled, setIsEnabled] = useState(localOptions.get("liveAutocomplete", true));
-  const toggleAutocomplete = useCallback(state => {
+  const toggleAutocomplete = useCallback((state) => {
     setIsEnabled(state);
     localOptions.set("liveAutocomplete", state);
   }, []);
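For context, the gate removed above counted schema columns and disabled live autocomplete past a threshold. A sketch of that old logic (the tiny `schema` array is invented; the real hook used lodash's `reduce`, which behaves the same on arrays):

```javascript
// Each table contributes its column count; the old hook compared the total to 5000.
const schema = [{ columns: ["id", "name"] }, { columns: ["a", "b", "c"] }];

const tokens = schema.reduce((total, table) => total + table.columns.length, 0);
const wasAvailable = tokens <= 5000; // old behavior; the patch makes this always true

console.log(tokens, wasAvailable); // 5 true
```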


@@ -0,0 +1,40 @@
import React from "react";
import Form from "antd/lib/form";
import Checkbox from "antd/lib/checkbox";
import Skeleton from "antd/lib/skeleton";
import HelpTrigger from "@/components/HelpTrigger";
import DynamicComponent from "@/components/DynamicComponent";
import { SettingsEditorPropTypes, SettingsEditorDefaultProps } from "../prop-types";

export default function BeaconConsentSettings(props) {
  const { values, onChange, loading } = props;
  return (
    <DynamicComponent name="OrganizationSettings.BeaconConsentSettings" {...props}>
      <Form.Item
        label={
          <span>
            Anonymous Usage Data Sharing
            <HelpTrigger className="m-l-5 m-r-5" type="USAGE_DATA_SHARING" />
          </span>
        }
      >
        {loading ? (
          <Skeleton title={{ width: 300 }} paragraph={false} active />
        ) : (
          <Checkbox
            name="beacon_consent"
            checked={values.beacon_consent}
            onChange={(e) => onChange({ beacon_consent: e.target.checked })}
          >
            Help Redash improve by automatically sending anonymous usage data
          </Checkbox>
        )}
      </Form.Item>
    </DynamicComponent>
  );
}

BeaconConsentSettings.propTypes = SettingsEditorPropTypes;
BeaconConsentSettings.defaultProps = SettingsEditorDefaultProps;


@@ -4,6 +4,7 @@ import DynamicComponent from "@/components/DynamicComponent";
 import FormatSettings from "./FormatSettings";
 import PlotlySettings from "./PlotlySettings";
 import FeatureFlagsSettings from "./FeatureFlagsSettings";
+import BeaconConsentSettings from "./BeaconConsentSettings";
 export default function GeneralSettings(props) {
   return (
@@ -13,6 +14,7 @@ export default function GeneralSettings(props) {
       <FormatSettings {...props} />
       <PlotlySettings {...props} />
       <FeatureFlagsSettings {...props} />
+      <BeaconConsentSettings {...props} />
     </DynamicComponent>
   );
 }


@@ -4,19 +4,19 @@ import { fetchDataFromJob } from "@/services/query-result";
 export const SCHEMA_NOT_SUPPORTED = 1;
 export const SCHEMA_LOAD_ERROR = 2;
-export const IMG_ROOT = "static/images/db-logos";
+export const IMG_ROOT = "/static/images/db-logos";
 function mapSchemaColumnsToObject(columns) {
-  return map(columns, column => (isObject(column) ? column : { name: column }));
+  return map(columns, (column) => (isObject(column) ? column : { name: column }));
 }
 const DataSource = {
   query: () => axios.get("api/data_sources"),
   get: ({ id }) => axios.get(`api/data_sources/${id}`),
   types: () => axios.get("api/data_sources/types"),
-  create: data => axios.post(`api/data_sources`, data),
-  save: data => axios.post(`api/data_sources/${data.id}`, data),
-  test: data => axios.post(`api/data_sources/${data.id}/test`),
+  create: (data) => axios.post(`api/data_sources`, data),
+  save: (data) => axios.post(`api/data_sources/${data.id}`, data),
+  test: (data) => axios.post(`api/data_sources/${data.id}/test`),
   delete: ({ id }) => axios.delete(`api/data_sources/${id}`),
   fetchSchema: (data, refresh = false) => {
     const params = {};
@@ -27,15 +27,15 @@ const DataSource = {
     return axios
       .get(`api/data_sources/${data.id}/schema`, { params })
-      .then(data => {
+      .then((data) => {
         if (has(data, "job")) {
-          return fetchDataFromJob(data.job.id).catch(error =>
+          return fetchDataFromJob(data.job.id).catch((error) =>
             error.code === SCHEMA_NOT_SUPPORTED ? [] : Promise.reject(new Error(data.job.error))
           );
         }
         return has(data, "schema") ? data.schema : Promise.reject();
       })
-      .then(tables => map(tables, table => ({ ...table, columns: mapSchemaColumnsToObject(table.columns) })));
+      .then((tables) => map(tables, (table) => ({ ...table, columns: mapSchemaColumnsToObject(table.columns) })));
   },
 };


@@ -63,7 +63,7 @@ function runCypressCI() {
     CYPRESS_OPTIONS, // eslint-disable-line no-unused-vars
   } = process.env;
-  if (GITHUB_REPOSITORY === "getredash/redash") {
+  if (GITHUB_REPOSITORY === "getredash/redash" && process.env.CYPRESS_RECORD_KEY) {
     process.env.CYPRESS_OPTIONS = "--record";
   }


@@ -26,33 +26,33 @@ const SQL = `
 describe("Chart", () => {
   beforeEach(() => {
     cy.login();
-    cy.createQuery({ name: "Chart Visualization", query: SQL })
-      .its("id")
-      .as("queryId");
+    cy.createQuery({ name: "Chart Visualization", query: SQL }).its("id").as("queryId");
   });
-  it("creates Bar charts", function() {
+  it("creates Bar charts", function () {
     cy.visit(`queries/${this.queryId}/source`);
     cy.getByTestId("ExecuteButton").click();
-    const getBarChartAssertionFunction = (specificBarChartAssertionFn = () => {}) => () => {
-      // checks for TabbedEditor standard tabs
-      assertTabbedEditor();
+    const getBarChartAssertionFunction =
+      (specificBarChartAssertionFn = () => {}) =>
+      () => {
+        // checks for TabbedEditor standard tabs
+        assertTabbedEditor();
       // standard chart should be bar
       cy.getByTestId("Chart.GlobalSeriesType").contains(".ant-select-selection-item", "Bar");
       // checks the plot canvas exists and is empty
       assertPlotPreview("not.exist");
       // creates a chart and checks it is plotted
       cy.getByTestId("Chart.ColumnMapping.x").selectAntdOption("Chart.ColumnMapping.x.stage");
       cy.getByTestId("Chart.ColumnMapping.y").selectAntdOption("Chart.ColumnMapping.y.value1");
       cy.getByTestId("Chart.ColumnMapping.y").selectAntdOption("Chart.ColumnMapping.y.value2");
       assertPlotPreview("exist");
       specificBarChartAssertionFn();
     };
     const chartTests = [
       {
@@ -95,8 +95,8 @@ describe("Chart", () => {
     const withDashboardWidgetsAssertionFn = (widgetGetters, dashboardUrl) => {
       cy.visit(dashboardUrl);
-      widgetGetters.forEach(widgetGetter => {
-        cy.get(`@${widgetGetter}`).then(widget => {
+      widgetGetters.forEach((widgetGetter) => {
+        cy.get(`@${widgetGetter}`).then((widget) => {
           cy.getByTestId(getWidgetTestId(widget)).within(() => {
             cy.get("g.points").should("exist");
           });
@@ -107,4 +107,34 @@ describe("Chart", () => {
     createDashboardWithCharts("Bar chart visualizations", chartGetters, withDashboardWidgetsAssertionFn);
     cy.percySnapshot("Visualizations - Charts - Bar");
   });
+  it("colors Bar charts", function () {
+    cy.visit(`queries/${this.queryId}/source`);
+    cy.getByTestId("ExecuteButton").click();
+    cy.getByTestId("NewVisualization").click();
+    cy.getByTestId("Chart.ColumnMapping.x").selectAntdOption("Chart.ColumnMapping.x.stage");
+    cy.getByTestId("Chart.ColumnMapping.y").selectAntdOption("Chart.ColumnMapping.y.value1");
+    cy.getByTestId("VisualizationEditor.Tabs.Colors").click();
+    cy.getByTestId("ColorScheme").click();
+    cy.getByTestId("ColorOptionViridis").click();
+    cy.getByTestId("ColorScheme").click();
+    cy.getByTestId("ColorOptionTableau 10").click();
+    cy.getByTestId("ColorScheme").click();
+    cy.getByTestId("ColorOptionD3 Category 10").click();
+  });
+  it("colors Pie charts", function () {
+    cy.visit(`queries/${this.queryId}/source`);
+    cy.getByTestId("ExecuteButton").click();
+    cy.getByTestId("NewVisualization").click();
+    cy.getByTestId("Chart.GlobalSeriesType").click();
+    cy.getByTestId("Chart.ChartType.pie").click();
+    cy.getByTestId("Chart.ColumnMapping.x").selectAntdOption("Chart.ColumnMapping.x.stage");
+    cy.getByTestId("Chart.ColumnMapping.y").selectAntdOption("Chart.ColumnMapping.y.value1");
+    cy.getByTestId("VisualizationEditor.Tabs.Colors").click();
+    cy.getByTestId("ColorScheme").click();
+    cy.getByTestId("ColorOptionViridis").click();
+    cy.getByTestId("ColorScheme").click();
+    cy.getByTestId("ColorOptionTableau 10").click();
+    cy.getByTestId("ColorScheme").click();
+    cy.getByTestId("ColorOptionD3 Category 10").click();
+  });
 });


@@ -1,33 +0,0 @@
const loremIpsum =
"Lorem ipsum dolor sit amet consectetur adipiscing elit" +
"sed do eiusmod tempor incididunt ut labore et dolore magna aliqua";
export const query = `
SELECT '${loremIpsum}' AS a, '${loremIpsum}' AS b, '${loremIpsum}' AS c, '${loremIpsum}' AS d, '${loremIpsum}' as e
`;
export const config = {
itemsPerPage: 10,
columns: [
{
name: "a",
displayAs: "string",
},
{
name: "b",
displayAs: "string",
},
{
name: "c",
displayAs: "string",
},
{
name: "d",
displayAs: "string",
},
{
name: "e",
displayAs: "string",
}
]
}


@@ -8,7 +8,6 @@ import * as AllCellTypes from "./.mocks/all-cell-types";
 import * as MultiColumnSort from "./.mocks/multi-column-sort";
 import * as SearchInData from "./.mocks/search-in-data";
 import * as LargeDataset from "./.mocks/large-dataset";
-import * as WideDataSet from "./.mocks/wide-dataset";
 function prepareVisualization(query, type, name, options) {
   return cy
@@ -23,10 +22,7 @@ function prepareVisualization(query, type, name, options) {
     cy.get("body").type("{alt}D");
     // do some pre-checks here to ensure that visualization was created and is visible
-    cy.getByTestId("TableVisualization")
-      .should("exist")
-      .find("table")
-      .should("exist");
+    cy.getByTestId("TableVisualization").should("exist").find("table").should("exist");
     return cy.then(() => ({ queryId, visualizationId }));
   });
@@ -54,7 +50,7 @@ describe("Table", () => {
   });
   describe("Sorting data", () => {
-    beforeEach(function() {
+    beforeEach(function () {
       const { query, config } = MultiColumnSort;
       prepareVisualization(query, "TABLE", "Sort data", config).then(({ queryId, visualizationId }) => {
         this.queryId = queryId;
@@ -62,94 +58,30 @@ describe("Table", () => {
       });
     });
-    it("sorts data by a single column", function() {
+    it("sorts data by a single column", function () {
-      cy.getByTestId("TableVisualization")
-        .find("table th")
-        .contains("c")
-        .should("exist")
-        .click();
+      cy.getByTestId("TableVisualization").find("table th").contains("c").should("exist").click();
       cy.percySnapshot("Visualizations - Table (Single-column sort)", { widths: [viewportWidth] });
     });
-    it("sorts data by a multiple columns", function() {
+    it("sorts data by a multiple columns", function () {
-      cy.getByTestId("TableVisualization")
-        .find("table th")
-        .contains("a")
-        .should("exist")
-        .click();
+      cy.getByTestId("TableVisualization").find("table th").contains("a").should("exist").click();
       cy.get("body").type("{shift}", { release: false });
-      cy.getByTestId("TableVisualization")
-        .find("table th")
-        .contains("b")
-        .should("exist")
-        .click();
+      cy.getByTestId("TableVisualization").find("table th").contains("b").should("exist").click();
       cy.percySnapshot("Visualizations - Table (Multi-column sort)", { widths: [viewportWidth] });
     });
-    it("sorts data in reverse order", function() {
+    it("sorts data in reverse order", function () {
-      cy.getByTestId("TableVisualization")
-        .find("table th")
-        .contains("c")
-        .should("exist")
-        .click()
-        .click();
+      cy.getByTestId("TableVisualization").find("table th").contains("c").should("exist").click().click();
       cy.percySnapshot("Visualizations - Table (Single-column reverse sort)", { widths: [viewportWidth] });
     });
   });
-  describe("Fixing columns", () => {
-    it("fixes the correct number of columns", () => {
-      const { query, config } = WideDataSet;
-      prepareVisualization(query, "TABLE", "All cell types", config);
-      cy.getByTestId("EditVisualization").click();
-      cy.contains("span", "Grid").click();
-      cy.getByTestId("FixedColumns").click();
-      cy.contains(".ant-select-item-option-content", "1").click();
-      cy.contains("Save").click();
-      // eslint-disable-next-line cypress/no-unnecessary-waiting
-      cy.wait(500); //add some waiting to make sure table visualization is saved
-      cy.get(".ant-table-thead")
-        .find("th.ant-table-cell-fix-left")
-        .then(fixedCols => {
-          expect(fixedCols.length).to.equal(1);
-        });
-      cy.get(".ant-table-content").scrollTo("right", { duration: 1000 });
-      cy.get(".ant-table-content").scrollTo("left", { duration: 1000 });
-    });
-    it("doesn't let user fix too many columns", () => {
-      const { query, config } = MultiColumnSort;
-      prepareVisualization(query, "TABLE", "Test data", config);
-      cy.getByTestId("EditVisualization").click();
-      cy.contains("span", "Grid").click();
-      cy.getByTestId("FixedColumns").click();
-      cy.get(".ant-select-item-option-content");
-      cy.contains(".ant-select-item-option-content", "3").should("not.exist");
-      cy.contains(".ant-select-item-option-content", "4").should("not.exist");
-    });
-    it("doesn't cause issues when freezing column off of page", () => {
-      const { query, config } = WideDataSet;
-      prepareVisualization(query, "TABLE", "Test data", config);
-      cy.getByTestId("EditVisualization").click();
-      cy.contains("span", "Grid").click();
-      cy.getByTestId("FixedColumns").click();
-      cy.contains(".ant-select-item-option-content", "4").click();
-      cy.contains("Save").click();
-    });
-  });
   it("searches in multiple columns", () => {
     const { query, config } = SearchInData;
     prepareVisualization(query, "TABLE", "Search", config).then(({ visualizationId }) => {
-      cy.getByTestId("TableVisualization")
-        .find("table input")
-        .should("exist")
-        .type("test");
+      cy.getByTestId("TableVisualization").find("table input").should("exist").type("test");
       cy.percySnapshot("Visualizations - Table (Search in data)", { widths: [viewportWidth] });
     });
   });


@@ -3,36 +3,26 @@
  * @param should Passed to should expression after plot points are captured
  */
 export function assertPlotPreview(should = "exist") {
-  cy.getByTestId("VisualizationPreview")
-    .find("g.plot")
-    .should("exist")
-    .find("g.points")
-    .should(should);
+  cy.getByTestId("VisualizationPreview").find("g.overplot").should("exist").find("g.points").should(should);
 }
 export function createChartThroughUI(chartName, chartSpecificAssertionFn = () => {}) {
   cy.getByTestId("NewVisualization").click();
   cy.getByTestId("VisualizationType").selectAntdOption("VisualizationType.CHART");
-  cy.getByTestId("VisualizationName")
-    .clear()
-    .type(chartName);
+  cy.getByTestId("VisualizationName").clear().type(chartName);
   chartSpecificAssertionFn();
   cy.server();
   cy.route("POST", "**/api/visualizations").as("SaveVisualization");
-  cy.getByTestId("EditVisualizationDialog")
-    .contains("button", "Save")
-    .click();
+  cy.getByTestId("EditVisualizationDialog").contains("button", "Save").click();
-  cy.getByTestId("QueryPageVisualizationTabs")
-    .contains("span", chartName)
-    .should("exist");
+  cy.getByTestId("QueryPageVisualizationTabs").contains("span", chartName).should("exist");
   cy.wait("@SaveVisualization").should("have.property", "status", 200);
-  return cy.get("@SaveVisualization").then(xhr => {
+  return cy.get("@SaveVisualization").then((xhr) => {
     const { id, name, options } = xhr.response.body;
     return cy.wrap({ id, name, options });
   });
@@ -42,19 +32,13 @@ export function assertTabbedEditor(chartSpecificTabbedEditorAssertionFn = () =>
   cy.getByTestId("Chart.GlobalSeriesType").should("exist");
   cy.getByTestId("VisualizationEditor.Tabs.Series").click();
-  cy.getByTestId("VisualizationEditor")
-    .find("table")
-    .should("exist");
+  cy.getByTestId("VisualizationEditor").find("table").should("exist");
   cy.getByTestId("VisualizationEditor.Tabs.Colors").click();
-  cy.getByTestId("VisualizationEditor")
-    .find("table")
-    .should("exist");
+  cy.getByTestId("VisualizationEditor").find("table").should("exist");
   cy.getByTestId("VisualizationEditor.Tabs.DataLabels").click();
-  cy.getByTestId("VisualizationEditor")
-    .getByTestId("Chart.DataLabels.ShowDataLabels")
-    .should("exist");
+  cy.getByTestId("VisualizationEditor").getByTestId("Chart.DataLabels.ShowDataLabels").should("exist");
   chartSpecificTabbedEditorAssertionFn();
@@ -63,39 +47,29 @@ export function assertTabbedEditor(chartSpecificTabbedEditorAssertionFn = () =>
 export function assertAxesAndAddLabels(xaxisLabel, yaxisLabel) {
   cy.getByTestId("VisualizationEditor.Tabs.XAxis").click();
-  cy.getByTestId("Chart.XAxis.Type")
-    .contains(".ant-select-selection-item", "Auto Detect")
-    .should("exist");
+  cy.getByTestId("Chart.XAxis.Type").contains(".ant-select-selection-item", "Auto Detect").should("exist");
-  cy.getByTestId("Chart.XAxis.Name")
-    .clear()
-    .type(xaxisLabel);
+  cy.getByTestId("Chart.XAxis.Name").clear().type(xaxisLabel);
   cy.getByTestId("VisualizationEditor.Tabs.YAxis").click();
-  cy.getByTestId("Chart.LeftYAxis.Type")
-    .contains(".ant-select-selection-item", "Linear")
-    .should("exist");
+  cy.getByTestId("Chart.LeftYAxis.Type").contains(".ant-select-selection-item", "Linear").should("exist");
-  cy.getByTestId("Chart.LeftYAxis.Name")
-    .clear()
-    .type(yaxisLabel);
+  cy.getByTestId("Chart.LeftYAxis.Name").clear().type(yaxisLabel);
-  cy.getByTestId("Chart.LeftYAxis.TickFormat")
-    .clear()
-    .type("+");
+  cy.getByTestId("Chart.LeftYAxis.TickFormat").clear().type("+");
   cy.getByTestId("VisualizationEditor.Tabs.General").click();
 }
 export function createDashboardWithCharts(title, chartGetters, widgetsAssertionFn = () => {}) {
-  cy.createDashboard(title).then(dashboard => {
+  cy.createDashboard(title).then((dashboard) => {
     const dashboardUrl = `/dashboards/${dashboard.id}`;
-    const widgetGetters = chartGetters.map(chartGetter => `${chartGetter}Widget`);
+    const widgetGetters = chartGetters.map((chartGetter) => `${chartGetter}Widget`);
     chartGetters.forEach((chartGetter, i) => {
       const position = { autoHeight: false, sizeY: 8, sizeX: 3, col: (i % 2) * 3 };
       cy.get(`@${chartGetter}`)
-        .then(chart => cy.addWidget(dashboard.id, chart.id, { position }))
+        .then((chart) => cy.addWidget(dashboard.id, chart.id, { position }))
        .as(widgetGetters[i]);
     });


@@ -1,12 +1,10 @@
 export function expectTableToHaveLength(length) {
-  cy.getByTestId("TableVisualization")
-    .find("tbody tr.ant-table-row")
-    .should("have.length", length);
+  cy.getByTestId("TableVisualization").find("tbody tr").should("have.length", length);
 }
 export function expectFirstColumnToHaveMembers(values) {
   cy.getByTestId("TableVisualization")
-    .find("tbody tr.ant-table-row td:first-child")
+    .find("tbody tr td:first-child")
-    .then($cell => Cypress.$.map($cell, item => Cypress.$(item).text()))
+    .then(($cell) => Cypress.$.map($cell, (item) => Cypress.$(item).text()))
-    .then(firstColumnCells => expect(firstColumnCells).to.have.members(values));
+    .then((firstColumnCells) => expect(firstColumnCells).to.have.members(values));
 }


@@ -1,24 +0,0 @@
services:
.redash:
build:
context: .
args:
FRONTEND_BUILD_MODE: ${FRONTEND_BUILD_MODE:-2}
INSTALL_GROUPS: ${INSTALL_GROUPS:-main,all_ds,dev}
volumes:
- $PWD:${SERVER_MOUNT:-/ignore}
command: manage version
environment:
REDASH_LOG_LEVEL: INFO
REDASH_REDIS_URL: redis://redis:6379/0
REDASH_DATABASE_URL: postgresql://postgres@postgres/postgres
REDASH_RATELIMIT_ENABLED: false
REDASH_MAIL_DEFAULT_SENDER: redash@example.com
REDASH_MAIL_SERVER: email
REDASH_MAIL_PORT: 1025
REDASH_ENFORCE_CSRF: true
REDASH_COOKIE_SECRET: ${REDASH_COOKIE_SECRET}
REDASH_SECRET_KEY: ${REDASH_SECRET_KEY}
REDASH_PRODUCTION: ${REDASH_PRODUCTION:-true}
env_file:
- .env


@@ -10,6 +10,7 @@ x-redash-service: &redash-service
   env_file:
     - .env
 x-redash-environment: &redash-environment
+  REDASH_HOST: http://localhost:5001
   REDASH_LOG_LEVEL: "INFO"
   REDASH_REDIS_URL: "redis://redis:6379/0"
   REDASH_DATABASE_URL: "postgresql://postgres@postgres/postgres"


@@ -0,0 +1,64 @@
"""fix_hash

Revision ID: 9e8c841d1a30
Revises: 7205816877ec
Create Date: 2024-10-05 18:55:35.730573
"""
import logging

from alembic import op
import sqlalchemy as sa
from sqlalchemy.sql import table
from sqlalchemy import select

from redash.query_runner import BaseQueryRunner, get_query_runner

# revision identifiers, used by Alembic.
revision = '9e8c841d1a30'
down_revision = '7205816877ec'
branch_labels = None
depends_on = None


def update_query_hash(record):
    should_apply_auto_limit = record['options'].get("apply_auto_limit", False) if record['options'] else False
    query_runner = get_query_runner(record['type'], {}) if record['type'] else BaseQueryRunner({})
    query_text = record['query']
    parameters_dict = {p["name"]: p.get("value") for p in record['options'].get('parameters', [])} if record['options'] else {}
    if any(parameters_dict):
        print(f"Query {record['query_id']} has parameters. Hash might be incorrect.")
    return query_runner.gen_query_hash(query_text, should_apply_auto_limit)


def upgrade():
    conn = op.get_bind()
    metadata = sa.MetaData(bind=conn)
    queries = sa.Table("queries", metadata, autoload=True)
    data_sources = sa.Table("data_sources", metadata, autoload=True)
    joined_table = queries.outerjoin(data_sources, queries.c.data_source_id == data_sources.c.id)
    query = select([
        queries.c.id.label("query_id"),
        queries.c.query,
        queries.c.query_hash,
        queries.c.options,
        data_sources.c.id.label("data_source_id"),
        data_sources.c.type
    ]).select_from(joined_table)
    for record in conn.execute(query):
        new_hash = update_query_hash(record)
        print(f"Updating hash for query {record['query_id']} from {record['query_hash']} to {new_hash}")
        conn.execute(
            queries.update()
            .where(queries.c.id == record['query_id'])
            .values(query_hash=new_hash))


def downgrade():
    pass
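The hash this migration recomputes comes from the runner's `gen_query_hash`, which normalizes the SQL text before hashing it. A minimal self-contained sketch of that normalization (modeled on `redash.utils.gen_query_hash`; the comment-stripping regex and exact normalization rules here are assumptions, not the canonical implementation):

```python
import hashlib
import re

# Hypothetical approximation of redash.utils.gen_query_hash: strip block
# comments, remove all whitespace, lowercase, then md5 the result.
COMMENTS_REGEX = re.compile(r"/\*.*?\*/", re.DOTALL)

def gen_query_hash(sql: str) -> str:
    sql = COMMENTS_REGEX.sub("", sql)    # drop /* ... */ comments
    sql = "".join(sql.split()).lower()   # collapse whitespace, lowercase
    return hashlib.md5(sql.encode("utf-8")).hexdigest()

# Whitespace and case differences normalize away ("SELECT 1" -> "select1"),
# so these two queries share one hash.
print(gen_query_hash("SELECT 1") == gen_query_hash("select   1"))  # True
```

The runner-level method additionally folds the auto-limit setting into the text being hashed, which is why the migration passes `should_apply_auto_limit` through rather than hashing the raw query column.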


@@ -1,6 +1,6 @@
 {
   "name": "redash-client",
-  "version": "24.08.1-dev",
+  "version": "25.03.0-dev",
   "description": "The frontend part of Redash.",
   "main": "index.js",
   "scripts": {
@@ -50,11 +50,12 @@
     "antd": "^4.4.3",
     "axios": "0.27.2",
     "axios-auth-refresh": "3.3.6",
-    "bootstrap": "^3.3.7",
+    "bootstrap": "^3.4.1",
     "classnames": "^2.2.6",
     "d3": "^3.5.17",
     "debug": "^3.2.7",
     "dompurify": "^2.0.17",
+    "elliptic": "^6.6.0",
     "font-awesome": "^4.7.0",
     "history": "^4.10.1",
     "hoist-non-react-statics": "^3.3.0",
@@ -63,7 +64,7 @@
     "mousetrap": "^1.6.1",
     "mustache": "^2.3.0",
     "numeral": "^2.0.6",
-    "path-to-regexp": "^3.1.0",
+    "path-to-regexp": "^3.3.0",
     "prop-types": "^15.6.1",
     "query-string": "^6.9.0",
     "react": "16.14.0",
@@ -179,8 +180,8 @@
   ]
   },
   "browser": {
     "fs": false,
     "path": false
   },
   "//": "browserslist set to 'Async functions' compatibility",
   "browserslist": [

poetry.lock generated

@@ -1,4 +1,4 @@
-# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand.
+# This file is automatically @generated by Poetry 1.8.5 and should not be changed by hand.
 [[package]]
 name = "adal"
@@ -891,43 +891,38 @@ files = [
 [[package]]
 name = "cryptography"
-version = "42.0.8"
+version = "43.0.1"
 description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers."
 optional = false
 python-versions = ">=3.7"
 files = [
-    {file = "cryptography-42.0.8-cp37-abi3-macosx_10_12_universal2.whl", hash = "sha256:81d8a521705787afe7a18d5bfb47ea9d9cc068206270aad0b96a725022e18d2e"},
-    {file = "cryptography-42.0.8-cp37-abi3-macosx_10_12_x86_64.whl", hash = "sha256:961e61cefdcb06e0c6d7e3a1b22ebe8b996eb2bf50614e89384be54c48c6b63d"},
-    {file = "cryptography-42.0.8-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e3ec3672626e1b9e55afd0df6d774ff0e953452886e06e0f1eb7eb0c832e8902"},
-    {file = "cryptography-42.0.8-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e599b53fd95357d92304510fb7bda8523ed1f79ca98dce2f43c115950aa78801"},
-    {file = "cryptography-42.0.8-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:5226d5d21ab681f432a9c1cf8b658c0cb02533eece706b155e5fbd8a0cdd3949"},
-    {file = "cryptography-42.0.8-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:6b7c4f03ce01afd3b76cf69a5455caa9cfa3de8c8f493e0d3ab7d20611c8dae9"},
-    {file = "cryptography-42.0.8-cp37-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:2346b911eb349ab547076f47f2e035fc8ff2c02380a7cbbf8d87114fa0f1c583"},
-    {file = "cryptography-42.0.8-cp37-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:ad803773e9df0b92e0a817d22fd8a3675493f690b96130a5e24f1b8fabbea9c7"},
-    {file = "cryptography-42.0.8-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:2f66d9cd9147ee495a8374a45ca445819f8929a3efcd2e3df6428e46c3cbb10b"},
-    {file = "cryptography-42.0.8-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:d45b940883a03e19e944456a558b67a41160e367a719833c53de6911cabba2b7"},
-    {file = "cryptography-42.0.8-cp37-abi3-win32.whl", hash = "sha256:a0c5b2b0585b6af82d7e385f55a8bc568abff8923af147ee3c07bd8b42cda8b2"},
-    {file = "cryptography-42.0.8-cp37-abi3-win_amd64.whl", hash = "sha256:57080dee41209e556a9a4ce60d229244f7a66ef52750f813bfbe18959770cfba"},
-    {file = "cryptography-42.0.8-cp39-abi3-macosx_10_12_universal2.whl", hash = "sha256:dea567d1b0e8bc5764b9443858b673b734100c2871dc93163f58c46a97a83d28"},
-    {file = "cryptography-42.0.8-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c4783183f7cb757b73b2ae9aed6599b96338eb957233c58ca8f49a49cc32fd5e"},
-    {file = "cryptography-42.0.8-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a0608251135d0e03111152e41f0cc2392d1e74e35703960d4190b2e0f4ca9c70"},
-    {file = "cryptography-42.0.8-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:dc0fdf6787f37b1c6b08e6dfc892d9d068b5bdb671198c72072828b80bd5fe4c"},
-    {file = "cryptography-42.0.8-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:9c0c1716c8447ee7dbf08d6db2e5c41c688544c61074b54fc4564196f55c25a7"},
-    {file = "cryptography-42.0.8-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:fff12c88a672ab9c9c1cf7b0c80e3ad9e2ebd9d828d955c126be4fd3e5578c9e"},
-    {file = "cryptography-42.0.8-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:cafb92b2bc622cd1aa6a1dce4b93307792633f4c5fe1f46c6b97cf67073ec961"},
-    {file = "cryptography-42.0.8-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:31f721658a29331f895a5a54e7e82075554ccfb8b163a18719d342f5ffe5ecb1"},
-    {file = "cryptography-42.0.8-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:b297f90c5723d04bcc8265fc2a0f86d4ea2e0f7ab4b6994459548d3a6b992a14"},
-    {file = "cryptography-42.0.8-cp39-abi3-win32.whl", hash = "sha256:2f88d197e66c65be5e42cd72e5c18afbfae3f741742070e3019ac8f4ac57262c"},
-    {file = "cryptography-42.0.8-cp39-abi3-win_amd64.whl", hash = "sha256:fa76fbb7596cc5839320000cdd5d0955313696d9511debab7ee7278fc8b5c84a"},
-    {file = "cryptography-42.0.8-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:ba4f0a211697362e89ad822e667d8d340b4d8d55fae72cdd619389fb5912eefe"},
-    {file = "cryptography-42.0.8-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:81884c4d096c272f00aeb1f11cf62ccd39763581645b0812e99a91505fa48e0c"},
-    {file = "cryptography-42.0.8-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:c9bb2ae11bfbab395bdd072985abde58ea9860ed84e59dbc0463a5d0159f5b71"},
-    {file = "cryptography-42.0.8-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:7016f837e15b0a1c119d27ecd89b3515f01f90a8615ed5e9427e30d9cdbfed3d"},
-    {file = "cryptography-42.0.8-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5a94eccb2a81a309806027e1670a358b99b8fe8bfe9f8d329f27d72c094dde8c"},
-    {file = "cryptography-42.0.8-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:dec9b018df185f08483f294cae6ccac29e7a6e0678996587363dc352dc65c842"},
-    {file = "cryptography-42.0.8-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:343728aac38decfdeecf55ecab3264b015be68fc2816ca800db649607aeee648"},
-    {file = "cryptography-42.0.8-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:013629ae70b40af70c9a7a5db40abe5d9054e6f4380e50ce769947b73bf3caad"},
-    {file = "cryptography-42.0.8.tar.gz", hash = "sha256:8d09d05439ce7baa8e9e95b07ec5b6c886f548deb7e0f69ef25f64b3bce842f2"},
+    {file = "cryptography-43.0.1-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:8385d98f6a3bf8bb2d65a73e17ed87a3ba84f6991c155691c51112075f9ffc5d"},
+    {file = "cryptography-43.0.1-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:27e613d7077ac613e399270253259d9d53872aaf657471473ebfc9a52935c062"},
+    {file = "cryptography-43.0.1-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:68aaecc4178e90719e95298515979814bda0cbada1256a4485414860bd7ab962"},
+    {file = "cryptography-43.0.1-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:de41fd81a41e53267cb020bb3a7212861da53a7d39f863585d13ea11049cf277"},
+    {file = "cryptography-43.0.1-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f98bf604c82c416bc829e490c700ca1553eafdf2912a91e23a79d97d9801372a"},
+    {file = "cryptography-43.0.1-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:61ec41068b7b74268fa86e3e9e12b9f0c21fcf65434571dbb13d954bceb08042"},
+    {file = "cryptography-43.0.1-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:014f58110f53237ace6a408b5beb6c427b64e084eb451ef25a28308270086494"},
+    {file = "cryptography-43.0.1-cp37-abi3-win32.whl", hash = "sha256:2bd51274dcd59f09dd952afb696bf9c61a7a49dfc764c04dd33ef7a6b502a1e2"},
+    {file = "cryptography-43.0.1-cp37-abi3-win_amd64.whl", hash = "sha256:666ae11966643886c2987b3b721899d250855718d6d9ce41b521252a17985f4d"},
+    {file = "cryptography-43.0.1-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:ac119bb76b9faa00f48128b7f5679e1d8d437365c5d26f1c2c3f0da4ce1b553d"},
+    {file = "cryptography-43.0.1-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1bbcce1a551e262dfbafb6e6252f1ae36a248e615ca44ba302df077a846a8806"},
+    {file = "cryptography-43.0.1-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:58d4e9129985185a06d849aa6df265bdd5a74ca6e1b736a77959b498e0505b85"},
+    {file = "cryptography-43.0.1-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:d03a475165f3134f773d1388aeb19c2d25ba88b6a9733c5c590b9ff7bbfa2e0c"},
+    {file = "cryptography-43.0.1-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:511f4273808ab590912a93ddb4e3914dfd8a388fed883361b02dea3791f292e1"},
+    {file = "cryptography-43.0.1-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:80eda8b3e173f0f247f711eef62be51b599b5d425c429b5d4ca6a05e9e856baa"},
+    {file = "cryptography-43.0.1-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:38926c50cff6f533f8a2dae3d7f19541432610d114a70808f0926d5aaa7121e4"},
+    {file = "cryptography-43.0.1-cp39-abi3-win32.whl", hash = "sha256:a575913fb06e05e6b4b814d7f7468c2c660e8bb16d8d5a1faf9b33ccc569dd47"},
+    {file = "cryptography-43.0.1-cp39-abi3-win_amd64.whl", hash = "sha256:d75601ad10b059ec832e78823b348bfa1a59f6b8d545db3a24fd44362a1564cb"},
+    {file = "cryptography-43.0.1-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:ea25acb556320250756e53f9e20a4177515f012c9eaea17eb7587a8c4d8ae034"},
+    {file = "cryptography-43.0.1-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:c1332724be35d23a854994ff0b66530119500b6053d0bd3363265f7e5e77288d"},
+    {file = "cryptography-43.0.1-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:fba1007b3ef89946dbbb515aeeb41e30203b004f0b4b00e5e16078b518563289"},
+    {file = "cryptography-43.0.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:5b43d1ea6b378b54a1dc99dd8a2b5be47658fe9a7ce0a58ff0b55f4b43ef2b84"},
+    {file = "cryptography-43.0.1-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:88cce104c36870d70c49c7c8fd22885875d950d9ee6ab54df2745f83ba0dc365"},
+    {file = "cryptography-43.0.1-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:9d3cdb25fa98afdd3d0892d132b8d7139e2c087da1712041f6b762e4f807cc96"},
+    {file = "cryptography-43.0.1-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:e710bf40870f4db63c3d7d929aa9e09e4e7ee219e703f949ec4073b4294f6172"},
+    {file = "cryptography-43.0.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:7c05650fe8023c5ed0d46793d4b7d7e6cd9c04e68eabe5b0aeea836e37bdcec2"},
+    {file = "cryptography-43.0.1.tar.gz", hash = "sha256:203e92a75716d8cfb491dc47c79e17d0d9207ccffcbcb35f598fbe463ae3444d"},
 ]
 [package.dependencies]
@@ -940,7 +935,7 @@ nox = ["nox"]
 pep8test = ["check-sdist", "click", "mypy", "ruff"]
 sdist = ["build"]
 ssh = ["bcrypt (>=3.1.5)"]
-test = ["certifi", "pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
+test = ["certifi", "cryptography-vectors (==43.0.1)", "pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
 test-randomorder = ["pytest-randomly"]
 [[package]]
@@ -979,6 +974,41 @@ sqlalchemy = "*"
 sqlalchemy = ["sqlalchemy (>1.3.21,<2.0)"]
 superset = ["apache-superset (>=1.4.1)"]
[[package]]
name = "debugpy"
version = "1.8.9"
description = "An implementation of the Debug Adapter Protocol for Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "debugpy-1.8.9-cp310-cp310-macosx_14_0_x86_64.whl", hash = "sha256:cfe1e6c6ad7178265f74981edf1154ffce97b69005212fbc90ca22ddfe3d017e"},
{file = "debugpy-1.8.9-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ada7fb65102a4d2c9ab62e8908e9e9f12aed9d76ef44880367bc9308ebe49a0f"},
{file = "debugpy-1.8.9-cp310-cp310-win32.whl", hash = "sha256:c36856343cbaa448171cba62a721531e10e7ffb0abff838004701454149bc037"},
{file = "debugpy-1.8.9-cp310-cp310-win_amd64.whl", hash = "sha256:17c5e0297678442511cf00a745c9709e928ea4ca263d764e90d233208889a19e"},
{file = "debugpy-1.8.9-cp311-cp311-macosx_14_0_universal2.whl", hash = "sha256:b74a49753e21e33e7cf030883a92fa607bddc4ede1aa4145172debc637780040"},
{file = "debugpy-1.8.9-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:62d22dacdb0e296966d7d74a7141aaab4bec123fa43d1a35ddcb39bf9fd29d70"},
{file = "debugpy-1.8.9-cp311-cp311-win32.whl", hash = "sha256:8138efff315cd09b8dcd14226a21afda4ca582284bf4215126d87342bba1cc66"},
{file = "debugpy-1.8.9-cp311-cp311-win_amd64.whl", hash = "sha256:ff54ef77ad9f5c425398efb150239f6fe8e20c53ae2f68367eba7ece1e96226d"},
{file = "debugpy-1.8.9-cp312-cp312-macosx_14_0_universal2.whl", hash = "sha256:957363d9a7a6612a37458d9a15e72d03a635047f946e5fceee74b50d52a9c8e2"},
{file = "debugpy-1.8.9-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5e565fc54b680292b418bb809f1386f17081d1346dca9a871bf69a8ac4071afe"},
{file = "debugpy-1.8.9-cp312-cp312-win32.whl", hash = "sha256:3e59842d6c4569c65ceb3751075ff8d7e6a6ada209ceca6308c9bde932bcef11"},
{file = "debugpy-1.8.9-cp312-cp312-win_amd64.whl", hash = "sha256:66eeae42f3137eb428ea3a86d4a55f28da9bd5a4a3d369ba95ecc3a92c1bba53"},
{file = "debugpy-1.8.9-cp313-cp313-macosx_14_0_universal2.whl", hash = "sha256:957ecffff80d47cafa9b6545de9e016ae8c9547c98a538ee96ab5947115fb3dd"},
{file = "debugpy-1.8.9-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1efbb3ff61487e2c16b3e033bc8595aea578222c08aaf3c4bf0f93fadbd662ee"},
{file = "debugpy-1.8.9-cp313-cp313-win32.whl", hash = "sha256:7c4d65d03bee875bcb211c76c1d8f10f600c305dbd734beaed4077e902606fee"},
{file = "debugpy-1.8.9-cp313-cp313-win_amd64.whl", hash = "sha256:e46b420dc1bea64e5bbedd678148be512442bc589b0111bd799367cde051e71a"},
{file = "debugpy-1.8.9-cp38-cp38-macosx_14_0_x86_64.whl", hash = "sha256:472a3994999fe6c0756945ffa359e9e7e2d690fb55d251639d07208dbc37caea"},
{file = "debugpy-1.8.9-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:365e556a4772d7d0d151d7eb0e77ec4db03bcd95f26b67b15742b88cacff88e9"},
{file = "debugpy-1.8.9-cp38-cp38-win32.whl", hash = "sha256:54a7e6d3014c408eb37b0b06021366ee985f1539e12fe49ca2ee0d392d9ceca5"},
{file = "debugpy-1.8.9-cp38-cp38-win_amd64.whl", hash = "sha256:8e99c0b1cc7bf86d83fb95d5ccdc4ad0586d4432d489d1f54e4055bcc795f693"},
{file = "debugpy-1.8.9-cp39-cp39-macosx_14_0_x86_64.whl", hash = "sha256:7e8b079323a56f719977fde9d8115590cb5e7a1cba2fcee0986ef8817116e7c1"},
{file = "debugpy-1.8.9-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6953b335b804a41f16a192fa2e7851bdcfd92173cbb2f9f777bb934f49baab65"},
{file = "debugpy-1.8.9-cp39-cp39-win32.whl", hash = "sha256:7e646e62d4602bb8956db88b1e72fe63172148c1e25c041e03b103a25f36673c"},
{file = "debugpy-1.8.9-cp39-cp39-win_amd64.whl", hash = "sha256:3d9755e77a2d680ce3d2c5394a444cf42be4a592caaf246dbfbdd100ffcf7ae5"},
{file = "debugpy-1.8.9-py2.py3-none-any.whl", hash = "sha256:cc37a6c9987ad743d9c3a14fa1b1a14b7e4e6041f9dd0c8abf8895fe7a97b899"},
{file = "debugpy-1.8.9.zip", hash = "sha256:1339e14c7d980407248f09824d1b25ff5c5616651689f1e0f0e51bdead3ea13e"},
]
 [[package]]
 name = "defusedxml"
 version = "0.7.1"
@@ -1321,6 +1351,45 @@ files = [
 [package.dependencies]
 python-dateutil = ">=2.7"
+[[package]]
+name = "fsspec"
+version = "2024.10.0"
+description = "File-system specification"
+optional = false
+python-versions = ">=3.8"
+files = [
+{file = "fsspec-2024.10.0-py3-none-any.whl", hash = "sha256:03b9a6785766a4de40368b88906366755e2819e758b83705c88cd7cb5fe81871"},
+{file = "fsspec-2024.10.0.tar.gz", hash = "sha256:eda2d8a4116d4f2429db8550f2457da57279247dd930bb12f821b58391359493"},
+]
+[package.extras]
+abfs = ["adlfs"]
+adl = ["adlfs"]
+arrow = ["pyarrow (>=1)"]
+dask = ["dask", "distributed"]
+dev = ["pre-commit", "ruff"]
+doc = ["numpydoc", "sphinx", "sphinx-design", "sphinx-rtd-theme", "yarl"]
+dropbox = ["dropbox", "dropboxdrivefs", "requests"]
+full = ["adlfs", "aiohttp (!=4.0.0a0,!=4.0.0a1)", "dask", "distributed", "dropbox", "dropboxdrivefs", "fusepy", "gcsfs", "libarchive-c", "ocifs", "panel", "paramiko", "pyarrow (>=1)", "pygit2", "requests", "s3fs", "smbprotocol", "tqdm"]
+fuse = ["fusepy"]
+gcs = ["gcsfs"]
+git = ["pygit2"]
+github = ["requests"]
+gs = ["gcsfs"]
+gui = ["panel"]
+hdfs = ["pyarrow (>=1)"]
+http = ["aiohttp (!=4.0.0a0,!=4.0.0a1)"]
+libarchive = ["libarchive-c"]
+oci = ["ocifs"]
+s3 = ["s3fs"]
+sftp = ["paramiko"]
+smb = ["smbprotocol"]
+ssh = ["paramiko"]
+test = ["aiohttp (!=4.0.0a0,!=4.0.0a1)", "numpy", "pytest", "pytest-asyncio (!=0.22.0)", "pytest-benchmark", "pytest-cov", "pytest-mock", "pytest-recording", "pytest-rerunfailures", "requests"]
+test-downstream = ["aiobotocore (>=2.5.4,<3.0.0)", "dask-expr", "dask[dataframe,test]", "moto[server] (>4,<5)", "pytest-timeout", "xarray"]
+test-full = ["adlfs", "aiohttp (!=4.0.0a0,!=4.0.0a1)", "cloudpickle", "dask", "distributed", "dropbox", "dropboxdrivefs", "fastparquet", "fusepy", "gcsfs", "jinja2", "kerchunk", "libarchive-c", "lz4", "notebook", "numpy", "ocifs", "pandas", "panel", "paramiko", "pyarrow", "pyarrow (>=1)", "pyftpdlib", "pygit2", "pytest", "pytest-asyncio (!=0.22.0)", "pytest-benchmark", "pytest-cov", "pytest-mock", "pytest-recording", "pytest-rerunfailures", "python-snappy", "requests", "smbprotocol", "tqdm", "urllib3", "zarr", "zstandard"]
+tqdm = ["tqdm"]
 [[package]]
 name = "funcy"
 version = "1.13"
@@ -1993,13 +2062,13 @@ testing = ["Django", "attrs", "colorama", "docopt", "pytest (<7.0.0)"]
 [[package]]
 name = "jinja2"
-version = "3.1.4"
+version = "3.1.5"
 description = "A very fast and expressive template engine."
 optional = false
 python-versions = ">=3.7"
 files = [
-{file = "jinja2-3.1.4-py3-none-any.whl", hash = "sha256:bc5dd2abb727a5319567b7a813e6a2e7318c39f4f487cfe6c89c6f9c7d25197d"},
-{file = "jinja2-3.1.4.tar.gz", hash = "sha256:4a3aee7acbbe7303aede8e9648d13b8bf88a429282aa6122a993f0ac800cb369"},
+{file = "jinja2-3.1.5-py3-none-any.whl", hash = "sha256:aba0f4dc9ed8013c424088f68a5c226f7d6097ed89b246d7749c2ec4175c6adb"},
+{file = "jinja2-3.1.5.tar.gz", hash = "sha256:8fefff8dc3034e27bb80d67c671eb8a9bc424c0ef4c0826edbff304cceff43bb"},
 ]
 [package.dependencies]
@@ -2647,42 +2716,42 @@ et-xmlfile = "*"
 [[package]]
 name = "oracledb"
-version = "2.1.2"
+version = "2.5.1"
 description = "Python interface to Oracle Database"
 optional = false
-python-versions = ">=3.7"
+python-versions = ">=3.8"
 files = [
-{file = "oracledb-2.1.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:4ffaba9504c638c29129b484cf547accf750bd0f86df1ca6194646a4d2540691"},
-{file = "oracledb-2.1.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:71d98deb1e3a500920f5460d457925f0c8cef8d037881fdbd16df1c4734453dd"},
-{file = "oracledb-2.1.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bde2df672704fbe12ab0653f6e808b1ed62de28c6864b17fc3a1fcac9c1fd472"},
-{file = "oracledb-2.1.2-cp310-cp310-win32.whl", hash = "sha256:3b3798a1220fc8736a37b9280d0ae4cdf263bb203fc6e2b3a82c33f9a2010702"},
-{file = "oracledb-2.1.2-cp310-cp310-win_amd64.whl", hash = "sha256:92620efd5eb0d23b252d75f2f2ff1deadf25f44546903e3283760cb276d524ed"},
-{file = "oracledb-2.1.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:b913a164e1830d0e955b88d97c5e4da4d2402f8a8b0d38febb6ad5a8ef9e4743"},
-{file = "oracledb-2.1.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c53827344c6d001f492aee0a3acb6c1b6c0f3030c2f5dc8cb86dc4f0bb4dd1ab"},
-{file = "oracledb-2.1.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:50225074841d5f9b281d620c012ced4b0946ff5a941c8b639be7babda5190709"},
-{file = "oracledb-2.1.2-cp311-cp311-win32.whl", hash = "sha256:a043b4df2919411b787bcd24ffa4286249a11d05d29bb20bb076d108c3c6f777"},
-{file = "oracledb-2.1.2-cp311-cp311-win_amd64.whl", hash = "sha256:9edce208c26ee018e43b75323888743031be3e9f0c0e4221abf037129c12d949"},
-{file = "oracledb-2.1.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:08aa313b801dda950918168d3962ba59a617adce143e0c2bf1ee9b847695faaa"},
-{file = "oracledb-2.1.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:de5c932b04d3bcdd22c71c0e5c5e1d16b6a3a2fc68dc472ee3a12e677461354c"},
-{file = "oracledb-2.1.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1d590caf39b1901bcba394fcda9815438faff0afaf374025f89ef5d65993d0a4"},
-{file = "oracledb-2.1.2-cp312-cp312-win32.whl", hash = "sha256:1e3ffdfe76c97d1ca13a3fecf239c96d3889015bb5b775dc22b947108044b01e"},
-{file = "oracledb-2.1.2-cp312-cp312-win_amd64.whl", hash = "sha256:8c1eaf8c74bb6de5772de768f2f3f5eb935ab935c633d3a012ddff7e691a2073"},
-{file = "oracledb-2.1.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:e2ee06e154e08cc5e4037855d74dc6e37dc054c91a7a1a372bb60d4442e2ed3d"},
-{file = "oracledb-2.1.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1a21d84aaf5dddab0cfa8ab7c23272c0295a5c796f212a4ce8a6b499643663dd"},
-{file = "oracledb-2.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b337f7cf30753c3a32302fbc25ca80d7ff5049dd9333e681236a674a90c21caf"},
-{file = "oracledb-2.1.2-cp37-cp37m-win32.whl", hash = "sha256:b5d936763a9b26d32c4e460dbb346c2a962fcc98e6df33dd2d81fdc2eb26f1e4"},
-{file = "oracledb-2.1.2-cp37-cp37m-win_amd64.whl", hash = "sha256:0ea32b87b7202811d85082f10bf7789747ce45f195be4199c5611e7d76a79e78"},
-{file = "oracledb-2.1.2-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:f94b22da87e051e3a8620d2b04d99e1cc9d9abb4da6736d6ae0ca436ba03fb86"},
-{file = "oracledb-2.1.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:391034ee66717dba514e765263d08d18a2aa7badde373f82599b89e46fa3720a"},
-{file = "oracledb-2.1.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a2d9891244b9b94465e30af8cc79380bbb41081c5dc0511cbc94cc250e9e26d"},
-{file = "oracledb-2.1.2-cp38-cp38-win32.whl", hash = "sha256:9a9a6e0bf61952c2c82614b98fe896d2cda17d81ffca4527556e6607b10e3365"},
-{file = "oracledb-2.1.2-cp38-cp38-win_amd64.whl", hash = "sha256:d9a6447589f203ca846526c99a667537b099d54ddeff09d24f9da59bdcc8f98b"},
-{file = "oracledb-2.1.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8eb688dd1f8ea2038d17bc84fb651aa1e994b155d3cb8b8387df70ab2a7b4c4c"},
-{file = "oracledb-2.1.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0f22c31b894bb085a33d70e174c9bcd0abafc630c2c941ff0d630ee3852f1aa6"},
-{file = "oracledb-2.1.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e5bc03520b8bd4dbf2ac4d937d298a85a7208ffbeec738eea92ad7bb00e7134a"},
-{file = "oracledb-2.1.2-cp39-cp39-win32.whl", hash = "sha256:5d4f6bd1036d7edbb96d8d31f0ca53696a013c00ac82fc19ac0ca374d2265b2c"},
-{file = "oracledb-2.1.2-cp39-cp39-win_amd64.whl", hash = "sha256:69bde9770392c1c859b1e1d767dbb9ca4c57e3f2946ca90c779d9402a7e96111"},
-{file = "oracledb-2.1.2.tar.gz", hash = "sha256:3054bcc295d7378834ba7a5aceb865985e954915f9b07a843ea84c3824c6a0b2"},
+{file = "oracledb-2.5.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:54ea7b4da179eb3fefad338685b44fed657a9cd733fb0bfc09d344cfb266355e"},
+{file = "oracledb-2.5.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:05df7a5a61f4d26c986e235fae6f64a81afaac8f1dbef60e2e9ecf9236218e58"},
+{file = "oracledb-2.5.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d17c80063375a5d87a7ab57c8343e5434a16ea74f7be3b56f9100300ef0b69d6"},
+{file = "oracledb-2.5.1-cp310-cp310-win32.whl", hash = "sha256:51b3911ee822319e20f2e19d816351aac747591a59a0a96cf891c62c2a5c0c0d"},
+{file = "oracledb-2.5.1-cp310-cp310-win_amd64.whl", hash = "sha256:e4e884625117e50b619c93828affbcffa594029ef8c8b40205394990e6af65a8"},
+{file = "oracledb-2.5.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:85318350fa4837b7b637e436fa5f99c17919d6329065e64d1e18e5a7cae52457"},
+{file = "oracledb-2.5.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:676c221227159d9cee25030c56ff9782f330115cb86164d92d3360f55b07654b"},
+{file = "oracledb-2.5.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e78c6de57b4b5df7f932337c57e59b62e34fc4527d2460c0cab10c2ab01825f8"},
+{file = "oracledb-2.5.1-cp311-cp311-win32.whl", hash = "sha256:0d5974327a1957538a144b073367104cdf8bb39cf056940995b75cb099535589"},
+{file = "oracledb-2.5.1-cp311-cp311-win_amd64.whl", hash = "sha256:541bb5a107917b9d9eba1346318b42f8b6024e7dd3bef1451f0745364f03399c"},
+{file = "oracledb-2.5.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:970a9420cc351d650cc6716122e9aa50cfb8c27f425ffc9d83651fd3edff6090"},
+{file = "oracledb-2.5.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a6788c128af5a3a45689453fc4832f32b4a0dae2696d9917c7631a2e02865148"},
+{file = "oracledb-2.5.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8778daa3f08639232341d802b95ca6da4c0c798c8530e4df331b3286d32e49d5"},
+{file = "oracledb-2.5.1-cp312-cp312-win32.whl", hash = "sha256:a44613f3dfacb2b9462c3871ee333fa535fbd0ec21942e14019fcfd572487db0"},
+{file = "oracledb-2.5.1-cp312-cp312-win_amd64.whl", hash = "sha256:934d02da80bfc030c644c5c43fbe58119dc170f15b4dfdb6fe04c220a1f8730d"},
+{file = "oracledb-2.5.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:0374481329fa873a2af24eb12de4fd597c6c111e148065200562eb75ea0c6be7"},
+{file = "oracledb-2.5.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:66e885de106701d1f2a630d19e183e491e4f1ccb8d78855f60396ba15856fb66"},
+{file = "oracledb-2.5.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fcf446f6250d8edad5367ff03ad73dbbe672a2e4b060c51a774821dd723b0283"},
+{file = "oracledb-2.5.1-cp313-cp313-win32.whl", hash = "sha256:b02b93199a7073e9b5687fe2dfa83d25ea102ab261c577f9d55820d5ef193dda"},
+{file = "oracledb-2.5.1-cp313-cp313-win_amd64.whl", hash = "sha256:173b6d132b230f0617380272181e14fc53aec65aaffe68b557a9b6040716a267"},
+{file = "oracledb-2.5.1-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:7d5efc94ce5bb657a5f43e2683e23cc4b4c53c4783e817759869472a113dac26"},
+{file = "oracledb-2.5.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6919cb69638a7dda45380d6530b6f2f7fd21ea7bdf8d38936653f9ebc4f7e3d6"},
+{file = "oracledb-2.5.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:44f5eb220945a6e092975ebcb9afc3f1eb10420d04d6bfeace1207ba86d60431"},
+{file = "oracledb-2.5.1-cp38-cp38-win32.whl", hash = "sha256:aa6ce0dfc64dc7b30bcf477f978538ba82fa7060ecd7a1b9227925b471ae3b50"},
+{file = "oracledb-2.5.1-cp38-cp38-win_amd64.whl", hash = "sha256:7a3115e4d445e3430d6f34083b7eed607309411f41472b66d145508f7b0c3770"},
+{file = "oracledb-2.5.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8a2627a0d29390aaef7211c5b3f7182dfd8e76c969b39d57ee3e43c1057c6fe7"},
+{file = "oracledb-2.5.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:730cd03e7fbf05acd32a221ead2a43020b3b91391597eaf728d724548f418b1b"},
+{file = "oracledb-2.5.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f42524b586733daa896f675acad8b9f2fc2f4380656d60a22a109a573861fc93"},
+{file = "oracledb-2.5.1-cp39-cp39-win32.whl", hash = "sha256:7958c7796df9f8c97484768c88817dec5c6d49220fc4cccdfde12a1a883f3d46"},
+{file = "oracledb-2.5.1-cp39-cp39-win_amd64.whl", hash = "sha256:92e0d176e3c76a1916f4e34fc3d84994ad74cce6b8664656c4dbecb8fa7e8c37"},
+{file = "oracledb-2.5.1.tar.gz", hash = "sha256:63d17ebb95f9129d0ab9386cb632c9e667e3be2c767278cc11a8e4585468de33"},
 ]
 [package.dependencies]
@@ -2761,13 +2830,13 @@ test = ["hypothesis (>=3.58)", "pytest (>=6.0)", "pytest-xdist"]
 [[package]]
 name = "paramiko"
-version = "3.4.0"
+version = "3.4.1"
 description = "SSH2 protocol library"
 optional = false
 python-versions = ">=3.6"
 files = [
-{file = "paramiko-3.4.0-py3-none-any.whl", hash = "sha256:43f0b51115a896f9c00f59618023484cb3a14b98bbceab43394a39c6739b7ee7"},
-{file = "paramiko-3.4.0.tar.gz", hash = "sha256:aac08f26a31dc4dffd92821527d1682d99d52f9ef6851968114a8728f3c274d3"},
+{file = "paramiko-3.4.1-py3-none-any.whl", hash = "sha256:8e49fd2f82f84acf7ffd57c64311aa2b30e575370dc23bdb375b10262f7eac32"},
+{file = "paramiko-3.4.1.tar.gz", hash = "sha256:8b15302870af7f6652f2e038975c1d2973f06046cb5d7d65355668b3ecbece0c"},
 ]
 [package.dependencies]
@@ -3081,40 +3150,6 @@ pygments = "*"
 all = ["black"]
 ptipython = ["ipython"]
-[[package]]
-name = "ptvsd"
-version = "4.3.2"
-description = "Remote debugging server for Python support in Visual Studio and Visual Studio Code"
-optional = false
-python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*"
-files = [
-{file = "ptvsd-4.3.2-cp27-cp27m-macosx_10_13_x86_64.whl", hash = "sha256:22b699369a18ff28d4d1aa6a452739e50c7b7790cb16c6312d766e023c12fe27"},
-{file = "ptvsd-4.3.2-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:3f839fe91d9ddca0d6a3a0afd6a1c824be1768498a737ab9333d084c5c3f3591"},
-{file = "ptvsd-4.3.2-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:70260b4591c07bff95566d49b6a5dc3051d8558035c43c847bad9a954def46bb"},
-{file = "ptvsd-4.3.2-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:d2662ec37ee049c0f8f2f9a378abeb7e570d9215c19eaf0a6d7189464195009f"},
-{file = "ptvsd-4.3.2-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:d9337ebba4d099698982e090b203e85670086c4b29cf1185b2e45cd353a8053e"},
-{file = "ptvsd-4.3.2-cp34-cp34m-macosx_10_13_x86_64.whl", hash = "sha256:cf09fd4d90c4c42ddd9bf853290f1a80bc2128993a3923bd3b96b68cc1acd03f"},
-{file = "ptvsd-4.3.2-cp34-cp34m-manylinux1_i686.whl", hash = "sha256:ccc5c533135305709461f545feed5061c608714db38fa0f58e3f848a127b7fde"},
-{file = "ptvsd-4.3.2-cp34-cp34m-manylinux1_x86_64.whl", hash = "sha256:de5234bec74c47da668e1a1a21bcc9821af0cbb28b5153df78cd5abc744b29a2"},
-{file = "ptvsd-4.3.2-cp35-cp35m-macosx_10_13_x86_64.whl", hash = "sha256:c893fb9d1c2ef8f980cc00ced3fd90356f86d9f59b58ee97e0e7e622b8860f76"},
-{file = "ptvsd-4.3.2-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:2bbc121bce3608501998afbe742f02b80e7d26b8fecd38f78b903f22f52a81d9"},
-{file = "ptvsd-4.3.2-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:fad06de012a78f277318d0c308dd3d7cc1f67167f3b2e1e2f7c6caf04c03440c"},
-{file = "ptvsd-4.3.2-cp35-cp35m-win32.whl", hash = "sha256:92d26aa7c8f7ffe41cb4b50a00846027027fa17acdf2d9dd8c24de77b25166c6"},
-{file = "ptvsd-4.3.2-cp35-cp35m-win_amd64.whl", hash = "sha256:eda10ecd43daacc180a6fbe524992be76a877c3559e2b78016b4ada8fec10273"},
-{file = "ptvsd-4.3.2-cp36-cp36m-macosx_10_13_x86_64.whl", hash = "sha256:c01204e3f025c3f7252c79c1a8a028246d29e3ef339e1a01ddf652999f47bdea"},
-{file = "ptvsd-4.3.2-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:c97c71835dde7e67fc7b06398bee1c012559a0784ebda9cf8acaf176c7ae766c"},
-{file = "ptvsd-4.3.2-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:612948a045fcf9c8931cd306972902440278f34de7ca684b49d4caeec9f1ec62"},
-{file = "ptvsd-4.3.2-cp36-cp36m-win32.whl", hash = "sha256:72d114baa5737baf29c8068d1ccdd93cbb332d2030601c888eed0e3761b588d7"},
-{file = "ptvsd-4.3.2-cp36-cp36m-win_amd64.whl", hash = "sha256:58508485a1609a495dd45829bd6d219303cf9edef5ca1f01a9ed8ffaa87f390c"},
-{file = "ptvsd-4.3.2-cp37-cp37m-macosx_10_13_x86_64.whl", hash = "sha256:20f48ffed42a6beb879c250d82662e175ad59cc46a29c95c6a4472ae413199c5"},
-{file = "ptvsd-4.3.2-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:b9970e3dc987eb2a6001af6c9d2f726dd6455cfc6d47e0f51925cbdee7ea2157"},
-{file = "ptvsd-4.3.2-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:1d3d82ecc82186d099992a748556e6e54037f5c5e4d3fc9bba3e2302354be0d4"},
-{file = "ptvsd-4.3.2-cp37-cp37m-win32.whl", hash = "sha256:10745fbb788001959b4de405198d8bd5243611a88fb5a2e2c6800245bc0ddd74"},
-{file = "ptvsd-4.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:90cbd082e7a9089664888d0d94aca760202f080133fca8f3fe65c48ed6b9e39d"},
-{file = "ptvsd-4.3.2-py2.py3-none-any.whl", hash = "sha256:459137736068bb02515040b2ed2738169cb30d69a38e0fd5dffcba255f41e68d"},
-{file = "ptvsd-4.3.2.zip", hash = "sha256:3b05c06018fdbce5943c50fb0baac695b5c11326f9e21a5266c854306bda28ab"},
-]
 [[package]]
 name = "pure-sasl"
 version = "0.6.2"
@@ -3156,23 +3191,25 @@ pyasn1 = ">=0.4.6,<0.6.0"
 [[package]]
 name = "pyathena"
-version = "1.11.5"
+version = "2.25.2"
 description = "Python DB API 2.0 (PEP 249) client for Amazon Athena"
 optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+python-versions = ">=3.7.1,<4.0.0"
 files = [
-{file = "PyAthena-1.11.5-py2.py3-none-any.whl", hash = "sha256:8cc5d40236993fe5241bb625e78d0a0a149e629b74569a9636b49168448a7ac8"},
-{file = "PyAthena-1.11.5.tar.gz", hash = "sha256:86c0f4d10528de44fcd63222506949b010dff36ad57116e4c1274c1cfa9477d0"},
+{file = "pyathena-2.25.2-py3-none-any.whl", hash = "sha256:df7855fec5cc675511431d7c72b814346ebd7e51ed32181ec95847154f79210b"},
+{file = "pyathena-2.25.2.tar.gz", hash = "sha256:aebb8254dd7b2a450841ee3552bf443002a2deaed93fae0ae6f4258b5eb2d367"},
 ]
 [package.dependencies]
-boto3 = ">=1.4.4"
-botocore = ">=1.5.52"
-future = "*"
+boto3 = ">=1.26.4"
+botocore = ">=1.29.4"
+fsspec = "*"
 tenacity = ">=4.1.0"
 [package.extras]
-pandas = ["pandas (>=0.24.0)", "pyarrow (>=0.15.0)"]
+arrow = ["pyarrow (>=7.0.0)"]
+fastparquet = ["fastparquet (>=0.4.0)"]
+pandas = ["pandas (>=1.3.0)"]
 sqlalchemy = ["sqlalchemy (>=1.0.0,<2.0.0)"]
@@ -3478,43 +3515,92 @@ zstd = ["zstandard"]
 [[package]]
 name = "pymssql"
-version = "2.2.8"
+version = "2.3.1"
 description = "DB-API interface to Microsoft SQL Server for Python. (new Cython-based version)"
 optional = false
 python-versions = "*"
 files = [
-{file = "pymssql-2.2.8-cp310-cp310-macosx_11_0_x86_64.whl", hash = "sha256:30bfd7b8edef78097ccd3f52ac3f3a5c3cf0019f8a280f306cacbbb165caaf63"},
-{file = "pymssql-2.2.8-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:049f2e3de919e8e02504780a21ebbf235e21ca8ed5c7538c5b6e705aa6c43d8c"},
-{file = "pymssql-2.2.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0dd86d8e3e346e34f3f03d12e333747b53a1daa74374a727f4714d5b82ee0dd5"},
-{file = "pymssql-2.2.8-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:508226a0df7cb6faeda9f8e84e85743690ca427d7b27af9a73d75fcf0c1eef6e"},
-{file = "pymssql-2.2.8-cp310-cp310-win_amd64.whl", hash = "sha256:47859887adeaf184766b5e0bc845dd23611f3808f9521552063bb36eabc10092"},
-{file = "pymssql-2.2.8-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:d873e553374d5b1c57fe1c43bb75e3bcc2920678db1ef26f6bfed396c7d21b30"},
-{file = "pymssql-2.2.8-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf31b8b76634c826a91f9999e15b7bfb0c051a0f53b319fd56481a67e5b903bb"},
-{file = "pymssql-2.2.8-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:821945c2214fe666fd456c61e09a29a00e7719c9e136c801bffb3a254e9c579b"},
-{file = "pymssql-2.2.8-cp311-cp311-win_amd64.whl", hash = "sha256:cc85b609b4e60eac25fa38bbac1ff854fd2c2a276e0ca4a3614c6f97efb644bb"},
-{file = "pymssql-2.2.8-cp36-cp36m-macosx_10_14_x86_64.whl", hash = "sha256:ebe7f64d5278d807f14bea08951e02512bfbc6219fd4d4f15bb45ded885cf3d4"},
-{file = "pymssql-2.2.8-cp36-cp36m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:253af3d39fc0235627966817262d5c4c94ad09dcbea59664748063470048c29c"},
-{file = "pymssql-2.2.8-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2c9d109df536dc5f7dd851a88d285a4c9cb12a9314b621625f4f5ab1197eb312"},
-{file = "pymssql-2.2.8-cp36-cp36m-manylinux_2_28_x86_64.whl", hash = "sha256:358d5acf0298d6618edf7fedc4ce3dc8fb5ce8a9db85e7332d5196d29d841821"},
-{file = "pymssql-2.2.8-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:63e1be8936372c07aee2405203ee0161ce76b03893cafe3d46841be9886f5ffe"},
-{file = "pymssql-2.2.8-cp37-cp37m-macosx_11_0_x86_64.whl", hash = "sha256:381d8a47c4665d99f114849bed23bcba1922c9d005accc3ac19cee8a1d3522dc"},
-{file = "pymssql-2.2.8-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:4f365033c9b4263b74b8a332bbdf2d7d8d7230f05805439b4f3fbf0a0164acfe"},
-{file = "pymssql-2.2.8-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:03903bdf23a2aac26e9b772b3998efeba079fcb6fcfa6df7abc614e9afa14af0"},
-{file = "pymssql-2.2.8-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:5c83208138f87942c5f08aa50c5fb8d89b7f15340cde58a77b08f49df277e134"},
-{file = "pymssql-2.2.8-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:7e4538e85d7b5fb3867636391f91e9e18ac2e0aef660d25e97268e04339f2c36"},
-{file = "pymssql-2.2.8-cp38-cp38-macosx_11_0_x86_64.whl", hash = "sha256:e920d6f805a525f19e770e48326a5f96b83d7b8dfd093f5b7015b54ef84bcf4c"},
-{file = "pymssql-2.2.8-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:2446645eb8684c0cb246a3294110455dd89a29608dfa7a58ea88aa42aa1cf005"},
-{file = "pymssql-2.2.8-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3906993300650844ec140aa58772c0f5f3e9e9d5709c061334fd1551acdcf066"},
-{file = "pymssql-2.2.8-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:7309c7352e4a87c9995c3183ebfe0ff4135e955bb759109637673c61c9f0ca8d"},
-{file = "pymssql-2.2.8-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:9b8d603cc1ec7ae585c5a409a1d45e8da067970c79dd550d45c238ae0aa0f79f"},
-{file = "pymssql-2.2.8-cp38-cp38-win_amd64.whl", hash = "sha256:293cb4d0339e221d877d6b19a1905082b658f0100a1e2ccc9dda10de58938901"},
-{file = "pymssql-2.2.8-cp39-cp39-macosx_11_0_x86_64.whl", hash = "sha256:895041edd002a2e91d8a4faf0906b6fbfef29d9164bc6beb398421f5927fa40e"},
-{file = "pymssql-2.2.8-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:6b2d9c6d38a416c6f2db36ff1cd8e69f9a5387a46f9f4f612623192e0c9404b1"},
-{file = "pymssql-2.2.8-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d63d6f25cf40fe6a03c49be2d4d337858362b8ab944d6684c268e4990807cf0c"},
-{file = "pymssql-2.2.8-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:c83ad3ad20951f3a94894b354fa5fa9666dcd5ebb4a635dad507c7d1dd545833"},
-{file = "pymssql-2.2.8-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:3933f7f082be74698eea835df51798dab9bc727d94d3d280bffc75ab9265f890"},
-{file = "pymssql-2.2.8-cp39-cp39-win_amd64.whl", hash = "sha256:de313375b90b0f554058992f35c4a4beb3f6ec2f5912d8cd6afb649f95b03a9f"},
-{file = "pymssql-2.2.8.tar.gz", hash = "sha256:9baefbfbd07d0142756e2dfcaa804154361ac5806ab9381350aad4e780c3033e"},
+{file = "pymssql-2.3.1-cp310-cp310-macosx_13_0_x86_64.whl", hash = "sha256:001b3321a5f620b80d1427933fcca11b05f29a808d7772a84d18d01e640ee60a"},
+{file = "pymssql-2.3.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:15466dd41be5e32302f0c4791f612aadd608a0e6ec0b10d769e76cbb4c86aa97"},
+{file = "pymssql-2.3.1-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:74349040d4ff6f05894aefb5109ecffcd416e1e366d9951085d3225a9d09c46b"},
+{file = "pymssql-2.3.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc79dbe5eca8825b73830c8bb147b6f588300dc7510393822682162dc4ff003f"},
+{file = "pymssql-2.3.1-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:0b93ebe2feb45e772ca708bc4cd70f3e4c72796ec1b157fd5d80cdc589c786aa"},
+{file = "pymssql-2.3.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:44b1c8752c0fc6750902c1c521f258bdf4271bfbf7b2a5fee469b6ad00631aab"},
+{file = "pymssql-2.3.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fdfadb055a9ecad58356decfecc41626999ad7b548cc7ea898cf159e2217f7bb"},
+{file = "pymssql-2.3.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:46f1074c6763e9a899128f22a0f72e9fb0035535f48efabd6a294db1c149e6f1"},
+{file = "pymssql-2.3.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ebb11b61d99ec5bbe0b8c411ff748a90263cdaf474881de231da8184e721c42c"},
+{file = "pymssql-2.3.1-cp310-cp310-win32.whl", hash = "sha256:2ef07fdee3e9652d39b4c081c5c5e1a1031abd122b402ed66813bceb3874ccea"},
+{file = "pymssql-2.3.1-cp310-cp310-win_amd64.whl", hash = "sha256:791522339215cb7f88db54c831a2347e0c4d69dd3092a343eea5b9339adf4412"},
+{file = "pymssql-2.3.1-cp311-cp311-macosx_13_0_universal2.whl", hash = "sha256:0433ffa1c86290a93e81176f377621cb70405be66ade8f3070d3f5ec9cfebdba"},
+{file = "pymssql-2.3.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6182d82ebfbe46f0e7748d068c6a1c16c0f4fe1f34f1c390f63375cee79b44b0"},
+{file = "pymssql-2.3.1-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cfbe07dcf0aaee8ce630624669cb2fb77b76743d4dd925f99331422be8704de3"},
+{file = "pymssql-2.3.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7d999c8e5d5d48e9305c4132392825de402f13feea15694e4e7103029b6eae06"},
+{file = "pymssql-2.3.1-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:2dced0a76d8e99c283103a2e3c825ca22c67f1f8fc5cff657510f4d2ffb9d188"},
+{file = "pymssql-2.3.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:880d3173025dea3babf5ab862875b3c76a5cf8df5b292418050c7793c651c0b2"},
+{file = "pymssql-2.3.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:9f89c698e29ce5c576e4980ded89c00b45e482ec02759bfbfc1aa326648cf64a"},
+{file = "pymssql-2.3.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:3f4f2a38ce6e39ed2414c20ca16deaea4340868033a4bb23d5e4e30c72290caf"},
+{file = "pymssql-2.3.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:e34e8aa1d3da555dbf23141b02f401267c0be32104b4f030afd0bae62d26d735"},
+{file = "pymssql-2.3.1-cp311-cp311-win32.whl", hash = "sha256:72e57e20802bf97399e050a0760a4541996fc27bc605a1a25e48ca6fe4913c48"},
+{file = "pymssql-2.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:b5d3604bca2fa8d5ba2eed1582a3c8a83970a8d2edabfcfd87c1edecb7617d16"},
+{file = "pymssql-2.3.1-cp312-cp312-macosx_13_0_universal2.whl", hash = "sha256:c28f1b9560b82fe1a1e51d8c56f6d36bca7c507a8cdf2caa2a0642503c220d5c"},
+{file = "pymssql-2.3.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e3509b75747eb22ae89f3d47ae316a4b9eac7d952269e88b356ef117a1b8e3b8"},
+{file = "pymssql-2.3.1-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cca3bed27e1ab867e482fa8b529d408489ad57e8b60452f75ef288da90573db6"},
+{file = "pymssql-2.3.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4fe3276915e6040daec409203e3143aa2826984adb8d223c155dab91010110a4"},
+{file = "pymssql-2.3.1-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:d36d566d0d6997c95442c3d2902800e6b072ccc017c6284e5b1bd4e17dc8fada"},
+{file = "pymssql-2.3.1-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:3564df40a678623a769acd9677dc68228b2694170132c6f296eb62bf766d31e4"},
+{file = "pymssql-2.3.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:3dbd4106faabf97f028d0ac59b30d132cfb5e48cf5314b0476f293123dbf3422"},
+{file = "pymssql-2.3.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:acd1690d9b1b2ece9d0e1fd7d68571fc9fa56b6ba8697a3132446419ff7fb3f4"},
+{file = "pymssql-2.3.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:126e0b78773975136e6385da7286c277e2e0320c1f4bee0e4dc61a5edcf98c41"},
+{file = "pymssql-2.3.1-cp312-cp312-win32.whl", hash = "sha256:21803b731b8c8780fc974d9b4931fa8f1ca29c227502a4c317e12773c8bdef43"},
+{file = "pymssql-2.3.1-cp312-cp312-win_amd64.whl", hash = "sha256:6b0224fc5ce4cf0703278859f145e3e921c04d9feb59739a104d3020bbf0c0c1"},
+{file = "pymssql-2.3.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:709c1df3134e330ee9590437253be363b558154bde5bb54856fc5fe68a03c971"},
+{file = "pymssql-2.3.1-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b9381eafaf529815f2d61f22b99e0538e744b31234f17d4384f5b0496bd1fbed"},
+{file = "pymssql-2.3.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c3bf78789014f202855f5d00de982bbcd95177fe8bcf920f0ce730b72456c173"},
+{file = "pymssql-2.3.1-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:4b44280eedd0a3f031e9464d4fc632a215fadcfb375bb479065b61a6337df402"},
+{file = "pymssql-2.3.1-cp313-cp313-musllinux_1_1_i686.whl", hash = "sha256:922f536b925880c260968c8f2130b1c9d6315b83f300f18365b5421933f034a2"},
+{file = "pymssql-2.3.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:f00f618d1c0f58617de548e5094f7d55ab6034b94068d7eebba60a034866b10b"},
+{file = "pymssql-2.3.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:b363db86a1a3fe16df9b4253e17b02a268d0f2e2753679b8e85cee268e2fe8c4"},
+{file = "pymssql-2.3.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:396a26cf576196cc4a3d77890b2b8eb62655ff02846288757dd8b587352cc4f5"},
+{file = "pymssql-2.3.1-cp36-cp36m-win32.whl", hash = "sha256:5a1a1c697596f23058697709144d00a44e7af6ecab6a517f2ecf28dcf8fb4280"},
+{file = "pymssql-2.3.1-cp36-cp36m-win_amd64.whl", hash = "sha256:4f92e8657d42341dce01f7f57d03f84b35c0ed00a7bef24533ff80a37ffcfb4e"},
+{file = "pymssql-2.3.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:095b50e43bfbc4d6f953810175ba275bb3e6136206f3a7146bdd1031e3f0dd9b"},
+{file = "pymssql-2.3.1-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:47ac89098732c327725b53464932c6a532367271a3d5c5a988f61e23e0e0e286"},
+{file = "pymssql-2.3.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2f284fc052cf1dbc702a2f4d13442d87fc6847ba9054faccfc8d8446fcf00894"},
+{file = "pymssql-2.3.1-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:40778b65c09eef9e7c25c444b96e76f81d8b5cf1828cb555123d052b7d3b5661"},
+{file = "pymssql-2.3.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:22c8609bc7f8b13d383729ba09042b4d796a607c93779c616be51b37caa6b384"},
+{file = "pymssql-2.3.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:ab2aea2ae8bc1aba0105fccbf9e4f6716648b2b8f9421fd3418c6cc798fca43e"},
+{file = "pymssql-2.3.1-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:e594de69832ad13761412f4d5c981a6e5d931b22f25136c8cd3531d9c6cfdf63"},
+{file = "pymssql-2.3.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:68f879b4ec4b2191a1d8b3bb24db04c3631737653785369c275bd5a574e54093"},
+{file = "pymssql-2.3.1-cp37-cp37m-musllinux_1_2_i686.whl", hash = "sha256:9ef157e63a1c19e7ab4823237b5f03a3bca45e1e94a4d5ed73baab6d019830c7"},
+{file = "pymssql-2.3.1-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:66afe6ee539e37cdfea0c6b2d596ec0d2a6223f09450c4df7cf872bad12691fe"},
+{file = "pymssql-2.3.1-cp37-cp37m-win32.whl", hash = "sha256:b9cc14a9f63e632200f54311da9868ece2715fa9560f6272c9bb82c57edc0543"},
+{file = "pymssql-2.3.1-cp37-cp37m-win_amd64.whl", hash = "sha256:54bc10f28c0acc1347d3c7056e702ad21f128e6bf7737b4edc8c267372db9ce8"},
+{file = "pymssql-2.3.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c8d955e751fb125be2a8513b5a338457a3fe73e5daa094815f96a86e496f7149"},
+{file = "pymssql-2.3.1-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4c13ca6eaf0d7f16af9edf87d58070329bfacb7f27b90e1de16318d64c7b873b"},
+{file = "pymssql-2.3.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5ecb0cdea24e2c019fb403fd642c04a64e8767c79f8dd38451eb5d72ceffce34"},
+{file = "pymssql-2.3.1-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:afd57a728e81d73a0f43f3d28216c402fea03bd06a382da881dfc8215fb4080d"},
+{file = "pymssql-2.3.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:5e6f6d9de73309cda602bbb769cb707f08d6899664f3ac6e9ed3e3b1ad472cee"},
+{file = "pymssql-2.3.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:02b808dbb86bbe751dd3fd117e83926b0a19ca9d9b833fae945bf2e31be66bf6"},
+{file = "pymssql-2.3.1-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:b0f1ba9befe23e6c4e75c2a626ffe59d159ab3a425a0208515888ec8670bf5bf"},
+{file = "pymssql-2.3.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:8eecb4f3b41b8b29a0cbe502ae37b6477063d690151f668c410328f101f6198b"},
+{file = "pymssql-2.3.1-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:a36c8b089e2d7b606aee823eefdfd72f5df110241fc5d913094b0b9da2692794"},
+{file = "pymssql-2.3.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:425de7d3f38cd1867c30b7c352d66020f38fdcdf804282ee232f5e25672930c1"},
+{file = "pymssql-2.3.1-cp38-cp38-win32.whl", hash = "sha256:ce397eb6a2a90fcd2a83d8812c1b8752af3b5362e630da49aa556c947e32ce3d"},
+{file = "pymssql-2.3.1-cp38-cp38-win_amd64.whl", hash = "sha256:02c4ab7a58bfb57edb2deee7e2aceed2512960e7c2c1fd2cb23c647471a36ba2"},
+{file = "pymssql-2.3.1-cp39-cp39-macosx_13_0_x86_64.whl", hash = "sha256:750078568dafc1e0a24cf0f51eecfe548b13440976a2c8b19cc6e5d38e7b10bc"},
+{file = "pymssql-2.3.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5a651dd98f67eef98f429c949fb50ea0a92fcf8668834cc35909237c24c1b906"},
+{file = "pymssql-2.3.1-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a1ecedaeec8f4d8643d088b4985f0b742d9669bff701153a845b0d1900260b81"},
+{file = "pymssql-2.3.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:015f6ccd1bcb53f22a3226653d0d8155da40f4afbc1fd0cec25de5fe8decf126"},
+{file = "pymssql-2.3.1-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:da44761ca2f996d88f90c0f972b583dfe9c389db84888bd8209cdb83508f7c7a"},
+{file = "pymssql-2.3.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9557b738475e06dfd53f97d8a2c2b259b9b9fd79bf1a4e084ae4e9f164be644d"},
+{file = "pymssql-2.3.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:a1f3f2e2792364a50417f3c2dc0d8f125955c1b641f36eb313daf666045b9748"},
+{file = "pymssql-2.3.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:be8af4dea025f171ffb1e5b17cb0c9cbc92b0e3c32d0517bc678fff6f660e5fb"},
{file = "pymssql-2.3.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a87950fb1a2b1c4028064fac971f3e191adebb58657ca985330f70e02f95223e"},
{file = "pymssql-2.3.1-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:9ea04bf8e13d567650631a944c88886c99a5622d9491e896a9b5a9ffbef2e352"},
{file = "pymssql-2.3.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:4d93a82f8ad7d3606354b81bbbe7e7832f70fd6e9ccb2e04a2975117da5df973"},
{file = "pymssql-2.3.1-cp39-cp39-win32.whl", hash = "sha256:6a2657152d4007314b66f353a25fc2742155c2770083320b5255fc576103661e"},
{file = "pymssql-2.3.1-cp39-cp39-win_amd64.whl", hash = "sha256:6c9ffb3ef110bf0fc2a41c845f231cf749162b1d71e02b0aceb6c0ebc603e2e9"},
{file = "pymssql-2.3.1.tar.gz", hash = "sha256:ddee15c4c193e14c92fe2cd720ca9be1dba1e0f4178240380b8f5f6f00da04c6"},
]

[[package]]

@@ -3793,12 +3879,84 @@ cli = ["click (>=5.0)"]

[[package]]
name = "python-rapidjson"
-version = "1.1"
+version = "1.20"
description = "Python wrapper around rapidjson"
optional = false
python-versions = ">=3.6"
files = [
-{file = "python-rapidjson-1.1.tar.gz", hash = "sha256:9353a5eeb23a43556fa382ff94b3f6d67c663e31a2cfd220268c13e3f848fddc"},
+{file = "python_rapidjson-1.20-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:eeaa8487fdd8db409bd2e0c41c59cee3b9f1d08401fc75520f7d35c7a22d8789"},
{file = "python_rapidjson-1.20-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:425c2bb8e778a04497953482c251944b2736f61012d897f17b73da3eca060c27"},
{file = "python_rapidjson-1.20-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4f7cbbff9696ea01dd8a29502cb314471c9a5d4239f2f3b7e35b6adbde2cc620"},
{file = "python_rapidjson-1.20-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:83a48f96d0abb8349a4d42f029259b755d8c6fd347f5de2d640e164c3f45e63b"},
{file = "python_rapidjson-1.20-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6cb3ad353ec083a6dcf0552f1fce3c490f92e2fccf9a81eac42835297a8431a1"},
{file = "python_rapidjson-1.20-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7f7b6574887d8828f34eb3384092d6e6c290e8fbb12703c409dbdde814612657"},
{file = "python_rapidjson-1.20-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:403e4986484f01f79fdce00b48c12a1b39d16e822cd37c60843ab26455ab0680"},
{file = "python_rapidjson-1.20-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e3f89a58d7709d5879586e9dbfd11be76a799e8fbdbb5eddaffaeba9b572fba3"},
{file = "python_rapidjson-1.20-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:b0d07d4f0ebbb2228d5140463f11ac519147b9d791f7e40b3edf518a806be3cc"},
{file = "python_rapidjson-1.20-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:a5fb413414b92763a54d53b732df3c9de1b114012c8881a3d1215a19b9fca494"},
{file = "python_rapidjson-1.20-cp310-cp310-win32.whl", hash = "sha256:9831430f17101a6a249e07db9c42d26c3263e6009450722cce0c14726421f434"},
{file = "python_rapidjson-1.20-cp310-cp310-win_amd64.whl", hash = "sha256:fbff5caf127c5bed4d6620f95a039dd9e293784d844af50782aaf278a743acb4"},
{file = "python_rapidjson-1.20-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:328095d6d558090c29d24d889482b10dcc3ade3b77c93a61ea86794623046628"},
{file = "python_rapidjson-1.20-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:fc7a095f77eb3bb6acff94acf868a100faaf06028c4b513428f161cd55030476"},
{file = "python_rapidjson-1.20-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ce4cee141c924300cbedba1e5bea05b13484598d1e550afc5b50209ba73c62f2"},
{file = "python_rapidjson-1.20-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4355bcfc8629d15f6246011b40e84cc368d842518a91adb15c5eba211305ee5b"},
{file = "python_rapidjson-1.20-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7dd9c5e661d17eafa44b2875f6ce55178cc87388575ce3cd3c606d5a33772b49"},
{file = "python_rapidjson-1.20-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bd978c7669cc844f669a48d2a6019fb9134a2385536f806fe265a1e374c3573a"},
{file = "python_rapidjson-1.20-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8fc52405435ce875aa000afa2637ea267eb0d4ab9622f9b97c92d92cb1a9c440"},
{file = "python_rapidjson-1.20-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:bef1eca712fb9fd5d2edd724dd1dd8a608215d6afcaee4f351b3e99e3f73f720"},
{file = "python_rapidjson-1.20-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:6355cb690bf64629767206524d4d00da909970d46d8fc0b367f339975e4eb419"},
{file = "python_rapidjson-1.20-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f974c4e11be833221062fc4c3129bed172082792b33ef9fc1b8104f49c514f1d"},
{file = "python_rapidjson-1.20-cp311-cp311-win32.whl", hash = "sha256:06ee7bcf660ebbdf1953aa7bf74214b722d934928c7b9f2a23b12e0713b61fa4"},
{file = "python_rapidjson-1.20-cp311-cp311-win_amd64.whl", hash = "sha256:9df543521fa4b69589c42772b2f32a6c334b3b5fc612cd6dc3705136d0788da3"},
{file = "python_rapidjson-1.20-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:6056fcc8caeb9b04775bf655568bba362c7670ab792c1b438671bb056db954cd"},
{file = "python_rapidjson-1.20-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:225bd4cbabfe7910261cbcebb8b811d4ff98e90cdd17c233b916c6aa71a9553f"},
{file = "python_rapidjson-1.20-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:026077b663acf93a3f2b1adb87282e611a30214b8ae8001b7e4863a3b978e646"},
{file = "python_rapidjson-1.20-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:884e1dd4c0770ed424737941af4d5dc9014995f9c33595f151af13f83ce282c3"},
{file = "python_rapidjson-1.20-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5f55531c8197cb7a21a5ef0ffa46f2b8fc8c5fe7c6fd08bdbd2063ae65d2ff65"},
{file = "python_rapidjson-1.20-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c60121d155562dc694c05ed7df4e39e42ee1d3adff2a060c64a004498e6451f7"},
{file = "python_rapidjson-1.20-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3a6620eed0b04196f37fab7048c1d672d03391bb29d7f09ee8fee8dea33f11f4"},
{file = "python_rapidjson-1.20-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:ddb63eff401ce7cf20cdd5e21942fc23fbe0e1dc1d96d7ae838645fb1f74fb47"},
{file = "python_rapidjson-1.20-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:05e28c3dbb4a0d74ec13af9668ef2b9f302edf83cf7ce1d8316a95364720eec0"},
{file = "python_rapidjson-1.20-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b733978ecd84fc5df9a778ce821dc1f3113f7bfc2493cac0bb17efb4ae0bb8fa"},
{file = "python_rapidjson-1.20-cp312-cp312-win32.whl", hash = "sha256:d87041448cec00e2db5d858625a76dc1b59eef6691a039acff6d92ad8581cfc1"},
{file = "python_rapidjson-1.20-cp312-cp312-win_amd64.whl", hash = "sha256:5d3be149ce5475f9605f01240487541057792abad94d3fd0cd56af363cf5a4dc"},
{file = "python_rapidjson-1.20-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:daee815b4c20ca6e4dbc6bde373dd3f65b53813d775f1c94b765b33b402513a7"},
{file = "python_rapidjson-1.20-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:083df379c769b30f9bc40041c91fd9d8f7bb8ca2b3c7170258842aced2098e05"},
{file = "python_rapidjson-1.20-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b9399ad75a2e3377f9e6208caabe73eb9354cd01b732407475ccadcd42c577df"},
{file = "python_rapidjson-1.20-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:599ab208ccf6172d6cfac1abe048c837e62612f91f97d198e32773c45346a0b4"},
{file = "python_rapidjson-1.20-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf3c0e2a5b97b0d07311f15f0dce4434e43dec865c3794ad1b10d968460fd665"},
{file = "python_rapidjson-1.20-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e8064b8edb57ddd9e3ffa539cf2ec2f03515751fb0698b40ba5cb66a2123af19"},
{file = "python_rapidjson-1.20-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:bc79d7f00f7538e027960ca6bcd1e03ed99fcf660d4d882d1c22f641155d0db0"},
{file = "python_rapidjson-1.20-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:87aa0b01b8c20984844f1440b8ff6bdb32de911a1750fed344b9daed33b4b52b"},
{file = "python_rapidjson-1.20-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:4099cb9eae8a0ce19c09e02729eb6d69d5180424f13a2641a6c407d053e47a82"},
{file = "python_rapidjson-1.20-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4c680cd2b4de760ff6875de71fe6a87bd610aa116593d62e4f81a563be86ae18"},
{file = "python_rapidjson-1.20-cp313-cp313-win32.whl", hash = "sha256:9e431a7afc77aa874fed537c9f6bf5fcecaef124ebeae2a2379d3b9e9adce74b"},
{file = "python_rapidjson-1.20-cp313-cp313-win_amd64.whl", hash = "sha256:7444bc7e6a04c03d6ed748b5dab0798fa2b3f2b303be8c38d3af405b2cac6d63"},
{file = "python_rapidjson-1.20-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:69e702fe74fe8c44c6253bb91364a270dc49f704920c90e01040155bd600a5fd"},
{file = "python_rapidjson-1.20-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b9496b1e9d6247e8802ac559b7eebb5f3cae426d1c1dbde4049c63dff0941370"},
{file = "python_rapidjson-1.20-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1446e902b6c781f271bf8556da636c1375cbb208e25f92e1af4cc2d92cf0cf15"},
{file = "python_rapidjson-1.20-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:368ecdf4031abbde9c94aac40981d9a1238e6bcfef9fbfee441047b4757d6033"},
{file = "python_rapidjson-1.20-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:924f9ea302494d4a4d540d3509f8f1f15622ea7d614c6f29df3188d52c6cb546"},
{file = "python_rapidjson-1.20-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:632acb2dfa29883723e24bb2ce47c726edd5f672341553a5184db68f78d3bd09"},
{file = "python_rapidjson-1.20-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:c2f85da53286e67778d4061ef32ff44ca9b5f945030463716e046ee8985319f8"},
{file = "python_rapidjson-1.20-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:c05c8602c019cc0db19601fdc4927755a9d33f21d01beb3d5767313d7a81360d"},
{file = "python_rapidjson-1.20-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:7d36aab758bfb1b59e0a849cd20e971eda951a04d3586bb5f6cb460bfc7c103d"},
{file = "python_rapidjson-1.20-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:e5774c905034362298312116f9b58c181e91a09800e4e5cede7b3d460a6a9fde"},
{file = "python_rapidjson-1.20-cp38-cp38-win32.whl", hash = "sha256:488d0c6155004b5177225eaf331bb1838616da05ae966dd24a7d442751c1d193"},
{file = "python_rapidjson-1.20-cp38-cp38-win_amd64.whl", hash = "sha256:00183c4938cd491b98b1a43626bc5a381842ceba87644cb91b25555f3fc3c0bf"},
{file = "python_rapidjson-1.20-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:f510ffe32fec319699f0c1ea9cee5bde47c33202b034b85c5d1b9ace682aa96a"},
{file = "python_rapidjson-1.20-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a2b624b3613fb7b8dfef4adc709bf39489be8c655cd9d24dc4e2cc16fc5def83"},
{file = "python_rapidjson-1.20-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6c9f813a37d1f708a221f1f7d8c97c437d10597261810c1d3b52cf8f248d66c0"},
{file = "python_rapidjson-1.20-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0c3f7085c52259c56af72462df7620c3b8bb95575fd9b8c3a073728855e93269"},
{file = "python_rapidjson-1.20-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:871f2eeb0907f3d7ab09efe04c5b5e2886c275ea568f7867c97468ae14cdd52f"},
{file = "python_rapidjson-1.20-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b7c0408e7f52f32cf4bdd5aa305f005914b0143cac69d42575e2d40e8678cd72"},
{file = "python_rapidjson-1.20-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:ec17a18df700e1f956fc5a0c41cbb3cc746c44c0fef38988efba9b2cb607ecfa"},
{file = "python_rapidjson-1.20-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:1c0303bd445312a78485a9adba06dfdb84561c5157a9cda7999fefb36df4c6cc"},
{file = "python_rapidjson-1.20-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:303b079ef268a996242be51ae80c8b563ee2d73489ab4f16199fef2216e80765"},
{file = "python_rapidjson-1.20-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5adcef7a27abafbb2b3d0b02c822dfd9b4b329769cb97810b7f9733e1fda0498"},
{file = "python_rapidjson-1.20-cp39-cp39-win32.whl", hash = "sha256:3e963e78fff6ab5ab2ae847b65683774c48b9b192307380f2175540d6423fd73"},
{file = "python_rapidjson-1.20-cp39-cp39-win_amd64.whl", hash = "sha256:1fc3bba6632ecffeb1897fdf98858dc50a677237f4241853444c70a041158a90"},
{file = "python_rapidjson-1.20.tar.gz", hash = "sha256:115f08c86d2df7543c02605e77c84727cdabc4b08310d2f097e953efeaaa73eb"},
]

[[package]]
@@ -4107,13 +4265,13 @@ requests = ">=2.0.1,<3.0.0"
[[package]]
name = "restrictedpython"
-version = "6.2"
+version = "7.3"
description = "RestrictedPython is a defined subset of the Python language which allows to provide a program input into a trusted environment."
optional = false
-python-versions = ">=3.6, <3.12"
+python-versions = "<3.13,>=3.7"
files = [
-{file = "RestrictedPython-6.2-py3-none-any.whl", hash = "sha256:7c2ffa4904300d67732f841d8a975dcdc53eba4c1cdc9d84b97684ef12304a3d"},
-{file = "RestrictedPython-6.2.tar.gz", hash = "sha256:db73eb7e3b39650f0d21d10cc8dda9c0e2986e621c94b0c5de32fb0dee3a08af"},
+{file = "RestrictedPython-7.3-py3-none-any.whl", hash = "sha256:40a6170bbcfc48b32962831d9281a61608c8e56e7c02fd8e2397225f516a6ed4"},
+{file = "RestrictedPython-7.3.tar.gz", hash = "sha256:8888304c7858fdcfd86c50b58561797375ba40319d2b6ffb5d24b08b6a2dcd61"},
]

[package.extras]
@@ -4282,13 +4440,13 @@ files = [
[[package]]
name = "sentry-sdk"
-version = "1.28.1"
+version = "1.45.1"
description = "Python client for Sentry (https://sentry.io)"
optional = false
python-versions = "*"
files = [
-{file = "sentry-sdk-1.28.1.tar.gz", hash = "sha256:dcd88c68aa64dae715311b5ede6502fd684f70d00a7cd4858118f0ba3153a3ae"},
-{file = "sentry_sdk-1.28.1-py2.py3-none-any.whl", hash = "sha256:6bdb25bd9092478d3a817cb0d01fa99e296aea34d404eac3ca0037faa5c2aa0a"},
+{file = "sentry_sdk-1.45.1-py2.py3-none-any.whl", hash = "sha256:608887855ccfe39032bfd03936e3a1c4f4fc99b3a4ac49ced54a4220de61c9c1"},
+{file = "sentry_sdk-1.45.1.tar.gz", hash = "sha256:a16c997c0f4e3df63c0fc5e4207ccb1ab37900433e0f72fef88315d317829a26"},
]

[package.dependencies]
@@ -4298,10 +4456,13 @@ urllib3 = {version = ">=1.26.11", markers = "python_version >= \"3.6\""}
[package.extras]
aiohttp = ["aiohttp (>=3.5)"]
arq = ["arq (>=0.23)"]
+asyncpg = ["asyncpg (>=0.23)"]
beam = ["apache-beam (>=2.12)"]
bottle = ["bottle (>=0.12.13)"]
celery = ["celery (>=3)"]
+celery-redbeat = ["celery-redbeat (>=2)"]
chalice = ["chalice (>=1.16.0)"]
+clickhouse-driver = ["clickhouse-driver (>=0.2.0)"]
django = ["django (>=1.8)"]
falcon = ["falcon (>=1.4)"]
fastapi = ["fastapi (>=0.79.0)"]
@@ -4310,7 +4471,9 @@ grpcio = ["grpcio (>=1.21.1)"]
httpx = ["httpx (>=0.16.0)"]
huey = ["huey (>=2)"]
loguru = ["loguru (>=0.5)"]
+openai = ["openai (>=1.0.0)", "tiktoken (>=0.3.0)"]
opentelemetry = ["opentelemetry-distro (>=0.35b0)"]
+opentelemetry-experimental = ["opentelemetry-distro (>=0.40b0,<1.0)", "opentelemetry-instrumentation-aiohttp-client (>=0.40b0,<1.0)", "opentelemetry-instrumentation-django (>=0.40b0,<1.0)", "opentelemetry-instrumentation-fastapi (>=0.40b0,<1.0)", "opentelemetry-instrumentation-flask (>=0.40b0,<1.0)", "opentelemetry-instrumentation-requests (>=0.40b0,<1.0)", "opentelemetry-instrumentation-sqlite3 (>=0.40b0,<1.0)", "opentelemetry-instrumentation-urllib (>=0.40b0,<1.0)"]
pure-eval = ["asttokens", "executing", "pure-eval"]
pymongo = ["pymongo (>=3.1)"]
pyspark = ["pyspark (>=2.4.4)"]
@@ -4481,37 +4644,37 @@ files = [
[[package]]
name = "snowflake-connector-python"
-version = "3.12.0"
+version = "3.12.3"
description = "Snowflake Connector for Python"
optional = false
python-versions = ">=3.8"
files = [
-{file = "snowflake_connector_python-3.12.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:edf28df8be24845cfcec653b160d2b8c048d5cb0c85b051f4957f0b0aae1e493"},
+{file = "snowflake_connector_python-3.12.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:497a096fc379ef0846b2f1cf11a8d7620f0d090f08a77d9e93473845014d57d1"},
-{file = "snowflake_connector_python-3.12.0-cp310-cp310-macosx_11_0_x86_64.whl", hash = "sha256:c2bbdbbb028d7d542815ed68b28200728aa6707b9354e3a447fdc8c7a34bcdce"},
+{file = "snowflake_connector_python-3.12.3-cp310-cp310-macosx_11_0_x86_64.whl", hash = "sha256:055c5808d524497213e4cc9ae91ec3e46cb8342b314e78bc3e139d733dc16741"},
-{file = "snowflake_connector_python-3.12.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92c9a19a23033df709e63baa6ccdf6eff65210143a8c9c67a0a24bba862034b"},
+{file = "snowflake_connector_python-3.12.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4a5dc512d62ef693041ed2ad82931231caddc16e14ffc2842da3e3dd4240b83d"},
-{file = "snowflake_connector_python-3.12.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8d33d845e4c68d33e73a9f64100b53342c18607ac25c4f2a27dbed2078078d12"},
+{file = "snowflake_connector_python-3.12.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a46448f7279d444084eb84a9cddea67662e80ccfaddf41713b9e9aab2b1242e9"},
-{file = "snowflake_connector_python-3.12.0-cp310-cp310-win_amd64.whl", hash = "sha256:c1d43bfaa885aab712f14f9ced232abe5023adfca7fbf7a7a0768a162523e9d6"},
+{file = "snowflake_connector_python-3.12.3-cp310-cp310-win_amd64.whl", hash = "sha256:821b774b77129ce9f03729456ac1f21d69fedb50e5ce957178131c7bb3d8279f"},
-{file = "snowflake_connector_python-3.12.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6a0cc03fb44808f3ddc464ee272f141564c8daea14475e1df5c2a54c7acb2ddf"},
+{file = "snowflake_connector_python-3.12.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:82290134978d11628026b447052219ce8d880e36937204f1f0332dfc3f2e92e9"},
-{file = "snowflake_connector_python-3.12.0-cp311-cp311-macosx_11_0_x86_64.whl", hash = "sha256:564752d22accc43351b50f676b03aa9f2b441be2641e3cf9a7790faf54eff210"},
+{file = "snowflake_connector_python-3.12.3-cp311-cp311-macosx_11_0_x86_64.whl", hash = "sha256:20b5c8000ee9cee11b0f9a6ae26640f0d498ce77f7e2ec649a2f0d306523792d"},
-{file = "snowflake_connector_python-3.12.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27d6a1a180832c7b551d38df1094a70fb79917f90c57893b9ce7e219362f6c1"},
+{file = "snowflake_connector_python-3.12.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca6500d16bdbd37da88e589cc3e82b90272471d3aabfe4a79ec1cf4696675acf"},
-{file = "snowflake_connector_python-3.12.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:60675fd83022daef40541d717d006695149c512b283e35741b61a4f48ba537e9"},
+{file = "snowflake_connector_python-3.12.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b455ba117a68da436e253899674fae1a93669eaefdde8a903c03eb65b7e87c86"},
-{file = "snowflake_connector_python-3.12.0-cp311-cp311-win_amd64.whl", hash = "sha256:a567b937b0179d1e95a8ad7200943d286f38d0e76df90af10f747ed9149dd681"},
+{file = "snowflake_connector_python-3.12.3-cp311-cp311-win_amd64.whl", hash = "sha256:205219fcaeee2d33db5d0d023d60518e3bd8272ce1679be2199d7f362d255054"},
-{file = "snowflake_connector_python-3.12.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:dc333fcfc383a8cab8bd7e890a7c76703e26598925a05954c75d2c50bff06071"},
+{file = "snowflake_connector_python-3.12.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3d830ca32c864b730cba5d92900d850752199635c4fb0ae0a70ee677f62aee70"},
-{file = "snowflake_connector_python-3.12.0-cp312-cp312-macosx_11_0_x86_64.whl", hash = "sha256:3c06bfba4a329fd4ec3feba0ada7b31f86ed4e156a9766bced52c2814d001fd2"},
+{file = "snowflake_connector_python-3.12.3-cp312-cp312-macosx_11_0_x86_64.whl", hash = "sha256:597b0c74ec57ba693191ae2de8db9536e349ee32cab152df657473e498b6fd87"},
-{file = "snowflake_connector_python-3.12.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:acf84b07dd2f22adfaa7d52ccd6be1722bd5a0e2b1a9b08681c3851bea05768f"},
+{file = "snowflake_connector_python-3.12.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2215d8a4c5e25ea0d2183fe693c3fdf058cd6035e5c84710d532dc04ab4ffd31"},
-{file = "snowflake_connector_python-3.12.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:019b8a61e5af689451d502df2af8793fc6f20b5b0a3548fd8ad03aa8b62e7f2d"},
+{file = "snowflake_connector_python-3.12.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f8ba9c261904c1ba7cae6035c7881224cf979da39c8b7c7cb10236fdfc57e505"},
-{file = "snowflake_connector_python-3.12.0-cp312-cp312-win_amd64.whl", hash = "sha256:45f9b9678694f10571c1f7ec7d0d741663ad0ff61a71ae53aa71be47faa19978"},
+{file = "snowflake_connector_python-3.12.3-cp312-cp312-win_amd64.whl", hash = "sha256:f0d0fcb948ef0812ab162ec9767622f345554043a07439c0c1a9474c86772320"},
-{file = "snowflake_connector_python-3.12.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:21cbaef51fbed719de01155079df3d004cee963d3723c1ebdb8980923f893e04"},
+{file = "snowflake_connector_python-3.12.3-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:fe742a0b2fb1c79a21e95b97c49a05783bc00314d1184d227c5fe5b57688af12"},
-{file = "snowflake_connector_python-3.12.0-cp38-cp38-macosx_11_0_x86_64.whl", hash = "sha256:c86d4a7d49f42ea0bb34218cb49c401ba995892abcfb509ea749cd0a74a8b28a"},
+{file = "snowflake_connector_python-3.12.3-cp38-cp38-macosx_11_0_x86_64.whl", hash = "sha256:a8584a44a6bb41d2056cf1b833e629c76e28c5303d2c875c1a23bda46a1cd43a"},
-{file = "snowflake_connector_python-3.12.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1aa34aec0f96d7fc7271e38c68ee0d58529875d05e084afb4fc8f09b694643c4"},
+{file = "snowflake_connector_python-3.12.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dd990db8e4886c32ba5c63758e8dc4814e2e75f5fd3fe79d43f7e5ee0fc46793"},
-{file = "snowflake_connector_python-3.12.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c2f621030b26a220711c64518e00059736b79c1da53afa6a8ce68b31c1941014"},
+{file = "snowflake_connector_python-3.12.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4fe7f91f6e44bda877e77403a586d7487ca2c52dc1a32a705b2fea33f9c763a"},
-{file = "snowflake_connector_python-3.12.0-cp38-cp38-win_amd64.whl", hash = "sha256:368e46f1d079056e028bfe8f7171fabef62eb00bcf590df294220b7a5be5d56c"},
+{file = "snowflake_connector_python-3.12.3-cp38-cp38-win_amd64.whl", hash = "sha256:4994e95eff593dc44c28243ef0ae8d27b8b1aeb96dd64cbcea5bcf0e4dfb77fb"},
-{file = "snowflake_connector_python-3.12.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:2735e16fffded0900f7484030613b79699afc1ed4e5cff086bd139a0ce965594"},
+{file = "snowflake_connector_python-3.12.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:ac33a7dd54b35f94c4b91369971dbd6467a914dff4b01c46e77e7e6901d7eca4"},
-{file = "snowflake_connector_python-3.12.0-cp39-cp39-macosx_11_0_x86_64.whl", hash = "sha256:c06a8e2e12284b4a4d462d0073fb4983e90ad2d6a2382926f9e3409f06c81d0b"},
+{file = "snowflake_connector_python-3.12.3-cp39-cp39-macosx_11_0_x86_64.whl", hash = "sha256:a26876322811fe2b93f6d814dcfe016f1df680a12624026ecf57a6bcdf20f969"},
-{file = "snowflake_connector_python-3.12.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:880e6e95171cd7374a86da14132fdfc4b622665f134561f4d43e3f35bdacf67d"},
+{file = "snowflake_connector_python-3.12.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6c0bb390be2e15b6b7cccab7fbe1ef94e1e9ab13790c974aa44761298cdc2641"},
-{file = "snowflake_connector_python-3.12.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6e245b84c164433454ce49d78e6bcf5c2e62e25657358bf34ab533166e588f80"},
+{file = "snowflake_connector_python-3.12.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e7340f73af4ae72e6af8fe28a1b8e196a0c99943071afc96ce419efb4da80035"},
-{file = "snowflake_connector_python-3.12.0-cp39-cp39-win_amd64.whl", hash = "sha256:85a5565b8813d164f33f32a825a70443008fe009aae050307f128a1ca892f9ed"},
+{file = "snowflake_connector_python-3.12.3-cp39-cp39-win_amd64.whl", hash = "sha256:c314749bd0151218b654a7d4646a39067ab650bdc86dfebb1884b056b0bdb4b4"},
-{file = "snowflake_connector_python-3.12.0.tar.gz", hash = "sha256:320e0b6f8cd8556e19c8b87249c931700238b2958313afc7a33108d67da87d82"},
+{file = "snowflake_connector_python-3.12.3.tar.gz", hash = "sha256:02873c7f7a3b10322e28dddc2be6907f8ab8ecad93d6d6af14c77c2f53091b88"},
]
[package.dependencies]

@@ -4519,7 +4682,7 @@ asn1crypto = ">0.24.0,<2.0.0"
certifi = ">=2017.4.17"
cffi = ">=1.9,<2.0.0"
charset-normalizer = ">=2,<4"
-cryptography = ">=3.1.0,<43.0.0"
+cryptography = ">=3.1.0"
filelock = ">=3.5,<4"
idna = ">=2.5,<4"
packaging = "*"
@@ -4991,13 +5154,13 @@ six = ">=1.10.0"
[[package]]
name = "virtualenv"
-version = "20.25.0"
+version = "20.26.6"
description = "Virtual Python Environment builder"
optional = false
python-versions = ">=3.7"
files = [
-{file = "virtualenv-20.25.0-py3-none-any.whl", hash = "sha256:4238949c5ffe6876362d9c0180fc6c3a824a7b12b80604eeb8085f2ed7460de3"},
-{file = "virtualenv-20.25.0.tar.gz", hash = "sha256:bf51c0d9c7dd63ea8e44086fa1e4fb1093a31e963b86959257378aef020e1f1b"},
+{file = "virtualenv-20.26.6-py3-none-any.whl", hash = "sha256:7345cc5b25405607a624d8418154577459c3e0277f5466dd79c49d5e492995f2"},
+{file = "virtualenv-20.26.6.tar.gz", hash = "sha256:280aede09a2a5c317e409a00102e7077c6432c5a38f0ef938e643805a7ad2c48"},
]

[package.dependencies]

@@ -5006,7 +5169,7 @@ filelock = ">=3.12.2,<4"
platformdirs = ">=3.9.1,<5"

[package.extras]
-docs = ["furo (>=2023.7.26)", "proselint (>=0.13)", "sphinx (>=7.1.2)", "sphinx-argparse (>=0.4)", "sphinxcontrib-towncrier (>=0.2.1a0)", "towncrier (>=23.6)"]
+docs = ["furo (>=2023.7.26)", "proselint (>=0.13)", "sphinx (>=7.1.2,!=7.3)", "sphinx-argparse (>=0.4)", "sphinxcontrib-towncrier (>=0.2.1a0)", "towncrier (>=23.6)"]
test = ["covdefaults (>=2.3)", "coverage (>=7.2.7)", "coverage-enable-subprocess (>=1)", "flaky (>=3.7)", "packaging (>=23.1)", "pytest (>=7.4)", "pytest-env (>=0.8.2)", "pytest-freezer (>=0.4.8)", "pytest-mock (>=3.11.1)", "pytest-randomly (>=3.12)", "pytest-timeout (>=2.1)", "setuptools (>=68)", "time-machine (>=2.10)"]

[[package]]
@@ -5330,4 +5493,4 @@ testing = ["coverage (>=5.0.3)", "zope.event", "zope.testing"]
[metadata]
lock-version = "2.0"
python-versions = ">=3.8,<3.11"
-content-hash = "2f392e4b1cf2dd6c455462028ce8347e698a13a1b26ebe8449d71800bb925f25"
+content-hash = "93b13c8a960e148463fba93cfd826c0f3e7bd822bbda55af7ba708baead293df"


@@ -12,7 +12,7 @@ force-exclude = '''
[tool.poetry]
name = "redash"
-version = "24.08.1-dev"
+version = "25.03.0-dev"
description = "Make Your Company Data Driven. Connect to any data source, easily visualize, dashboard and share your data."
authors = ["Arik Fraimovich <arik@redash.io>"]
# to be added to/removed from the mailing list, please reach out to Arik via the above email or Discord
@@ -29,7 +29,7 @@ authlib = "0.15.5"
backoff = "2.2.1"
blinker = "1.6.2"
click = "8.1.3"
-cryptography = "42.0.8"
+cryptography = "43.0.1"
disposable-email-domains = ">=0.0.52"
flask = "2.3.2"
flask-limiter = "3.3.1"
@@ -46,7 +46,7 @@ greenlet = "2.0.2"
gunicorn = "22.0.0" gunicorn = "22.0.0"
httplib2 = "0.19.0" httplib2 = "0.19.0"
itsdangerous = "2.1.2" itsdangerous = "2.1.2"
jinja2 = "3.1.4" jinja2 = "3.1.5"
jsonschema = "3.1.1" jsonschema = "3.1.1"
markupsafe = "2.1.1" markupsafe = "2.1.1"
maxminddb-geolite2 = "2018.703" maxminddb-geolite2 = "2018.703"
@@ -65,11 +65,11 @@ pyyaml = "6.0.1"
redis = "4.6.0" redis = "4.6.0"
regex = "2023.8.8" regex = "2023.8.8"
requests = "2.32.3" requests = "2.32.3"
restrictedpython = "6.2" restrictedpython = "7.3"
rq = "1.16.1" rq = "1.16.1"
rq-scheduler = "0.13.1" rq-scheduler = "0.13.1"
semver = "2.8.1" semver = "2.8.1"
sentry-sdk = "1.28.1" sentry-sdk = "1.45.1"
sqlalchemy = "1.3.24" sqlalchemy = "1.3.24"
sqlalchemy-searchable = "1.2.0" sqlalchemy-searchable = "1.2.0"
sqlalchemy-utils = "0.38.3" sqlalchemy-utils = "0.38.3"
@@ -86,6 +86,9 @@ wtforms = "2.2.1"
xlsxwriter = "1.2.2" xlsxwriter = "1.2.2"
tzlocal = "4.3.1" tzlocal = "4.3.1"
pyodbc = "5.1.0" pyodbc = "5.1.0"
debugpy = "^1.8.9"
paramiko = "3.4.1"
oracledb = "2.5.1"
[tool.poetry.group.all_ds] [tool.poetry.group.all_ds]
optional = true optional = true
@@ -111,26 +114,25 @@ nzalchemy = "^11.0.2"
nzpy = ">=1.15" nzpy = ">=1.15"
oauth2client = "4.1.3" oauth2client = "4.1.3"
openpyxl = "3.0.7" openpyxl = "3.0.7"
oracledb = "2.1.2"
pandas = "1.3.4" pandas = "1.3.4"
phoenixdb = "0.7" phoenixdb = "0.7"
pinotdb = ">=0.4.5" pinotdb = ">=0.4.5"
protobuf = "3.20.2" protobuf = "3.20.2"
pyathena = ">=1.5.0,<=1.11.5" pyathena = "2.25.2"
pydgraph = "2.0.2" pydgraph = "2.0.2"
pydruid = "0.5.7" pydruid = "0.5.7"
pyexasol = "0.12.0" pyexasol = "0.12.0"
pyhive = "0.6.1" pyhive = "0.6.1"
pyignite = "0.6.1" pyignite = "0.6.1"
pymongo = { version = "4.6.3", extras = ["srv", "tls"] } pymongo = { version = "4.6.3", extras = ["srv", "tls"] }
pymssql = "2.2.8" pymssql = "^2.3.1"
pyodbc = "5.1.0" pyodbc = "5.1.0"
python-arango = "6.1.0" python-arango = "6.1.0"
python-rapidjson = "1.1.0" python-rapidjson = "1.20"
requests-aws-sign = "0.1.5" requests-aws-sign = "0.1.5"
sasl = ">=0.1.3" sasl = ">=0.1.3"
simple-salesforce = "0.74.3" simple-salesforce = "0.74.3"
snowflake-connector-python = "3.12.0" snowflake-connector-python = "3.12.3"
td-client = "1.0.0" td-client = "1.0.0"
thrift = ">=0.8.0" thrift = ">=0.8.0"
thrift-sasl = ">=0.1.0" thrift-sasl = ">=0.1.0"
@@ -156,7 +158,6 @@ jwcrypto = "1.5.6"
mock = "5.0.2" mock = "5.0.2"
pre-commit = "3.3.3" pre-commit = "3.3.3"
ptpython = "3.0.23" ptpython = "3.0.23"
ptvsd = "4.3.2"
pytest-cov = "4.1.0" pytest-cov = "4.1.0"
watchdog = "3.0.0" watchdog = "3.0.0"
ruff = "0.0.289" ruff = "0.0.289"

View File

@@ -14,13 +14,14 @@ from redash.app import create_app # noqa
from redash.destinations import import_destinations from redash.destinations import import_destinations
from redash.query_runner import import_query_runners from redash.query_runner import import_query_runners
__version__ = "24.08.1-dev" __version__ = "25.03.0-dev"
if os.environ.get("REMOTE_DEBUG"): if os.environ.get("REMOTE_DEBUG"):
import ptvsd import debugpy
ptvsd.enable_attach(address=("0.0.0.0", 5678)) debugpy.listen(("0.0.0.0", 5678))
debugpy.wait_for_client()
def setup_logging(): def setup_logging():

View File

@@ -36,10 +36,14 @@ def create_app():
from .metrics import request as request_metrics from .metrics import request as request_metrics
from .models import db, users from .models import db, users
from .utils import sentry from .utils import sentry
from .version_check import reset_new_version_status
sentry.init() sentry.init()
app = Redash() app = Redash()
# Check and update the cached version for use by the client
reset_new_version_status()
security.init_app(app) security.init_app(app)
request_metrics.init_app(app) request_metrics.init_app(app)
db.init_app(app) db.init_app(app)

View File

@@ -5,6 +5,22 @@ from sqlalchemy.orm.exc import NoResultFound
manager = AppGroup(help="Queries management commands.") manager = AppGroup(help="Queries management commands.")
@manager.command(name="rehash")
def rehash():
from redash import models
for q in models.Query.query.all():
old_hash = q.query_hash
q.update_query_hash()
new_hash = q.query_hash
if old_hash != new_hash:
print(f"Query {q.id} has changed hash from {old_hash} to {new_hash}")
models.db.session.add(q)
models.db.session.commit()
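The `rehash` command above recomputes each stored query hash and reports only the rows whose hash changed. The core pattern can be sketched stand-alone; the whitespace-collapsing normalization and the dict-based query records below are illustrative assumptions, not Redash's actual hashing rules:

```python
import hashlib


def compute_query_hash(query_text):
    # Hypothetical normalization: collapse whitespace before hashing,
    # so purely cosmetic edits don't change the hash.
    normalized = " ".join(query_text.split())
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()


def rehash(queries):
    """Return (query_id, old_hash, new_hash) for every query whose hash changed."""
    changed = []
    for q in queries:
        new_hash = compute_query_hash(q["query"])
        if q["query_hash"] != new_hash:
            changed.append((q["id"], q["query_hash"], new_hash))
            q["query_hash"] = new_hash
    return changed
```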
@manager.command(name="add_tag") @manager.command(name="add_tag")
@argument("query_id") @argument("query_id")
@argument("tag") @argument("tag")

View File

@@ -1,3 +1,5 @@
import html
import json
import logging import logging
from copy import deepcopy from copy import deepcopy
@@ -37,6 +39,129 @@ class Webex(BaseDestination):
@staticmethod @staticmethod
def formatted_attachments_template(subject, description, query_link, alert_link): def formatted_attachments_template(subject, description, query_link, alert_link):
# Attempt to parse the description to find a 2D array
try:
# Extract the part of the description that looks like a JSON array
start_index = description.find("[")
end_index = description.rfind("]") + 1
json_array_str = description[start_index:end_index]
# Decode HTML entities
json_array_str = html.unescape(json_array_str)
# Replace single quotes with double quotes for valid JSON
json_array_str = json_array_str.replace("'", '"')
# Load the JSON array
data_array = json.loads(json_array_str)
# Check if it's a 2D array
if isinstance(data_array, list) and all(isinstance(i, list) for i in data_array):
# Create a table for the Adaptive Card
table_rows = []
for row in data_array:
table_rows.append(
{
"type": "ColumnSet",
"columns": [
{"type": "Column", "items": [{"type": "TextBlock", "text": str(item), "wrap": True}]}
for item in row
],
}
)
# Create the body of the card with the table
body = (
[
{
"type": "TextBlock",
"text": f"{subject}",
"weight": "bolder",
"size": "medium",
"wrap": True,
},
{
"type": "TextBlock",
"text": f"{description[:start_index]}",
"isSubtle": True,
"wrap": True,
},
]
+ table_rows
+ [
{
"type": "TextBlock",
"text": f"Click [here]({query_link}) to check your query!",
"wrap": True,
"isSubtle": True,
},
{
"type": "TextBlock",
"text": f"Click [here]({alert_link}) to check your alert!",
"wrap": True,
"isSubtle": True,
},
]
)
else:
# Fallback to the original description if no valid 2D array is found
body = [
{
"type": "TextBlock",
"text": f"{subject}",
"weight": "bolder",
"size": "medium",
"wrap": True,
},
{
"type": "TextBlock",
"text": f"{description}",
"isSubtle": True,
"wrap": True,
},
{
"type": "TextBlock",
"text": f"Click [here]({query_link}) to check your query!",
"wrap": True,
"isSubtle": True,
},
{
"type": "TextBlock",
"text": f"Click [here]({alert_link}) to check your alert!",
"wrap": True,
"isSubtle": True,
},
]
except json.JSONDecodeError:
# If parsing fails, fallback to the original description
body = [
{
"type": "TextBlock",
"text": f"{subject}",
"weight": "bolder",
"size": "medium",
"wrap": True,
},
{
"type": "TextBlock",
"text": f"{description}",
"isSubtle": True,
"wrap": True,
},
{
"type": "TextBlock",
"text": f"Click [here]({query_link}) to check your query!",
"wrap": True,
"isSubtle": True,
},
{
"type": "TextBlock",
"text": f"Click [here]({alert_link}) to check your alert!",
"wrap": True,
"isSubtle": True,
},
]
return [ return [
{ {
"contentType": "application/vnd.microsoft.card.adaptive", "contentType": "application/vnd.microsoft.card.adaptive",
@@ -44,44 +169,7 @@ class Webex(BaseDestination):
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json", "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"type": "AdaptiveCard", "type": "AdaptiveCard",
"version": "1.0", "version": "1.0",
"body": [ "body": body,
{
"type": "ColumnSet",
"columns": [
{
"type": "Column",
"width": 4,
"items": [
{
"type": "TextBlock",
"text": {subject},
"weight": "bolder",
"size": "medium",
"wrap": True,
},
{
"type": "TextBlock",
"text": {description},
"isSubtle": True,
"wrap": True,
},
{
"type": "TextBlock",
"text": f"Click [here]({query_link}) to check your query!",
"wrap": True,
"isSubtle": True,
},
{
"type": "TextBlock",
"text": f"Click [here]({alert_link}) to check your alert!",
"wrap": True,
"isSubtle": True,
},
],
},
],
}
],
}, },
} }
] ]
@@ -116,6 +204,10 @@ class Webex(BaseDestination):
# destinations is guaranteed to be a comma-separated string # destinations is guaranteed to be a comma-separated string
for destination_id in destinations.split(","): for destination_id in destinations.split(","):
destination_id = destination_id.strip() # Remove any leading or trailing whitespace
if not destination_id: # Check if the destination_id is empty or blank
continue # Skip to the next iteration if it's empty or blank
payload = deepcopy(template_payload) payload = deepcopy(template_payload)
payload[payload_tag] = destination_id payload[payload_tag] = destination_id
self.post_message(payload, headers) self.post_message(payload, headers)
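The parsing logic added above (find the bracketed span, unescape HTML entities, swap single quotes for double quotes, then `json.loads`) can be exercised in isolation. This sketch mirrors the diff's approach, including its assumption that embedded strings contain no literal single quotes of their own:

```python
import html
import json


def extract_2d_array(description):
    """Return the embedded 2D array if one parses cleanly, else None."""
    start = description.find("[")
    end = description.rfind("]") + 1
    if start == -1 or end == 0:
        return None
    # Decode HTML entities, then make single-quoted arrays valid JSON.
    candidate = html.unescape(description[start:end]).replace("'", '"')
    try:
        data = json.loads(candidate)
    except json.JSONDecodeError:
        return None
    # Only a list of lists renders as an Adaptive Card table.
    if isinstance(data, list) and all(isinstance(row, list) for row in data):
        return data
    return None
```

A non-table description falls through to `None`, which corresponds to the fallback body in the destination code.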

View File

@@ -15,6 +15,7 @@ from redash.authentication.account import (
) )
from redash.handlers import routes from redash.handlers import routes
from redash.handlers.base import json_response, org_scoped_rule from redash.handlers.base import json_response, org_scoped_rule
from redash.version_check import get_latest_version
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@@ -28,6 +29,7 @@ def get_google_auth_url(next_path):
def render_token_login_page(template, org_slug, token, invite): def render_token_login_page(template, org_slug, token, invite):
error_message = None
try: try:
user_id = validate_token(token) user_id = validate_token(token)
org = current_org._get_current_object() org = current_org._get_current_object()
@@ -39,19 +41,19 @@ def render_token_login_page(template, org_slug, token, invite):
user_id, user_id,
org_slug, org_slug,
) )
error_message = "Your invite link is invalid. Bad user id in token. Please ask for a new one."
except SignatureExpired:
logger.exception("Token signature has expired. Token: %s, org=%s", token, org_slug)
error_message = "Your invite link has expired. Please ask for a new one."
except BadSignature:
logger.exception("Bad signature for the token: %s, org=%s", token, org_slug)
error_message = "Your invite link is invalid. Bad signature. Please double-check the token."
if error_message:
return ( return (
render_template( render_template(
"error.html", "error.html",
error_message="Invalid invite link. Please ask for a new one.", error_message=error_message,
),
400,
)
except (SignatureExpired, BadSignature):
logger.exception("Failed to verify invite token: %s, org=%s", token, org_slug)
return (
render_template(
"error.html",
error_message="Your invite link has expired. Please ask for a new one.",
), ),
400, 400,
) )
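The restructuring above replaces one catch-all `except` with per-exception branches that each set a specific `error_message`, rendered once at the end. The pattern in isolation looks like this; the exception classes are local stand-ins for itsdangerous's `SignatureExpired`/`BadSignature`:

```python
class SignatureExpired(Exception):
    pass


class BadSignature(Exception):
    pass


def describe_token_error(validate):
    """Run a token validator; map each failure mode to a distinct message."""
    error_message = None
    try:
        validate()
    except SignatureExpired:
        error_message = "Your invite link has expired. Please ask for a new one."
    except BadSignature:
        error_message = "Your invite link is invalid. Bad signature. Please double-check the token."
    # None means the token validated; the caller proceeds to render the page.
    return error_message
```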
@@ -255,11 +257,15 @@ def number_format_config():
def client_config(): def client_config():
if not current_user.is_api_user() and current_user.is_authenticated: if not current_user.is_api_user() and current_user.is_authenticated:
client_config_inner = { client_config = {
"newVersionAvailable": bool(get_latest_version()),
"version": __version__, "version": __version__,
} }
else: else:
client_config_inner = {} client_config = {}
if current_user.has_permission("admin") and current_org.get_setting("beacon_consent") is None:
client_config["showBeaconConsentMessage"] = True
defaults = { defaults = {
"allowScriptsInUserInput": settings.ALLOW_SCRIPTS_IN_USER_INPUT, "allowScriptsInUserInput": settings.ALLOW_SCRIPTS_IN_USER_INPUT,
@@ -279,12 +285,12 @@ def client_config():
"tableCellMaxJSONSize": settings.TABLE_CELL_MAX_JSON_SIZE, "tableCellMaxJSONSize": settings.TABLE_CELL_MAX_JSON_SIZE,
} }
client_config_inner.update(defaults) client_config.update(defaults)
client_config_inner.update({"basePath": base_href()}) client_config.update({"basePath": base_href()})
client_config_inner.update(date_time_format_config()) client_config.update(date_time_format_config())
client_config_inner.update(number_format_config()) client_config.update(number_format_config())
return client_config_inner return client_config
def messages(): def messages():

View File

@@ -1,12 +1,13 @@
from flask import g, redirect, render_template, request, url_for from flask import g, redirect, render_template, request, url_for
from flask_login import login_user from flask_login import login_user
from wtforms import Form, PasswordField, StringField, validators from wtforms import BooleanField, Form, PasswordField, StringField, validators
from wtforms.fields.html5 import EmailField from wtforms.fields.html5 import EmailField
from redash import settings from redash import settings
from redash.authentication.org_resolving import current_org from redash.authentication.org_resolving import current_org
from redash.handlers.base import routes from redash.handlers.base import routes
from redash.models import Group, Organization, User, db from redash.models import Group, Organization, User, db
from redash.tasks.general import subscribe
class SetupForm(Form): class SetupForm(Form):
@@ -14,6 +15,8 @@ class SetupForm(Form):
email = EmailField("Email Address", validators=[validators.Email()]) email = EmailField("Email Address", validators=[validators.Email()])
password = PasswordField("Password", validators=[validators.Length(6)]) password = PasswordField("Password", validators=[validators.Length(6)])
org_name = StringField("Organization Name", validators=[validators.InputRequired()]) org_name = StringField("Organization Name", validators=[validators.InputRequired()])
security_notifications = BooleanField()
newsletter = BooleanField()
def create_org(org_name, user_name, email, password): def create_org(org_name, user_name, email, password):
@@ -54,6 +57,8 @@ def setup():
return redirect("/") return redirect("/")
form = SetupForm(request.form) form = SetupForm(request.form)
form.newsletter.data = True
form.security_notifications.data = True
if request.method == "POST" and form.validate(): if request.method == "POST" and form.validate():
default_org, user = create_org(form.org_name.data, form.name.data, form.email.data, form.password.data) default_org, user = create_org(form.org_name.data, form.name.data, form.email.data, form.password.data)
@@ -61,6 +66,10 @@ def setup():
g.org = default_org g.org = default_org
login_user(user) login_user(user)
# signup to newsletter if needed
if form.newsletter.data or form.security_notifications.data:
subscribe.delay(form.data)
return redirect(url_for("redash.index", org_slug=None)) return redirect(url_for("redash.index", org_slug=None))
return render_template("setup.html", form=form) return render_template("setup.html", form=form)

View File

@@ -5,7 +5,7 @@ from flask import g, has_request_context
from sqlalchemy.engine import Engine from sqlalchemy.engine import Engine
from sqlalchemy.event import listens_for from sqlalchemy.event import listens_for
from sqlalchemy.orm.util import _ORMJoin from sqlalchemy.orm.util import _ORMJoin
from sqlalchemy.sql.selectable import Alias from sqlalchemy.sql.selectable import Alias, Join
from redash import statsd_client from redash import statsd_client
@@ -18,7 +18,7 @@ def _table_name_from_select_element(elt):
if isinstance(t, Alias): if isinstance(t, Alias):
t = t.original.froms[0] t = t.original.froms[0]
while isinstance(t, _ORMJoin): while isinstance(t, _ORMJoin) or isinstance(t, Join):
t = t.left t = t.left
return t.name return t.name

View File

@@ -387,6 +387,10 @@ class QueryResult(db.Model, BelongsToOrgMixin):
def should_schedule_next(previous_iteration, now, interval, time=None, day_of_week=None, failures=0): def should_schedule_next(previous_iteration, now, interval, time=None, day_of_week=None, failures=0):
# if previous_iteration is None, it means the query has never been run before
# so we should schedule it immediately
if previous_iteration is None:
return True
# if time exists then interval > 23 hours (82800s) # if time exists then interval > 23 hours (82800s)
# if day_of_week exists then interval > 6 days (518400s) # if day_of_week exists then interval > 6 days (518400s)
if time is None: if time is None:
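The early return added to `should_schedule_next` means a query that has never run is scheduled immediately instead of doing arithmetic on `None`. A simplified sketch of that decision, ignoring the `time`/`day_of_week` branches the real function also handles:

```python
from datetime import datetime, timedelta


def should_schedule_next(previous_iteration, now, interval_seconds):
    # A query that has never run has no previous iteration: run it now.
    if previous_iteration is None:
        return True
    return now >= previous_iteration + timedelta(seconds=interval_seconds)
```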
@@ -602,6 +606,11 @@ class Query(ChangeTrackingMixin, TimestampMixin, BelongsToOrgMixin, db.Model):
if query.schedule.get("disabled"): if query.schedule.get("disabled"):
continue continue
# Skip queries that have None for all schedule values. It's unclear whether this
# is something that can happen in practice, but we have a test case for it.
if all(value is None for value in query.schedule.values()):
continue
if query.schedule["until"]: if query.schedule["until"]:
schedule_until = pytz.utc.localize(datetime.datetime.strptime(query.schedule["until"], "%Y-%m-%d")) schedule_until = pytz.utc.localize(datetime.datetime.strptime(query.schedule["until"], "%Y-%m-%d"))
@@ -613,7 +622,7 @@ class Query(ChangeTrackingMixin, TimestampMixin, BelongsToOrgMixin, db.Model):
) )
if should_schedule_next( if should_schedule_next(
retrieved_at or now, retrieved_at,
now, now,
query.schedule["interval"], query.schedule["interval"],
query.schedule["time"], query.schedule["time"],
@@ -899,6 +908,7 @@ def next_state(op, value, threshold):
# boolean value is Python specific and most likely will be confusing to # boolean value is Python specific and most likely will be confusing to
# users. # users.
value = str(value).lower() value = str(value).lower()
value_is_number = False
else: else:
try: try:
value = float(value) value = float(value)
@@ -916,6 +926,8 @@ def next_state(op, value, threshold):
if op(value, threshold): if op(value, threshold):
new_state = Alert.TRIGGERED_STATE new_state = Alert.TRIGGERED_STATE
elif not value_is_number and op not in [OPERATORS.get("!="), OPERATORS.get("=="), OPERATORS.get("equals")]:
new_state = Alert.UNKNOWN_STATE
else: else:
new_state = Alert.OK_STATE new_state = Alert.OK_STATE
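The fix above initializes `value_is_number = False` on the boolean branch so the later `elif` can always read the variable. The control flow reduces to something like this simplified sketch (string state names and a two-entry operator table are assumptions for illustration):

```python
import operator

OPERATORS = {">": operator.gt, "==": operator.eq}


def next_state(op_name, value, threshold):
    if isinstance(value, bool):
        value = str(value).lower()
        value_is_number = False  # the initialization the bug fix adds
    else:
        try:
            value = float(value)
            value_is_number = True
        except ValueError:
            value_is_number = False
    # Compare like with like: floats against floats, strings against strings.
    threshold = float(threshold) if value_is_number else str(threshold).lower()
    if OPERATORS[op_name](value, threshold):
        return "triggered"
    # Ordering operators are meaningless for non-numeric values.
    if not value_is_number and op_name not in ("==",):
        return "unknown"
    return "ok"
```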
@@ -957,9 +969,10 @@ class Alert(TimestampMixin, BelongsToOrgMixin, db.Model):
return super(Alert, cls).get_by_id_and_org(object_id, org, Query) return super(Alert, cls).get_by_id_and_org(object_id, org, Query)
def evaluate(self): def evaluate(self):
data = self.query_rel.latest_query_data.data data = self.query_rel.latest_query_data.data if self.query_rel.latest_query_data else None
new_state = self.UNKNOWN_STATE
if data["rows"] and self.options["column"] in data["rows"][0]: if data and data["rows"] and self.options["column"] in data["rows"][0]:
op = OPERATORS.get(self.options["op"], lambda v, t: False) op = OPERATORS.get(self.options["op"], lambda v, t: False)
if "selector" not in self.options: if "selector" not in self.options:
@@ -967,24 +980,27 @@ class Alert(TimestampMixin, BelongsToOrgMixin, db.Model):
else: else:
selector = self.options["selector"] selector = self.options["selector"]
if selector == "max": try:
max_val = float("-inf") if selector == "max":
for i in range(0, len(data["rows"])): max_val = float("-inf")
max_val = max(max_val, data["rows"][i][self.options["column"]]) for i in range(len(data["rows"])):
value = max_val max_val = max(max_val, float(data["rows"][i][self.options["column"]]))
elif selector == "min": value = max_val
min_val = float("inf") elif selector == "min":
for i in range(0, len(data["rows"])): min_val = float("inf")
min_val = min(min_val, data["rows"][i][self.options["column"]]) for i in range(len(data["rows"])):
value = min_val min_val = min(min_val, float(data["rows"][i][self.options["column"]]))
else: value = min_val
value = data["rows"][0][self.options["column"]] else:
value = data["rows"][0][self.options["column"]]
except ValueError:
return self.UNKNOWN_STATE
threshold = self.options["value"] threshold = self.options["value"]
new_state = next_state(op, value, threshold) if value is not None:
else: new_state = next_state(op, value, threshold)
new_state = self.UNKNOWN_STATE
return new_state return new_state
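The reworked `evaluate` coerces each cell to `float` inside the min/max scan and treats a `ValueError` as an unknown state. That selector logic in isolation, with hypothetical names:

```python
def select_value(rows, column, selector):
    """Pick the value an alert compares: first row, or min/max across rows."""
    try:
        if selector == "max":
            return max(float(row[column]) for row in rows)
        if selector == "min":
            return min(float(row[column]) for row in rows)
        # Any other selector falls back to the first row's raw value.
        return rows[0][column]
    except ValueError:
        return None  # non-numeric cell: caller maps this to UNKNOWN_STATE
```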
@@ -1007,7 +1023,6 @@ class Alert(TimestampMixin, BelongsToOrgMixin, db.Model):
result_table = [] # A two-dimensional array which can be rendered as a table in Mustache result_table = [] # A two-dimensional array which can be rendered as a table in Mustache
for row in data["rows"]: for row in data["rows"]:
result_table.append([row[col["name"]] for col in data["columns"]]) result_table.append([row[col["name"]] for col in data["columns"]])
print("OPTIONS", self.options)
context = { context = {
"ALERT_NAME": self.name, "ALERT_NAME": self.name,
"ALERT_URL": "{host}/alerts/{alert_id}".format(host=host, alert_id=self.id), "ALERT_URL": "{host}/alerts/{alert_id}".format(host=host, alert_id=self.id),

View File

@@ -59,7 +59,7 @@ def get_status():
def rq_job_ids(): def rq_job_ids():
queues = Queue.all(connection=redis_connection) queues = Queue.all(connection=rq_redis_connection)
started_jobs = [StartedJobRegistry(queue=q).get_job_ids() for q in queues] started_jobs = [StartedJobRegistry(queue=q).get_job_ids() for q in queues]
queued_jobs = [q.job_ids for q in queues] queued_jobs = [q.job_ids for q in queues]

View File

@@ -90,15 +90,26 @@ class Athena(BaseQueryRunner):
"title": "Athena cost per Tb scanned (USD)", "title": "Athena cost per Tb scanned (USD)",
"default": 5, "default": 5,
}, },
"result_reuse_enable": {
"type": "boolean",
"title": "Reuse Athena query results",
},
"result_reuse_minutes": {
"type": "number",
"title": "Minutes to reuse Athena query results",
"default": 60,
},
}, },
"required": ["region", "s3_staging_dir"], "required": ["region", "s3_staging_dir"],
"extra_options": ["glue", "catalog_ids", "cost_per_tb"], "extra_options": ["glue", "catalog_ids", "cost_per_tb", "result_reuse_enable", "result_reuse_minutes"],
"order": [ "order": [
"region", "region",
"s3_staging_dir", "s3_staging_dir",
"schema", "schema",
"work_group", "work_group",
"cost_per_tb", "cost_per_tb",
"result_reuse_enable",
"result_reuse_minutes",
], ],
"secret": ["aws_secret_key"], "secret": ["aws_secret_key"],
} }
@@ -199,10 +210,20 @@ class Athena(BaseQueryRunner):
logger.warning("Glue table doesn't have StorageDescriptor: %s", table_name) logger.warning("Glue table doesn't have StorageDescriptor: %s", table_name)
continue continue
if table_name not in schema: if table_name not in schema:
column = [columns["Name"] for columns in table["StorageDescriptor"]["Columns"]] schema[table_name] = {"name": table_name, "columns": []}
schema[table_name] = {"name": table_name, "columns": column}
for partition in table.get("PartitionKeys", []): for column_data in table["StorageDescriptor"]["Columns"]:
schema[table_name]["columns"].append(partition["Name"]) column = {
"name": column_data["Name"],
"type": column_data["Type"] if "Type" in column_data else None,
}
schema[table_name]["columns"].append(column)
for partition in table.get("PartitionKeys", []):
partition_column = {
"name": partition["Name"],
"type": partition["Type"] if "Type" in partition else None,
}
schema[table_name]["columns"].append(partition_column)
return list(schema.values()) return list(schema.values())
def get_schema(self, get_stats=False): def get_schema(self, get_stats=False):
@@ -212,7 +233,7 @@ class Athena(BaseQueryRunner):
schema = {} schema = {}
query = """ query = """
SELECT table_schema, table_name, column_name SELECT table_schema, table_name, column_name, data_type
FROM information_schema.columns FROM information_schema.columns
WHERE table_schema NOT IN ('information_schema') WHERE table_schema NOT IN ('information_schema')
""" """
@@ -225,7 +246,7 @@ class Athena(BaseQueryRunner):
table_name = "{0}.{1}".format(row["table_schema"], row["table_name"]) table_name = "{0}.{1}".format(row["table_schema"], row["table_name"])
if table_name not in schema: if table_name not in schema:
schema[table_name] = {"name": table_name, "columns": []} schema[table_name] = {"name": table_name, "columns": []}
schema[table_name]["columns"].append(row["column_name"]) schema[table_name]["columns"].append({"name": row["column_name"], "type": row["data_type"]})
return list(schema.values()) return list(schema.values())
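Both schema paths above now emit `{"name": ..., "type": ...}` column dicts instead of bare column names. The grouping step over `information_schema.columns` rows reduces to:

```python
def build_schema(rows):
    """Group (schema, table, column, type) rows into Redash's schema shape."""
    schema = {}
    for row in rows:
        table_name = "{0}.{1}".format(row["table_schema"], row["table_name"])
        if table_name not in schema:
            schema[table_name] = {"name": table_name, "columns": []}
        # Columns carry their type so the UI can display it alongside the name.
        schema[table_name]["columns"].append(
            {"name": row["column_name"], "type": row["data_type"]}
        )
    return list(schema.values())
```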
@@ -237,6 +258,8 @@ class Athena(BaseQueryRunner):
kms_key=self.configuration.get("kms_key", None), kms_key=self.configuration.get("kms_key", None),
work_group=self.configuration.get("work_group", "primary"), work_group=self.configuration.get("work_group", "primary"),
formatter=SimpleFormatter(), formatter=SimpleFormatter(),
result_reuse_enable=self.configuration.get("result_reuse_enable", False),
result_reuse_minutes=self.configuration.get("result_reuse_minutes", 60),
**self._get_iam_credentials(user=user), **self._get_iam_credentials(user=user),
).cursor() ).cursor()

View File

@@ -7,6 +7,7 @@ from base64 import b64decode
from redash import settings from redash import settings
from redash.query_runner import ( from redash.query_runner import (
TYPE_BOOLEAN, TYPE_BOOLEAN,
TYPE_DATE,
TYPE_DATETIME, TYPE_DATETIME,
TYPE_FLOAT, TYPE_FLOAT,
TYPE_INTEGER, TYPE_INTEGER,
@@ -37,6 +38,8 @@ types_map = {
"BOOLEAN": TYPE_BOOLEAN, "BOOLEAN": TYPE_BOOLEAN,
"STRING": TYPE_STRING, "STRING": TYPE_STRING,
"TIMESTAMP": TYPE_DATETIME, "TIMESTAMP": TYPE_DATETIME,
"DATETIME": TYPE_DATETIME,
"DATE": TYPE_DATE,
} }
@@ -301,7 +304,7 @@ class BigQuery(BaseQueryRunner):
datasets = self._get_project_datasets(project_id) datasets = self._get_project_datasets(project_id)
query_base = """ query_base = """
SELECT table_schema, table_name, field_path SELECT table_schema, table_name, field_path, data_type
FROM `{dataset_id}`.INFORMATION_SCHEMA.COLUMN_FIELD_PATHS FROM `{dataset_id}`.INFORMATION_SCHEMA.COLUMN_FIELD_PATHS
WHERE table_schema NOT IN ('information_schema') WHERE table_schema NOT IN ('information_schema')
""" """
@@ -322,7 +325,7 @@ class BigQuery(BaseQueryRunner):
table_name = "{0}.{1}".format(row["table_schema"], row["table_name"]) table_name = "{0}.{1}".format(row["table_schema"], row["table_name"])
if table_name not in schema: if table_name not in schema:
schema[table_name] = {"name": table_name, "columns": []} schema[table_name] = {"name": table_name, "columns": []}
schema[table_name]["columns"].append(row["field_path"]) schema[table_name]["columns"].append({"name": row["field_path"], "type": row["data_type"]})
return list(schema.values()) return list(schema.values())

View File

@@ -91,8 +91,8 @@ class BaseElasticSearch(BaseQueryRunner):
logger.setLevel(logging.DEBUG) logger.setLevel(logging.DEBUG)
self.server_url = self.configuration["server"] self.server_url = self.configuration.get("server", "")
if self.server_url[-1] == "/": if self.server_url and self.server_url[-1] == "/":
self.server_url = self.server_url[:-1] self.server_url = self.server_url[:-1]
basic_auth_user = self.configuration.get("basic_auth_user", None) basic_auth_user = self.configuration.get("basic_auth_user", None)

View File

@@ -131,6 +131,17 @@ def parse_results(results: list, flatten: bool = False) -> list:
return rows, columns return rows, columns
def _sorted_fields(fields):
ord = {}
for k, v in fields.items():
if isinstance(v, int):
ord[k] = v
else:
ord[k] = len(fields)
return sorted(ord, key=ord.get)
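The `_sorted_fields` helper above keeps fields with explicit integer positions in order and pushes everything else (e.g. operator dicts like `{"$exists": True}`) to the end. Reproduced stand-alone, with the builtin-shadowing `ord` renamed, to show the ordering it yields:

```python
def sorted_fields(fields):
    """Order projection keys: integer positions first, non-integers last."""
    order = {}
    for key, value in fields.items():
        # Non-integer values (dicts, strings) all share the sentinel len(fields),
        # so they sort after every explicit position, in insertion order.
        order[key] = value if isinstance(value, int) else len(fields)
    return sorted(order, key=order.get)
```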
class MongoDB(BaseQueryRunner): class MongoDB(BaseQueryRunner):
should_annotate_query = False should_annotate_query = False
@@ -177,7 +188,7 @@ class MongoDB(BaseQueryRunner):
self.syntax = "json" self.syntax = "json"
self.db_name = self.configuration["dbName"] self.db_name = self.configuration.get("dbName", "")
self.is_replica_set = ( self.is_replica_set = (
True if "replicaSetName" in self.configuration and self.configuration["replicaSetName"] else False True if "replicaSetName" in self.configuration and self.configuration["replicaSetName"] else False
@@ -365,7 +376,7 @@ class MongoDB(BaseQueryRunner):
if f: if f:
ordered_columns = [] ordered_columns = []
for k in sorted(f, key=f.get): for k in _sorted_fields(f):
column = _get_column_by_name(columns, k) column = _get_column_by_name(columns, k)
if column: if column:
ordered_columns.append(column) ordered_columns.append(column)

View File

@@ -152,7 +152,7 @@ class Mysql(BaseSQLQueryRunner):
col.table_name as table_name, col.table_name as table_name,
col.column_name as column_name col.column_name as column_name
FROM `information_schema`.`columns` col FROM `information_schema`.`columns` col
WHERE col.table_schema NOT IN ('information_schema', 'performance_schema', 'mysql', 'sys'); WHERE LOWER(col.table_schema) NOT IN ('information_schema', 'performance_schema', 'mysql', 'sys');
""" """
results, error = self.run_query(query, None) results, error = self.run_query(query, None)

View File

@@ -388,12 +388,13 @@ class Redshift(PostgreSQL):
SELECT DISTINCT table_name, SELECT DISTINCT table_name,
table_schema, table_schema,
column_name, column_name,
data_type,
ordinal_position AS pos ordinal_position AS pos
FROM svv_columns FROM svv_columns
WHERE table_schema NOT IN ('pg_internal','pg_catalog','information_schema') WHERE table_schema NOT IN ('pg_internal','pg_catalog','information_schema')
AND table_schema NOT LIKE 'pg_temp_%' AND table_schema NOT LIKE 'pg_temp_%'
) )
SELECT table_name, table_schema, column_name SELECT table_name, table_schema, column_name, data_type
FROM tables FROM tables
WHERE WHERE
HAS_SCHEMA_PRIVILEGE(table_schema, 'USAGE') AND HAS_SCHEMA_PRIVILEGE(table_schema, 'USAGE') AND

View File

@@ -55,12 +55,13 @@ class Script(BaseQueryRunner):
def __init__(self, configuration): def __init__(self, configuration):
super(Script, self).__init__(configuration) super(Script, self).__init__(configuration)
path = self.configuration.get("path", "")
# If path is * allow any execution path # If path is * allow any execution path
if self.configuration["path"] == "*": if path == "*":
return return
# Poor man's protection against running scripts from outside the scripts directory # Poor man's protection against running scripts from outside the scripts directory
if self.configuration["path"].find("../") > -1: if path.find("../") > -1:
raise ValueError("Scripts can only be run from the configured scripts directory") raise ValueError("Scripts can only be run from the configured scripts directory")
def test_connection(self): def test_connection(self):

View File

@@ -28,7 +28,7 @@ class Sqlite(BaseSQLQueryRunner):
def __init__(self, configuration): def __init__(self, configuration):
super(Sqlite, self).__init__(configuration) super(Sqlite, self).__init__(configuration)
self._dbpath = self.configuration["dbpath"] self._dbpath = self.configuration.get("dbpath", "")
def _get_tables(self, schema): def _get_tables(self, schema):
query_table = "select tbl_name from sqlite_master where type='table'" query_table = "select tbl_name from sqlite_master where type='table'"

View File

@@ -50,6 +50,7 @@ QUERY_RESULTS_EXPIRED_TTL_ENABLED = parse_boolean(os.environ.get("REDASH_QUERY_R
 QUERY_RESULTS_EXPIRED_TTL = int(os.environ.get("REDASH_QUERY_RESULTS_EXPIRED_TTL", "86400"))
 SCHEMAS_REFRESH_SCHEDULE = int(os.environ.get("REDASH_SCHEMAS_REFRESH_SCHEDULE", 30))
+SCHEMAS_REFRESH_TIMEOUT = int(os.environ.get("REDASH_SCHEMAS_REFRESH_TIMEOUT", 300))
 AUTH_TYPE = os.environ.get("REDASH_AUTH_TYPE", "api_key")
 INVITATION_TOKEN_MAX_AGE = int(os.environ.get("REDASH_INVITATION_TOKEN_MAX_AGE", 60 * 60 * 24 * 7))
@@ -412,6 +413,7 @@ PAGE_SIZE_OPTIONS = list(
 TABLE_CELL_MAX_JSON_SIZE = int(os.environ.get("REDASH_TABLE_CELL_MAX_JSON_SIZE", 50000))

 # Features:
+VERSION_CHECK = parse_boolean(os.environ.get("REDASH_VERSION_CHECK", "true"))
 FEATURE_DISABLE_REFRESH_QUERIES = parse_boolean(os.environ.get("REDASH_FEATURE_DISABLE_REFRESH_QUERIES", "false"))
 FEATURE_SHOW_QUERY_RESULTS_COUNT = parse_boolean(os.environ.get("REDASH_FEATURE_SHOW_QUERY_RESULTS_COUNT", "true"))
 FEATURE_ALLOW_CUSTOM_JS_VISUALIZATIONS = parse_boolean(

View File

@@ -45,6 +45,7 @@ HIDE_PLOTLY_MODE_BAR = parse_boolean(os.environ.get("HIDE_PLOTLY_MODE_BAR", "fal
 DISABLE_PUBLIC_URLS = parse_boolean(os.environ.get("REDASH_DISABLE_PUBLIC_URLS", "false"))

 settings = {
+    "beacon_consent": None,
     "auth_password_login_enabled": PASSWORD_LOGIN_ENABLED,
     "auth_saml_enabled": SAML_LOGIN_ENABLED,
     "auth_saml_type": SAML_LOGIN_TYPE,

View File

@@ -7,6 +7,7 @@ from redash.tasks.general import (
     record_event,
     send_mail,
     sync_user_details,
+    version_check,
 )
 from redash.tasks.queries import (
     cleanup_query_results,

View File

@@ -5,6 +5,7 @@ from redash import mail, models, settings
 from redash.models import users
 from redash.query_runner import NotSupported
 from redash.tasks.worker import Queue
+from redash.version_check import run_version_check
 from redash.worker import get_job_logger, job

 logger = get_job_logger(__name__)
@@ -29,6 +30,27 @@ def record_event(raw_event):
             logger.exception("Failed posting to %s", hook)


+def version_check():
+    run_version_check()
+
+
+@job("default")
+def subscribe(form):
+    logger.info(
+        "Subscribing to: [security notifications=%s], [newsletter=%s]",
+        form["security_notifications"],
+        form["newsletter"],
+    )
+    data = {
+        "admin_name": form["name"],
+        "admin_email": form["email"],
+        "org_name": form["org_name"],
+        "security_notifications": form["security_notifications"],
+        "newsletter": form["newsletter"],
+    }
+    requests.post("https://version.redash.io/subscribe", json=data)
+
+
 @job("emails")
 def send_mail(to, subject, html, text):
     try:
@@ -50,7 +72,7 @@ def test_connection(data_source_id):
     return True


-@job("schemas", queue_class=Queue, at_front=True, timeout=300, ttl=90)
+@job("schemas", queue_class=Queue, at_front=True, timeout=settings.SCHEMAS_REFRESH_TIMEOUT, ttl=90)
 def get_schema(data_source_id, refresh):
     try:
         data_source = models.DataSource.get_by_id(data_source_id)

View File

@@ -15,6 +15,7 @@ from redash.tasks.alerts import check_alerts_for_query
 from redash.tasks.failure_report import track_failure
 from redash.tasks.worker import Job, Queue
 from redash.utils import gen_query_hash, utcnow
+from redash.utils.locks import acquire_lock, release_lock
 from redash.worker import get_job_logger

 logger = get_job_logger(__name__)
@@ -34,14 +35,18 @@ def enqueue_query(query, data_source, user_id, is_api_key=False, scheduled_query
     logger.info("Inserting job for %s with metadata=%s", query_hash, metadata)
     try_count = 0
     job = None
+    job_lock_id = _job_lock_id(query_hash, data_source.id)

     while try_count < 5:
         try_count += 1

+        identifier = acquire_lock(job_lock_id)
+        if identifier is None:
+            continue
         pipe = redis_connection.pipeline()
         try:
-            pipe.watch(_job_lock_id(query_hash, data_source.id))
-            job_id = pipe.get(_job_lock_id(query_hash, data_source.id))
+            pipe.watch(job_lock_id)
+            job_id = pipe.get(job_lock_id)
             if job_id:
                 logger.info("[%s] Found existing job: %s", query_hash, job_id)
                 job_complete = None
@@ -66,7 +71,7 @@ def enqueue_query(query, data_source, user_id, is_api_key=False, scheduled_query
                 if lock_is_irrelevant:
                     logger.info("[%s] %s, removing lock", query_hash, message)
-                    redis_connection.delete(_job_lock_id(query_hash, data_source.id))
+                    redis_connection.delete(job_lock_id)
                     job = None

             if not job:
@@ -115,6 +120,7 @@ def enqueue_query(query, data_source, user_id, is_api_key=False, scheduled_query
         except redis.WatchError:
             continue
         finally:
+            release_lock(job_lock_id, identifier)
             pipe.reset()

     if not job:

View File

@@ -157,7 +157,7 @@ def remove_ghost_locks():
     logger.info("Locks found: {}, Locks removed: {}".format(len(locks), count))


-@job("schemas")
+@job("schemas", timeout=settings.SCHEMAS_REFRESH_TIMEOUT)
 def refresh_schema(data_source_id):
     ds = models.DataSource.get_by_id(data_source_id)
     logger.info("task=refresh_schema state=start ds_id=%s", ds.id)

View File

@@ -8,7 +8,7 @@ from rq_scheduler import Scheduler

 from redash import rq_redis_connection, settings
 from redash.tasks.failure_report import send_aggregated_errors
-from redash.tasks.general import sync_user_details
+from redash.tasks.general import sync_user_details, version_check
 from redash.tasks.queries import (
     cleanup_query_results,
     empty_schedules,
@@ -79,6 +79,9 @@ def periodic_job_definitions():
         },
     ]

+    if settings.VERSION_CHECK:
+        jobs.append({"func": version_check, "interval": timedelta(days=1)})
+
     if settings.QUERY_RESULTS_CLEANUP_ENABLED:
         jobs.append({"func": cleanup_query_results, "interval": timedelta(minutes=5)})

View File

@@ -6,7 +6,7 @@ import sys
 from rq import Queue as BaseQueue
 from rq.job import Job as BaseJob
 from rq.job import JobStatus
-from rq.timeouts import HorseMonitorTimeoutException, UnixSignalDeathPenalty
+from rq.timeouts import HorseMonitorTimeoutException
 from rq.utils import utcnow
 from rq.worker import (
     HerokuWorker,  # HerokuWorker implements graceful shutdown on SIGTERM
@@ -113,30 +113,44 @@ class HardLimitingWorker(BaseWorker):
         )
         self.kill_horse()

-    def monitor_work_horse(self, job, queue):
+    def monitor_work_horse(self, job: "Job", queue: "Queue"):
         """The worker will monitor the work horse and make sure that it
         either executes successfully or the status of the job is set to
         failed
+
+        Args:
+            job (Job): _description_
+            queue (Queue): _description_
         """
         self.monitor_started = utcnow()
+        retpid = ret_val = rusage = None
         job.started_at = utcnow()
         while True:
             try:
-                with UnixSignalDeathPenalty(self.job_monitoring_interval, HorseMonitorTimeoutException):
-                    retpid, ret_val = os.waitpid(self._horse_pid, 0)
+                with self.death_penalty_class(self.job_monitoring_interval, HorseMonitorTimeoutException):
+                    retpid, ret_val, rusage = self.wait_for_horse()
                 break
             except HorseMonitorTimeoutException:
                 # Horse has not exited yet and is still running.
                 # Send a heartbeat to keep the worker alive.
-                self.heartbeat(self.job_monitoring_interval + 5)
+                self.set_current_job_working_time((utcnow() - job.started_at).total_seconds())

                 job.refresh()
+                # Kill the job from this side if something is really wrong (interpreter lock/etc).
+                if job.timeout != -1 and self.current_job_working_time > (job.timeout + 60):  # type: ignore
+                    self.heartbeat(self.job_monitoring_interval + 60)
+                    self.kill_horse()
+                    self.wait_for_horse()
+                    break
+
+                self.maintain_heartbeats(job)
                 if job.is_cancelled:
                     self.stop_executing_job(job)

                 if self.soft_limit_exceeded(job):
                     self.enforce_hard_limit(job)
             except OSError as e:
                 # In case we encountered an OSError due to EINTR (which is
                 # caused by a SIGINT or SIGTERM signal during
@@ -149,29 +163,32 @@ class HardLimitingWorker(BaseWorker):
                 # Send a heartbeat to keep the worker alive.
                 self.heartbeat()

+        self.set_current_job_working_time(0)
+        self._horse_pid = 0  # Set horse PID to 0, horse has finished working
         if ret_val == os.EX_OK:  # The process exited normally.
             return

         job_status = job.get_status()
         if job_status is None:  # Job completed and its ttl has expired
             return
-        if job_status not in [JobStatus.FINISHED, JobStatus.FAILED]:
+        elif self._stopped_job_id == job.id:
+            # Work-horse killed deliberately
+            self.log.warning("Job stopped by user, moving job to FailedJobRegistry")
+            if job.stopped_callback:
+                job.execute_stopped_callback(self.death_penalty_class)
+            self.handle_job_failure(job, queue=queue, exc_string="Job stopped by user, work-horse terminated.")
+        elif job_status not in [JobStatus.FINISHED, JobStatus.FAILED]:
             if not job.ended_at:
                 job.ended_at = utcnow()

             # Unhandled failure: move the job to the failed queue
-            self.log.warning(
-                (
-                    "Moving job to FailedJobRegistry "
-                    "(work-horse terminated unexpectedly; waitpid returned {})"
-                ).format(ret_val)  # fmt: skip
-            )
-            self.handle_job_failure(
-                job,
-                queue=queue,
-                exc_string="Work-horse process was terminated unexpectedly "
-                "(waitpid returned %s)" % ret_val,  # fmt: skip
-            )
+            signal_msg = f" (signal {os.WTERMSIG(ret_val)})" if ret_val and os.WIFSIGNALED(ret_val) else ""
+            exc_string = f"Work-horse terminated unexpectedly; waitpid returned {ret_val}{signal_msg}; "
+            self.log.warning("Moving job to FailedJobRegistry (%s)", exc_string)
+            self.handle_work_horse_killed(job, retpid, ret_val, rusage)
+            self.handle_job_failure(job, queue=queue, exc_string=exc_string)


 class RedashWorker(StatsdRecordingWorker, HardLimitingWorker):
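The rewritten failure branch above builds `exc_string` from the raw `waitpid`-style status word. A standalone sketch of that decoding (on a POSIX system; `describe_exit` is a hypothetical helper for illustration, not part of the worker):

```python
import os


def describe_exit(ret_val):
    # Mirrors the worker's logic: os.EX_OK means a clean exit; a
    # signal-terminated status word reports the terminating signal number.
    if ret_val == os.EX_OK:
        return "exited normally"
    signal_msg = f" (signal {os.WTERMSIG(ret_val)})" if ret_val and os.WIFSIGNALED(ret_val) else ""
    return f"Work-horse terminated unexpectedly; waitpid returned {ret_val}{signal_msg}; "


print(describe_exit(os.EX_OK))  # exited normally
print(describe_exit(9))         # SIGKILL status word -> message mentions signal 9
```

A status word of 256 (exit code 1, no signal) produces the message without the signal suffix.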

View File

@@ -42,6 +42,20 @@
     {{ render_field(form.email) }}
     {{ render_field(form.password) }}

+    <div class="checkbox">
+      <label>
+        {{ form.security_notifications() }}
+        Subscribe to Security Notifications
+      </label>
+    </div>
+    <div class="checkbox">
+      <label>
+        {{ form.newsletter() }}
+        Subscribe to newsletter (version updates, no more than once a month)
+      </label>
+    </div>
+
     <h4 class="m-t-25">General</h4>
     {{ render_field(form.org_name, help_block="Used in email notifications and the UI.") }}

View File

@@ -6,6 +6,7 @@ import decimal
 import hashlib
 import io
 import json
+import math
 import os
 import random
 import re
@@ -120,6 +121,17 @@ def json_loads(data, *args, **kwargs):
     return json.loads(data, *args, **kwargs)


+# Convert NaN, Inf, and -Inf to None, as they are not valid JSON values.
+def _sanitize_data(data):
+    if isinstance(data, dict):
+        return {k: _sanitize_data(v) for k, v in data.items()}
+    if isinstance(data, list):
+        return [_sanitize_data(v) for v in data]
+    if isinstance(data, float) and (math.isnan(data) or math.isinf(data)):
+        return None
+    return data
+
+
 def json_dumps(data, *args, **kwargs):
     """A custom JSON dumping function which passes all parameters to the
     json.dumps function."""
@@ -128,7 +140,7 @@ def json_dumps(data, *args, **kwargs):
     # Float value nan or inf in Python should be render to None or null in json.
     # Using allow_nan = True will make Python render nan as NaN, leading to parse error in front-end
     kwargs.setdefault("allow_nan", False)
-    return json.dumps(data, *args, **kwargs)
+    return json.dumps(_sanitize_data(data), *args, **kwargs)


 def mustache_render(template, context=None, **kwargs):
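The sanitizing dump above can be reproduced standalone (reimplementing the hunk's `_sanitize_data` outside redash, so the stdlib `json` module is enough):

```python
import json
import math


def sanitize(data):
    # Recursively replace NaN/Infinity/-Infinity with None so the result
    # is strict JSON, which PostgreSQL's json type will accept.
    if isinstance(data, dict):
        return {k: sanitize(v) for k, v in data.items()}
    if isinstance(data, list):
        return [sanitize(v) for v in data]
    if isinstance(data, float) and (math.isnan(data) or math.isinf(data)):
        return None
    return data


row = {"value": float("nan"), "rows": [1.0, float("inf"), -float("inf")]}
print(json.dumps(sanitize(row), allow_nan=False))
# {"value": null, "rows": [1.0, null, null]}
```

Without the sanitizing pass, `allow_nan=False` would raise `ValueError` on the same input instead of emitting `NaN` literals.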

redash/utils/locks.py Normal file
View File

@@ -0,0 +1,61 @@
+import logging
+import random
+import time
+import uuid
+
+from redis import WatchError
+
+from redash import redis_connection
+
+logger = logging.getLogger(__name__)
+
+
+def acquire_lock(name, acquire_timeout=10, lock_timeout=5):
+    identifier = str(uuid.uuid4())
+    lock_name = f"lock:{name}"
+    end = time.time() + acquire_timeout
+    base_delay = 0.001
+    max_delay = 0.05
+
+    while time.time() < end:
+        if redis_connection.set(lock_name, identifier, ex=lock_timeout, nx=True):
+            logger.info("acquire_lock, lock_name=[%s], identifier=[%s]", lock_name, identifier)
+            return identifier
+        delay = base_delay + random.uniform(0, base_delay)
+        time.sleep(min(delay, max_delay))
+        base_delay = min(base_delay * 2, max_delay)
+    return None
+
+
+def release_lock(name, identifier):
+    lock_name = f"lock:{name}"
+    logger.info("release_lock, lock_name=[%s], identifier=[%s]", lock_name, identifier)
+    with redis_connection.pipeline() as pipe:
+        while True:
+            try:
+                pipe.watch(lock_name)
+                if pipe.get(lock_name) == identifier:
+                    pipe.multi()
+                    pipe.delete(lock_name)
+                    pipe.execute()
+                    logger.info("Lock released successfully, lock_name=[%s], identifier=[%s]", lock_name, identifier)
+                    return True
+                pipe.unwatch()
+                logger.warning(
+                    "Lock not owned by this identifier, lock_name=[%s], identifier=[%s]", lock_name, identifier
+                )
+                break
+            except WatchError:
+                logger.warning(
+                    "WatchError occurred, retrying lock release, lock_name=[%s], identifier=[%s]",
+                    lock_name,
+                    identifier,
+                )
+            except Exception as e:
+                logger.error("Error releasing lock: %s", str(e))
+                break
+    return False
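The lock above boils down to `SET key value NX EX ttl` for acquisition and a compare-before-delete for release, so only the holder of the matching identifier can remove it. A self-contained sketch of that protocol (`FakeRedis` is a stand-in for the real client; the real `release_lock` additionally uses WATCH/MULTI so the check-and-delete is atomic under concurrency):

```python
import uuid


class FakeRedis:
    """Minimal stand-in for the Redis commands the lock uses: SET NX/EX, GET, DELETE."""

    def __init__(self):
        self.store = {}

    def set(self, key, value, nx=False, ex=None):
        if nx and key in self.store:
            return None  # NX: refuse to overwrite an existing key
        self.store[key] = value
        return True

    def get(self, key):
        return self.store.get(key)

    def delete(self, key):
        self.store.pop(key, None)


def acquire(r, name):
    identifier = str(uuid.uuid4())
    if r.set(f"lock:{name}", identifier, nx=True, ex=5):
        return identifier
    return None


def release(r, name, identifier):
    # Only the owner (matching identifier) may delete the lock.
    if r.get(f"lock:{name}") == identifier:
        r.delete(f"lock:{name}")
        return True
    return False


r = FakeRedis()
owner = acquire(r, "query:abc")
assert acquire(r, "query:abc") is None           # a second acquire is rejected
assert release(r, "query:abc", "intruder") is False  # wrong identifier keeps the lock
assert release(r, "query:abc", owner) is True
```

The random identifier is what prevents a client whose lock already expired from deleting a lock that some other client has since acquired.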

View File

@@ -33,7 +33,7 @@ from sqlalchemy.orm import mapperlib
 from sqlalchemy.orm.properties import ColumnProperty
 from sqlalchemy.orm.query import _ColumnEntity
 from sqlalchemy.orm.util import AliasedInsp
-from sqlalchemy.sql.expression import asc, desc
+from sqlalchemy.sql.expression import asc, desc, nullslast


 def get_query_descriptor(query, entity, attr):
@@ -225,7 +225,7 @@ class QuerySorter:
     def assign_order_by(self, entity, attr, func):
         expr = get_query_descriptor(self.query, entity, attr)
         if expr is not None:
-            return self.query.order_by(func(expr))
+            return self.query.order_by(nullslast(func(expr)))
         if not self.silent:
             raise QuerySorterException("Could not sort query with expression '%s'" % attr)
         return self.query
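The NULLS LAST ordering that `nullslast(func(expr))` requests from the database can be illustrated in plain Python (illustration of the semantics only; the actual change is the SQLAlchemy expression above):

```python
# NULLS LAST: rows whose sort key is NULL (None) come after all non-NULL
# values. The boolean first element of the key tuple pushes None to the end.
rows = [3, None, 1, None, 2]
ascending = sorted(rows, key=lambda v: (v is None, 0 if v is None else v))
print(ascending)  # [1, 2, 3, None, None]
```

Without this, PostgreSQL's default of NULLS LAST for ascending but NULLS FIRST for descending makes NULL rows jump to the top whenever a query list is sorted in descending order.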

redash/version_check.py Normal file
View File

@@ -0,0 +1,103 @@
+import logging
+
+import requests
+import semver
+
+from redash import __version__ as current_version
+from redash import redis_connection
+from redash.models import Organization, db
+
+REDIS_KEY = "new_version_available"
+
+
+def usage_data():
+    counts_query = """
+    SELECT 'users_count' as name, count(0) as value
+    FROM users
+    WHERE disabled_at is null
+
+    UNION ALL
+
+    SELECT 'queries_count' as name, count(0) as value
+    FROM queries
+    WHERE is_archived is false
+
+    UNION ALL
+
+    SELECT 'alerts_count' as name, count(0) as value
+    FROM alerts
+
+    UNION ALL
+
+    SELECT 'dashboards_count' as name, count(0) as value
+    FROM dashboards
+    WHERE is_archived is false
+
+    UNION ALL
+
+    SELECT 'widgets_count' as name, count(0) as value
+    FROM widgets
+    WHERE visualization_id is not null
+
+    UNION ALL
+
+    SELECT 'textbox_count' as name, count(0) as value
+    FROM widgets
+    WHERE visualization_id is null
+    """
+
+    data_sources_query = "SELECT type, count(0) FROM data_sources GROUP by 1"
+    visualizations_query = "SELECT type, count(0) FROM visualizations GROUP by 1"
+    destinations_query = "SELECT type, count(0) FROM notification_destinations GROUP by 1"
+
+    data = {name: value for (name, value) in db.session.execute(counts_query)}
+    data["data_sources"] = {name: value for (name, value) in db.session.execute(data_sources_query)}
+    data["visualization_types"] = {name: value for (name, value) in db.session.execute(visualizations_query)}
+    data["destination_types"] = {name: value for (name, value) in db.session.execute(destinations_query)}
+
+    return data
+
+
+def run_version_check():
+    logging.info("Performing version check.")
+    logging.info("Current version: %s", current_version)
+
+    data = {"current_version": current_version}
+    if Organization.query.first().get_setting("beacon_consent"):
+        data["usage"] = usage_data()
+
+    try:
+        response = requests.post(
+            "https://version.redash.io/api/report?channel=stable",
+            json=data,
+            timeout=3.0,
+        )
+        latest_version = response.json()["release"]["version"]
+
+        _compare_and_update(latest_version)
+    except requests.RequestException:
+        logging.exception("Failed checking for new version.")
+    except (ValueError, KeyError):
+        logging.exception("Failed checking for new version (probably bad/non-JSON response).")
+
+
+def reset_new_version_status():
+    latest_version = get_latest_version()
+    if latest_version:
+        _compare_and_update(latest_version)
+
+
+def get_latest_version():
+    return redis_connection.get(REDIS_KEY)
+
+
+def _compare_and_update(latest_version):
+    # TODO: support alpha channel (allow setting which channel to check & parse build number)
+    is_newer = semver.compare(current_version, latest_version) == -1
+    logging.info("Latest version: %s (newer: %s)", latest_version, is_newer)
+
+    if is_newer:
+        redis_connection.set(REDIS_KEY, latest_version)
+    else:
+        redis_connection.delete(REDIS_KEY)
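`_compare_and_update` above relies on `semver.compare(current, latest) == -1` to decide whether a newer release exists. A dependency-free sketch of that comparison for plain MAJOR.MINOR.PATCH strings (unlike the real `semver` package, this handles no pre-release or build tags):

```python
def is_newer(current, latest):
    # Simplified stand-in for semver.compare(current, latest) == -1:
    # split each version into an integer tuple and compare lexicographically.
    parse = lambda v: tuple(int(part) for part in v.split("."))
    return parse(current) < parse(latest)


print(is_newer("10.1.0", "25.3.0"))  # True
print(is_newer("25.3.0", "25.3.0"))  # False
```

Tuple comparison is what makes "10.1.0" correctly sort below "25.3.0"; naive string comparison would rank "10..." above "2..." and "9...".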

View File

@@ -261,15 +261,19 @@ def test_webex_notify_calls_requests_post():
    alert.name = "Test Alert"
    alert.custom_subject = "Test custom subject"
    alert.custom_body = "Test custom body"
    alert.render_template = mock.Mock(return_value={"Rendered": "template"})

    query = mock.Mock()
    query.id = 1

    user = mock.Mock()
    app = mock.Mock()
    host = "https://localhost:5000"
-    options = {"webex_bot_token": "abcd", "to_room_ids": "1234"}
+    options = {
+        "webex_bot_token": "abcd",
+        "to_room_ids": "1234,5678",
+        "to_person_emails": "example1@test.com,example2@test.com",
+    }
    metadata = {"Scheduled": False}

    new_state = Alert.TRIGGERED_STATE
@@ -277,7 +281,7 @@ def test_webex_notify_calls_requests_post():
    with mock.patch("redash.destinations.webex.requests.post") as mock_post:
        mock_response = mock.Mock()
-        mock_response.status_code = 204
+        mock_response.status_code = 200
        mock_post.return_value = mock_response

        destination.notify(alert, query, user, new_state, app, host, metadata, options)
@@ -285,13 +289,111 @@ def test_webex_notify_calls_requests_post():
        query_link = f"{host}/queries/{query.id}"
        alert_link = f"{host}/alerts/{alert.id}"

-        formatted_attachments = Webex.formatted_attachments_template(
+        expected_attachments = Webex.formatted_attachments_template(
+            alert.custom_subject, alert.custom_body, query_link, alert_link
+        )
+
+        expected_payload_room = {
+            "markdown": alert.custom_subject + "\n" + alert.custom_body,
+            "attachments": expected_attachments,
+            "roomId": "1234",
+        }
+        expected_payload_email = {
+            "markdown": alert.custom_subject + "\n" + alert.custom_body,
+            "attachments": expected_attachments,
+            "toPersonEmail": "example1@test.com",
+        }
+
+        # Check that requests.post was called for both roomId and toPersonEmail destinations
+        mock_post.assert_any_call(
+            destination.api_base_url,
+            json=expected_payload_room,
+            headers={"Authorization": "Bearer abcd"},
+            timeout=5.0,
+        )
+        mock_post.assert_any_call(
+            destination.api_base_url,
+            json=expected_payload_email,
+            headers={"Authorization": "Bearer abcd"},
+            timeout=5.0,
+        )
+
+        assert mock_response.status_code == 200
+
+
+def test_webex_notify_handles_blank_entries():
+    alert = mock.Mock(spec_set=["id", "name", "custom_subject", "custom_body", "render_template"])
+    alert.id = 1
+    alert.name = "Test Alert"
+    alert.custom_subject = "Test custom subject"
+    alert.custom_body = "Test custom body"
+    alert.render_template = mock.Mock(return_value={"Rendered": "template"})
+
+    query = mock.Mock()
+    query.id = 1
+
+    user = mock.Mock()
+    app = mock.Mock()
+    host = "https://localhost:5000"
+    options = {
+        "webex_bot_token": "abcd",
+        "to_room_ids": "",
+        "to_person_emails": "",
+    }
+    metadata = {"Scheduled": False}
+
+    new_state = Alert.TRIGGERED_STATE
+    destination = Webex(options)
+
+    with mock.patch("redash.destinations.webex.requests.post") as mock_post:
+        destination.notify(alert, query, user, new_state, app, host, metadata, options)
+
+        # Ensure no API calls are made when destinations are blank
+        mock_post.assert_not_called()
+
+
+def test_webex_notify_handles_2d_array():
+    alert = mock.Mock(spec_set=["id", "name", "custom_subject", "custom_body", "render_template"])
+    alert.id = 1
+    alert.name = "Test Alert"
+    alert.custom_subject = "Test custom subject"
+    alert.custom_body = "Test custom body with table [['Col1', 'Col2'], ['Val1', 'Val2']]"
+    alert.render_template = mock.Mock(return_value={"Rendered": "template"})
+
+    query = mock.Mock()
+    query.id = 1
+
+    user = mock.Mock()
+    app = mock.Mock()
+    host = "https://localhost:5000"
+    options = {
+        "webex_bot_token": "abcd",
+        "to_room_ids": "1234",
+    }
+    metadata = {"Scheduled": False}
+
+    new_state = Alert.TRIGGERED_STATE
+    destination = Webex(options)
+
+    with mock.patch("redash.destinations.webex.requests.post") as mock_post:
+        mock_response = mock.Mock()
+        mock_response.status_code = 200
+        mock_post.return_value = mock_response
+
+        destination.notify(alert, query, user, new_state, app, host, metadata, options)
+
+        query_link = f"{host}/queries/{query.id}"
+        alert_link = f"{host}/alerts/{alert.id}"
+
+        expected_attachments = Webex.formatted_attachments_template(
            alert.custom_subject, alert.custom_body, query_link, alert_link
        )

        expected_payload = {
            "markdown": alert.custom_subject + "\n" + alert.custom_body,
-            "attachments": formatted_attachments,
+            "attachments": expected_attachments,
            "roomId": "1234",
        }
@@ -302,7 +404,60 @@ def test_webex_notify_calls_requests_post():
            timeout=5.0,
        )

-        assert mock_response.status_code == 204
+        assert mock_response.status_code == 200
+
+
+def test_webex_notify_handles_1d_array():
+    alert = mock.Mock(spec_set=["id", "name", "custom_subject", "custom_body", "render_template"])
+    alert.id = 1
+    alert.name = "Test Alert"
+    alert.custom_subject = "Test custom subject"
+    alert.custom_body = "Test custom body with 1D array, however unlikely ['Col1', 'Col2']"
+    alert.render_template = mock.Mock(return_value={"Rendered": "template"})
+
+    query = mock.Mock()
+    query.id = 1
+
+    user = mock.Mock()
+    app = mock.Mock()
+    host = "https://localhost:5000"
+    options = {
+        "webex_bot_token": "abcd",
+        "to_room_ids": "1234",
+    }
+    metadata = {"Scheduled": False}
+
+    new_state = Alert.TRIGGERED_STATE
+    destination = Webex(options)
+
+    with mock.patch("redash.destinations.webex.requests.post") as mock_post:
+        mock_response = mock.Mock()
+        mock_response.status_code = 200
+        mock_post.return_value = mock_response
+
+        destination.notify(alert, query, user, new_state, app, host, metadata, options)
+
+        query_link = f"{host}/queries/{query.id}"
+        alert_link = f"{host}/alerts/{alert.id}"
+
+        expected_attachments = Webex.formatted_attachments_template(
+            alert.custom_subject, alert.custom_body, query_link, alert_link
+        )
+
+        expected_payload = {
+            "markdown": alert.custom_subject + "\n" + alert.custom_body,
+            "attachments": expected_attachments,
+            "roomId": "1234",
+        }
+
+        mock_post.assert_called_once_with(
+            destination.api_base_url,
+            json=expected_payload,
+            headers={"Authorization": "Bearer abcd"},
+            timeout=5.0,
+        )
+
+        assert mock_response.status_code == 200


 def test_datadog_notify_calls_requests_post():

View File

@@ -71,23 +71,56 @@ class TestAlertEvaluate(BaseTestCase):
        alert = self.create_alert(results)
        self.assertEqual(alert.evaluate(), Alert.UNKNOWN_STATE)

-    def test_evaluates_correctly_with_max_selector(self):
-        results = {"rows": [{"foo": 1}, {"foo": 2}], "columns": [{"name": "foo", "type": "STRING"}]}
-        alert = self.create_alert(results)
-        alert.options["selector"] = "max"
-        self.assertEqual(alert.evaluate(), Alert.OK_STATE)
-
-    def test_evaluates_correctly_with_min_selector(self):
-        results = {"rows": [{"foo": 2}, {"foo": 1}], "columns": [{"name": "foo", "type": "STRING"}]}
-        alert = self.create_alert(results)
-        alert.options["selector"] = "min"
-        self.assertEqual(alert.evaluate(), Alert.TRIGGERED_STATE)
-
    def test_evaluates_correctly_with_first_selector(self):
-        results = {"rows": [{"foo": 1}, {"foo": 2}], "columns": [{"name": "foo", "type": "STRING"}]}
+        results = {"rows": [{"foo": 1}, {"foo": 2}], "columns": [{"name": "foo", "type": "INTEGER"}]}
        alert = self.create_alert(results)
        alert.options["selector"] = "first"
        self.assertEqual(alert.evaluate(), Alert.TRIGGERED_STATE)
+        results = {
+            "rows": [{"foo": "test"}, {"foo": "test"}, {"foo": "test"}],
+            "columns": [{"name": "foo", "type": "STRING"}],
+        }
+        alert = self.create_alert(results)
+        alert.options["selector"] = "first"
+        alert.options["op"] = "<"
+        self.assertEqual(alert.evaluate(), Alert.UNKNOWN_STATE)
+
+    def test_evaluates_correctly_with_min_selector(self):
+        results = {"rows": [{"foo": 2}, {"foo": 1}], "columns": [{"name": "foo", "type": "INTEGER"}]}
+        alert = self.create_alert(results)
+        alert.options["selector"] = "min"
+        self.assertEqual(alert.evaluate(), Alert.TRIGGERED_STATE)
+        results = {
+            "rows": [{"foo": "test"}, {"foo": "test"}, {"foo": "test"}],
+            "columns": [{"name": "foo", "type": "STRING"}],
+        }
+        alert = self.create_alert(results)
+        alert.options["selector"] = "min"
+        self.assertEqual(alert.evaluate(), Alert.UNKNOWN_STATE)
+
+    def test_evaluates_correctly_with_max_selector(self):
+        results = {"rows": [{"foo": 1}, {"foo": 2}], "columns": [{"name": "foo", "type": "INTEGER"}]}
+        alert = self.create_alert(results)
+        alert.options["selector"] = "max"
+        self.assertEqual(alert.evaluate(), Alert.OK_STATE)
+        results = {
+            "rows": [{"foo": "test"}, {"foo": "test"}, {"foo": "test"}],
+            "columns": [{"name": "foo", "type": "STRING"}],
+        }
+        alert = self.create_alert(results)
+        alert.options["selector"] = "max"
+        self.assertEqual(alert.evaluate(), Alert.UNKNOWN_STATE)
+
+    def test_evaluate_alerts_without_query_rel(self):
+        query = self.factory.create_query(latest_query_data_id=None)
+        alert = self.factory.create_alert(
+            query_rel=query, options={"selector": "first", "op": "equals", "column": "foo", "value": "1"}
+        )
+        self.assertEqual(alert.evaluate(), Alert.UNKNOWN_STATE)
+
+    def test_evaluate_return_unknown_when_value_is_none(self):
+        alert = self.create_alert(get_results(None))
+        self.assertEqual(alert.evaluate(), Alert.UNKNOWN_STATE)


 class TestNextState(TestCase):

View File

@@ -75,7 +75,9 @@ class TestGlueSchema(TestCase):
            {"DatabaseName": "test1"},
        )
        with self.stubber:
-            assert query_runner.get_schema() == [{"columns": ["row_id"], "name": "test1.jdbc_table"}]
+            assert query_runner.get_schema() == [
+                {"columns": [{"name": "row_id", "type": "int"}], "name": "test1.jdbc_table"}
+            ]

    def test_partitioned_table(self):
        """
@@ -124,7 +126,12 @@ class TestGlueSchema(TestCase):
            {"DatabaseName": "test1"},
        )
        with self.stubber:
-            assert query_runner.get_schema() == [{"columns": ["sk", "category"], "name": "test1.partitioned_table"}]
+            assert query_runner.get_schema() == [
+                {
+                    "columns": [{"name": "sk", "type": "int"}, {"name": "category", "type": "int"}],
+                    "name": "test1.partitioned_table",
+                }
+            ]

    def test_view(self):
        query_runner = Athena({"glue": True, "region": "mars-east-1"})
@@ -156,7 +163,7 @@ class TestGlueSchema(TestCase):
            {"DatabaseName": "test1"},
        )
        with self.stubber:
-            assert query_runner.get_schema() == [{"columns": ["sk"], "name": "test1.view"}]
+            assert query_runner.get_schema() == [{"columns": [{"name": "sk", "type": "int"}], "name": "test1.view"}]

    def test_dodgy_table_does_not_break_schema_listing(self):
        """
@@ -196,7 +203,9 @@ class TestGlueSchema(TestCase):
            {"DatabaseName": "test1"},
        )
        with self.stubber:
-            assert query_runner.get_schema() == [{"columns": ["region"], "name": "test1.csv"}]
+            assert query_runner.get_schema() == [
+                {"columns": [{"name": "region", "type": "string"}], "name": "test1.csv"}
+            ]

    def test_no_storage_descriptor_table(self):
        """
@@ -312,6 +321,6 @@ class TestGlueSchema(TestCase):
        )
        with self.stubber:
            assert query_runner.get_schema() == [
-                {"columns": ["row_id"], "name": "test1.jdbc_table"},
-                {"columns": ["row_id"], "name": "test2.jdbc_table"},
+                {"columns": [{"name": "row_id", "type": "int"}], "name": "test1.jdbc_table"},
+                {"columns": [{"name": "row_id", "type": "int"}], "name": "test2.jdbc_table"},
            ]

View File

@@ -39,18 +39,8 @@ class TestMongoDB(TestCase):
self.assertNotIn("password", mongo_client.call_args.kwargs) self.assertNotIn("password", mongo_client.call_args.kwargs)
def test_run_query_with_fields(self, mongo_client): def test_run_query_with_fields(self, mongo_client):
config = {
"connectionString": "mongodb://localhost:27017/test",
"username": "test_user",
"password": "test_pass",
"dbName": "test",
}
mongo_qr = MongoDB(config)
query = {"collection": "test", "query": {"age": 10}, "fields": {"_id": 1, "name": 2}}
return_value = [{"_id": "6569ee53d53db7930aaa0cc0", "name": "test2"}]
expected = {
"columns": [
{"name": "_id", "friendly_name": "_id", "type": TYPE_STRING},
@@ -60,20 +50,28 @@ class TestMongoDB(TestCase):
}
mongo_client().__getitem__().__getitem__().find.return_value = return_value
-result, err = mongo_qr.run_query(json_dumps(query), None)
-self.assertIsNone(err)
-self.assertEqual(expected, result)
+self._test_query(query, return_value, expected)
def test_run_query_with_func(self, mongo_client):
query = {
"collection": "test",
"query": {"age": 10},
"fields": {"_id": 1, "name": 4, "link": {"$concat": ["hoge_", "$name"]}},
}
return_value = [{"_id": "6569ee53d53db7930aaa0cc0", "name": "test2", "link": "hoge_test2"}]
expected = {
"columns": [
{"name": "_id", "friendly_name": "_id", "type": TYPE_STRING},
{"name": "link", "friendly_name": "link", "type": TYPE_STRING},
{"name": "name", "friendly_name": "name", "type": TYPE_STRING},
],
"rows": return_value,
}
mongo_client().__getitem__().__getitem__().find.return_value = return_value
self._test_query(query, return_value, expected)
def test_run_query_with_aggregate(self, mongo_client):
config = {
"connectionString": "mongodb://localhost:27017/test",
"username": "test_user",
"password": "test_pass",
"dbName": "test",
}
mongo_qr = MongoDB(config)
query = {
"collection": "test",
"aggregate": [
@@ -82,9 +80,7 @@ class TestMongoDB(TestCase):
{"$sort": [{"name": "count", "direction": -1}, {"name": "_id", "direction": -1}]},
],
}
return_value = [{"_id": "foo", "count": 10}, {"_id": "bar", "count": 9}]
expected = {
"columns": [
{"name": "_id", "friendly_name": "_id", "type": TYPE_STRING},
@@ -94,6 +90,17 @@ class TestMongoDB(TestCase):
}
mongo_client().__getitem__().__getitem__().aggregate.return_value = return_value
self._test_query(query, return_value, expected)
def _test_query(self, query, return_value, expected):
config = {
"connectionString": "mongodb://localhost:27017/test",
"username": "test_user",
"password": "test_pass",
"dbName": "test",
}
mongo_qr = MongoDB(config)
result, err = mongo_qr.run_query(json_dumps(query), None)
self.assertIsNone(err)
self.assertEqual(expected, result)


@@ -25,3 +25,19 @@ class TestBuildSchema(TestCase):
self.assertListEqual(schema["main.users"]["columns"], ["id", "name"])
self.assertIn('public."main.users"', schema.keys())
self.assertListEqual(schema['public."main.users"']["columns"], ["id"])
def test_build_schema_with_data_types(self):
results = {
"rows": [
{"table_schema": "main", "table_name": "users", "column_name": "id", "data_type": "integer"},
{"table_schema": "main", "table_name": "users", "column_name": "name", "data_type": "varchar"},
]
}
schema = {}
build_schema(results, schema)
self.assertListEqual(
schema["main.users"]["columns"], [{"name": "id", "type": "integer"}, {"name": "name", "type": "varchar"}]
)


@@ -216,6 +216,20 @@ class QueryOutdatedQueriesTest(BaseTestCase):
self.assertEqual(list(models.Query.outdated_queries()), [query2])
def test_enqueues_scheduled_query_without_latest_query_data(self):
"""
Queries with a schedule but no latest_query_data will still be reported by Query.outdated_queries()
"""
query = self.factory.create_query(
schedule=self.schedule(interval="60"),
data_source=self.factory.create_data_source(),
)
outdated_queries = models.Query.outdated_queries()
self.assertEqual(query.latest_query_data, None)
self.assertEqual(len(outdated_queries), 1)
self.assertIn(query, outdated_queries)
def test_enqueues_query_with_correct_data_source(self):
"""
Queries from different data sources will be reported by

tests/test_monitor.py

@@ -0,0 +1,23 @@
from unittest.mock import MagicMock, patch
from redash import rq_redis_connection
from redash.monitor import rq_job_ids
def test_rq_job_ids_uses_rq_redis_connection():
mock_queue = MagicMock()
mock_queue.job_ids = []
mock_registry = MagicMock()
mock_registry.get_job_ids.return_value = []
with patch("redash.monitor.Queue") as mock_Queue, patch(
"redash.monitor.StartedJobRegistry"
) as mock_StartedJobRegistry:
mock_Queue.all.return_value = [mock_queue]
mock_StartedJobRegistry.return_value = mock_registry
rq_job_ids()
mock_Queue.all.assert_called_once_with(connection=rq_redis_connection)
mock_StartedJobRegistry.assert_called_once_with(queue=mock_queue)


@@ -0,0 +1,31 @@
from redash.utils import json_dumps, json_loads
from tests import BaseTestCase
class TestJsonDumps(BaseTestCase):
"""
NaN, Inf, and -Inf are sanitized to None.
"""
def test_data_with_nan_is_sanitized(self):
input_data = {
"columns": [
{"name": "_col0", "friendly_name": "_col0", "type": "float"},
{"name": "_col1", "friendly_name": "_col1", "type": "float"},
{"name": "_col2", "friendly_name": "_col1", "type": "float"},
{"name": "_col3", "friendly_name": "_col1", "type": "float"},
],
"rows": [{"_col0": 1.0, "_col1": float("nan"), "_col2": float("inf"), "_col3": float("-inf")}],
}
expected_output_data = {
"columns": [
{"name": "_col0", "friendly_name": "_col0", "type": "float"},
{"name": "_col1", "friendly_name": "_col1", "type": "float"},
{"name": "_col2", "friendly_name": "_col1", "type": "float"},
{"name": "_col3", "friendly_name": "_col1", "type": "float"},
],
"rows": [{"_col0": 1.0, "_col1": None, "_col2": None, "_col3": None}],
}
json_data = json_dumps(input_data)
actual_output_data = json_loads(json_data)
self.assertEquals(actual_output_data, expected_output_data)
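The sanitization this test exercises can be sketched as a small recursive pass run before serialization. This is a hypothetical stand-in for Redash's `json_dumps` hook, not its actual implementation; the helper name `sanitize` is an assumption:

```python
import json
import math

def sanitize(value):
    # Hypothetical sketch: PostgreSQL's JSON type rejects NaN/Infinity/-Infinity
    # tokens, so non-finite floats are replaced with None (serialized as null).
    if isinstance(value, float) and not math.isfinite(value):
        return None
    if isinstance(value, dict):
        return {key: sanitize(item) for key, item in value.items()}
    if isinstance(value, list):
        return [sanitize(item) for item in value]
    return value

row = {"_col0": 1.0, "_col1": float("nan"), "_col2": float("inf"), "_col3": float("-inf")}
print(json.dumps(sanitize(row)))  # {"_col0": 1.0, "_col1": null, "_col2": null, "_col3": null}
```

Running the sanitizer before `json.dumps` keeps the output strict JSON, which is what the PostgreSQL `json` column requires.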


@@ -46,7 +46,7 @@
"@types/jest": "^26.0.18",
"@types/leaflet": "^1.5.19",
"@types/numeral": "0.0.28",
-"@types/plotly.js": "^1.54.22",
+"@types/plotly.js": "^2.35.2",
"@types/react": "^17.0.0",
"@types/react-dom": "^17.0.0",
"@types/tinycolor2": "^1.4.2",
@@ -91,7 +91,7 @@
"leaflet.markercluster": "^1.1.0",
"lodash": "^4.17.10",
"numeral": "^2.0.6",
-"plotly.js": "1.58.5",
+"plotly.js": "2.35.3",
"react-pivottable": "^0.9.0",
"react-sortable-hoc": "^1.10.1",
"tinycolor2": "^1.4.1",


@@ -1,6 +1,6 @@
import { values } from "lodash";
-// The following colors will be used if you pick "Automatic" color
+// Define color palettes
export const BaseColors = {
Blue: "#356AFF",
Red: "#E92828",
@@ -28,11 +28,78 @@ export const AdditionalColors = {
"Pink 2": "#C63FA9",
};
-export const ColorPaletteArray = values(BaseColors);
const Viridis = {
1: '#440154',
2: '#48186a',
3: '#472d7b',
4: '#424086',
5: '#3b528b',
6: '#33638d',
7: '#2c728e',
8: '#26828e',
9: '#21918c',
10: '#1fa088',
11: '#28ae80',
12: '#3fbc73',
13: '#5ec962',
14: '#84d44b',
15: '#addc30',
16: '#d8e219',
17: '#fde725',
};
-const ColorPalette = {
const Tableau = {
1 : "#4e79a7",
2 : "#f28e2c",
3 : "#e15759",
4 : "#76b7b2",
5 : "#59a14f",
6 : "#edc949",
7 : "#af7aa1",
8 : "#ff9da7",
9 : "#9c755f",
10 : "#bab0ab",
}
const D3Category10 = {
1 : "#1f77b4",
2 : "#ff7f0e",
3 : "#2ca02c",
4 : "#d62728",
5 : "#9467bd",
6 : "#8c564b",
7 : "#e377c2",
8 : "#7f7f7f",
9 : "#bcbd22",
10 : "#17becf",
}
let ColorPalette = {
...BaseColors,
...AdditionalColors,
};
export const ColorPaletteArray = values(ColorPalette);
export default ColorPalette;
export const AllColorPalettes = {
"Redash" : ColorPalette,
"Viridis" : Viridis,
"Tableau 10" : Tableau,
"D3 Category 10" : D3Category10,
}
export const AllColorPaletteArrays = {
"Redash" : ColorPaletteArray,
"Viridis" : values(Viridis),
"Tableau 10" : values(Tableau),
"D3 Category 10" : values(D3Category10),
};
export const ColorPaletteTypes = {
"Redash" : 'discrete',
"Viridis" : 'continuous',
"Tableau 10" : 'discrete',
"D3 Category 10" : 'discrete',
}


@@ -3,16 +3,18 @@ import React, { useMemo, useCallback } from "react";
import Table from "antd/lib/table";
import ColorPicker from "@/components/ColorPicker";
import { EditorPropTypes } from "@/visualizations/prop-types";
-import ColorPalette from "@/visualizations/ColorPalette";
+import { AllColorPalettes } from "@/visualizations/ColorPalette";
import getChartData from "../getChartData";
import { Section, Select } from "@/components/visualizations/editor";
export default function DefaultColorsSettings({ options, data, onOptionsChange }: any) {
const colors = useMemo(
() => ({
Automatic: null,
-...ColorPalette,
+// @ts-expect-error ts-migrate(7053) FIXME: Element implicitly has an 'any' type because expre... Remove this comment to see the full error message
...AllColorPalettes[options.color_scheme],
}),
-[]
+[options.color_scheme]
);
const series = useMemo(
@@ -67,8 +69,25 @@ export default function DefaultColorsSettings({ options, data, onOptionsChange }
},
];
-// @ts-expect-error ts-migrate(2322) FIXME: Type 'boolean[]' is not assignable to type 'object... Remove this comment to see the full error message
-return <Table showHeader={false} dataSource={series} columns={columns} pagination={false} />;
+return (
+<React.Fragment>
{/* @ts-expect-error ts-migrate(2745) FIXME: This JSX tag's 'children' prop expects type 'never... Remove this comment to see the full error message */}
<Section>
<Select
label="Color Scheme"
defaultValue={options.color_scheme}
data-test="ColorScheme"
onChange={(val : any) => onOptionsChange({ color_scheme: val })}>
{Object.keys(AllColorPalettes).map(option => (
// @ts-expect-error ts-migrate(2339) FIXME: Property 'Option' does not exist on type '({ class... Remove this comment to see the full error message
<Select.Option data-test={`ColorOption${option}`} key={option} value={option}>{option}</Select.Option>
))}
</Select>
</Section>
{/* @ts-expect-error ts-migrate(2322) FIXME: Type 'boolean[]' is not assignable to type 'object... Remove this comment to see the full error message */}
<Table showHeader={false} dataSource={series} columns={columns} pagination={false} />
</React.Fragment>
)
}
DefaultColorsSettings.propTypes = EditorPropTypes;


@@ -3,7 +3,7 @@ import React, { useMemo } from "react";
import { Section, Select, Checkbox, InputNumber, ContextHelp, Input } from "@/components/visualizations/editor";
import { UpdateOptionsStrategy } from "@/components/visualizations/editor/createTabbedEditor";
import { EditorPropTypes } from "@/visualizations/prop-types";
import { AllColorPalettes } from "@/visualizations/ColorPalette";
import ChartTypeSelect from "./ChartTypeSelect";
import ColumnMappingSelect from "./ColumnMappingSelect";
import { useDebouncedCallback } from "use-debounce/lib";


@@ -3,8 +3,9 @@ import React, { useMemo, useCallback } from "react";
import Table from "antd/lib/table";
import ColorPicker from "@/components/ColorPicker";
import { EditorPropTypes } from "@/visualizations/prop-types";
-import ColorPalette from "@/visualizations/ColorPalette";
+import { AllColorPalettes } from "@/visualizations/ColorPalette";
import getChartData from "../getChartData";
import { Section, Select } from "@/components/visualizations/editor";
function getUniqueValues(chartData: any) {
const uniqueValuesNames = new Set();
@@ -20,9 +21,10 @@ export default function PieColorsSettings({ options, data, onOptionsChange }: an
const colors = useMemo(
() => ({
Automatic: null,
-...ColorPalette,
+// @ts-expect-error ts-migrate(7053) FIXME: Element implicitly has an 'any' type because expre... Remove this comment to see the full error message
...AllColorPalettes[options.color_scheme],
}),
-[]
+[options.color_scheme]
);
const series = useMemo(
@@ -78,7 +80,24 @@ export default function PieColorsSettings({ options, data, onOptionsChange }: an
},
];
-return <Table showHeader={false} dataSource={series} columns={columns} pagination={false} />;
+return (
<React.Fragment>
{/* @ts-expect-error ts-migrate(2745) FIXME: This JSX tag's 'children' prop expects type 'never... Remove this comment to see the full error message */}
<Section>
<Select
label="Color Scheme"
defaultValue={options.color_scheme}
data-test="ColorScheme"
onChange={(val : any) => onOptionsChange({ color_scheme: val })}>
{Object.keys(AllColorPalettes).map(option => (
// @ts-expect-error ts-migrate(2339) FIXME: Property 'Option' does not exist on type '({ class... Remove this comment to see the full error message
<Select.Option data-test={`ColorOption${option}`} key={option} value={option}>{option}</Select.Option>
))}
</Select>
</Section>
<Table showHeader={false} dataSource={series} columns={columns} pagination={false} />
</React.Fragment>
)
}
PieColorsSettings.propTypes = EditorPropTypes;


@@ -17,6 +17,7 @@ const DEFAULT_OPTIONS = {
sizemode: "diameter",
coefficient: 1,
piesort: true,
color_scheme: "Redash",
// showDataLabels: false, // depends on chart type
numberFormat: "0,0[.]00000",


@@ -14,7 +14,8 @@
"columnMapping": {
"x": "x",
"y": "y"
-}
+},
"color_scheme": "Redash"
},
"data": [
{
@@ -47,7 +48,8 @@
"textfont": { "color": ["#ffffff", "#ffffff", "#333333", "#ffffff"] },
"name": "a",
"direction": "counterclockwise",
-"domain": { "x": [0, 0.98], "y": [0, 0.9] }
+"domain": { "x": [0, 0.98], "y": [0, 0.9] },
"color_scheme": "Redash"
}
]
}


@@ -14,7 +14,8 @@
"columnMapping": {
"x": "x",
"y": "y"
-}
+},
"color_scheme": "Redash"
},
"data": [
{
@@ -47,7 +48,8 @@
"textfont": { "color": ["#ffffff", "#ffffff", "#333333", "#ffffff"] },
"name": "a",
"direction": "counterclockwise",
-"domain": { "x": [0, 0.98], "y": [0, 0.9] }
+"domain": { "x": [0, 0.98], "y": [0, 0.9] },
"color_scheme": "Redash"
}
]
}


@@ -14,7 +14,8 @@
"columnMapping": {
"x": "x",
"y": "y"
-}
+},
"color_scheme": "Redash"
},
"data": [
{
@@ -47,7 +48,8 @@
"textfont": { "color": ["#ffffff", "#ffffff", "#333333", "#ffffff"] },
"name": "a",
"direction": "counterclockwise",
-"domain": { "x": [0, 0.98], "y": [0, 0.9] }
+"domain": { "x": [0, 0.98], "y": [0, 0.9] },
"color_scheme": "Redash"
}
]
}


@@ -10,7 +10,8 @@
"direction": { "type": "counterclockwise" },
"xAxis": { "type": "-", "labels": { "enabled": true } },
"yAxis": [{ "type": "linear" }, { "type": "linear", "opposite": true }],
-"series": { "stacking": null, "error_y": { "type": "data", "visible": true } }
+"series": { "stacking": null, "error_y": { "type": "data", "visible": true } },
"color_scheme": "Redash"
},
"data": [
{
@@ -43,7 +44,8 @@
"textfont": { "color": ["#ffffff"] },
"name": "a",
"direction": "counterclockwise",
-"domain": { "x": [0, 0.98], "y": [0, 0.9] }
+"domain": { "x": [0, 0.98], "y": [0, 0.9] },
"color_scheme": "Redash"
}
]
}


@@ -27,15 +27,17 @@
"automargin": true,
"showticklabels": true,
"title": null,
"tickformat": null,
"type": "-"
},
"yaxis": {
"automargin": true,
"title": null,
"tickformat": null,
"type": "linear",
"autorange": true,
"range": null
},
"hoverlabel": {
"namelength": -1
}


@@ -30,11 +30,13 @@
"automargin": true,
"showticklabels": true,
"title": null,
"tickformat": null,
"type": "-"
},
"yaxis": {
"automargin": true,
"title": null,
"tickformat": null,
"type": "linear",
"autorange": true,
"range": null
@@ -42,12 +44,13 @@
"yaxis2": {
"automargin": true,
"title": null,
"tickformat": null,
"type": "linear",
"autorange": true,
"range": null,
"overlaying": "y",
"side": "right"
},
"hoverlabel": {
"namelength": -1
}


@@ -25,18 +25,21 @@
"automargin": true,
"showticklabels": true,
"title": null,
"tickformat": null,
"type": "-"
},
"yaxis": {
"automargin": true,
"title": null,
"tickformat": null,
"type": "linear",
"autorange": true,
"range": null
},
"hoverlabel": {
"namelength": -1
-}
+},
"hovermode": "x"
}
}
}


@@ -28,11 +28,13 @@
"automargin": true,
"showticklabels": true,
"title": null,
"tickformat": null,
"type": "-"
},
"yaxis": {
"automargin": true,
"title": null,
"tickformat": null,
"type": "linear",
"autorange": true,
"range": null
@@ -40,15 +42,17 @@
"yaxis2": {
"automargin": true,
"title": null,
"tickformat": null,
"type": "linear",
"autorange": true,
"range": null,
"overlaying": "y",
"side": "right"
},
"hoverlabel": {
"namelength": -1
-}
+},
"hovermode": "x"
}
}
}


@@ -24,18 +24,21 @@
"automargin": true,
"showticklabels": true,
"title": null,
"tickformat": null,
"type": "-"
},
"yaxis": {
"automargin": true,
"title": null,
"tickformat": null,
"type": "linear",
"autorange": true,
"range": null
},
"hoverlabel": {
"namelength": -1
-}
+},
"hovermode": "x"
}
}
}


@@ -23,18 +23,21 @@
"automargin": true,
"showticklabels": true,
"title": null,
"tickformat": null,
"type": "-"
},
"yaxis": {
"automargin": true,
"title": null,
"tickformat": null,
"type": "linear",
"autorange": true,
"range": null
},
"hoverlabel": {
"namelength": -1
-}
+},
"hovermode": "x"
}
}
}


@@ -10,6 +10,7 @@ import { prepareCustomChartData, createCustomChartRenderer } from "./customChart
// @ts-expect-error ts-migrate(2339) FIXME: Property 'setPlotConfig' does not exist on type 't... Remove this comment to see the full error message
Plotly.setPlotConfig({
modeBarButtonsToRemove: ["sendDataToCloud"],
modeBarButtonsToAdd: ["togglespikelines", "v1hovermode"],
});
export {


@@ -1,10 +1,18 @@
import { isNil, extend, each, includes, map, sortBy, toString } from "lodash";
import chooseTextColorForBackground from "@/lib/chooseTextColorForBackground";
-import { ColorPaletteArray } from "@/visualizations/ColorPalette";
+import { AllColorPaletteArrays, ColorPaletteTypes } from "@/visualizations/ColorPalette";
import { cleanNumber, normalizeValue, getSeriesAxis } from "./utils";
-function getSeriesColor(seriesOptions: any, seriesIndex: any) {
+function getSeriesColor(options: any, seriesOptions: any, seriesIndex: any, numSeries: any) {
-return seriesOptions.color || ColorPaletteArray[seriesIndex % ColorPaletteArray.length];
+// @ts-expect-error ts-migrate(7053) FIXME: Element implicitly has an 'any' type because expre... Remove this comment to see the full error message
let palette = AllColorPaletteArrays[options.color_scheme];
// @ts-expect-error ts-migrate(7053) FIXME: Element implicitly has an 'any' type because expre... Remove this comment to see the full error message
if (ColorPaletteTypes[options.color_scheme] === 'continuous' && palette.length > numSeries) {
const step = (palette.length - 1) / (numSeries - 1 || 1);
const index = Math.round(step * seriesIndex);
return seriesOptions.color || palette[index % palette.length];
}
return seriesOptions.color || palette[seriesIndex % palette.length];
}
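The continuous-palette branch above spreads the series evenly across the whole palette instead of cycling from its start. A Python sketch of the same index arithmetic (hypothetical helper; rounding behavior may differ slightly from JavaScript's `Math.round`):

```python
def sample_palette(palette, num_series):
    # Spread num_series picks across a continuous palette, mirroring the diff:
    # step = (len(palette) - 1) / (num_series - 1 || 1), index = round(step * i).
    if num_series >= len(palette):
        # Not enough stops to spread: fall back to simple cycling,
        # as the discrete branch does.
        return [palette[i % len(palette)] for i in range(num_series)]
    step = (len(palette) - 1) / (num_series - 1 or 1)
    return [palette[round(step * i) % len(palette)] for i in range(num_series)]

# Five Viridis stops, three series: picks the first, middle, and last stop.
viridis = ["#440154", "#48186a", "#472d7b", "#424086", "#3b528b"]
print(sample_palette(viridis, 3))  # ['#440154', '#472d7b', '#3b528b']
```

Sampling endpoints-inclusive keeps adjacent series maximally separated in hue, which is the point of using a perceptually ordered palette like Viridis.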
function getHoverInfoPattern(options: any) {
@@ -71,11 +79,11 @@ function prepareBoxSeries(series: any, options: any, { seriesColor }: any) {
return series;
}
-function prepareSeries(series: any, options: any, additionalOptions: any) {
+function prepareSeries(series: any, options: any, numSeries: any, additionalOptions: any) {
const { hoverInfoPattern, index } = additionalOptions;
const seriesOptions = extend({ type: options.globalSeriesType, yAxis: 0 }, options.seriesOptions[series.name]);
-const seriesColor = getSeriesColor(seriesOptions, index);
+const seriesColor = getSeriesColor(options, seriesOptions, index, numSeries);
const seriesYAxis = getSeriesAxis(series, options);
// Sort by x - `Map` preserves order of items
@@ -91,8 +99,8 @@ function prepareSeries(series: any, options: any, additionalOptions: any) {
};
const sourceData = new Map();
const xValues: any[] = [];
-const labelsValuesMap = new Map();
+const yValues: any[] = [];
const yErrorValues: any = [];
each(data, row => {
@@ -100,27 +108,20 @@ function prepareSeries(series: any, options: any, additionalOptions: any) {
const y = cleanYValue(row.y, seriesYAxis === "y2" ? options.yAxis[1].type : options.yAxis[0].type); // depends on series type!
const yError = cleanNumber(row.yError); // always number
const size = cleanNumber(row.size); // always number
if (labelsValuesMap.has(x)) {
labelsValuesMap.set(x, labelsValuesMap.get(x) + y);
} else {
labelsValuesMap.set(x, y);
}
const aggregatedY = labelsValuesMap.get(x);
sourceData.set(x, {
x,
-y: aggregatedY,
+y,
yError,
size,
yPercent: null, // will be updated later
row,
});
xValues.push(x);
yValues.push(y);
yErrorValues.push(yError);
});
const xValues = Array.from(labelsValuesMap.keys());
const yValues = Array.from(labelsValuesMap.values());
const plotlySeries = {
visible: true,
hoverinfo: hoverInfoPattern,
@@ -166,6 +167,7 @@ export default function prepareDefaultData(seriesList: any, options: any) {
const additionalOptions = {
hoverInfoPattern: getHoverInfoPattern(options),
};
const numSeries = seriesList.length
-return map(seriesList, (series, index) => prepareSeries(series, options, { ...additionalOptions, index }));
+return map(seriesList, (series, index) => prepareSeries(series, options, numSeries, { ...additionalOptions, index }));
}
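The removed `labelsValuesMap` accumulated y-values keyed by x, so two points sharing an x-value collapsed into one summed point; the plain arrays keep every pair. A minimal illustration of the difference (hypothetical helper names, not Redash code):

```python
def broken_collapse(rows):
    # Old behavior: a mapping keyed by x sums y-values for repeated x,
    # so distinct points with the same x-value merge into one.
    acc = {}
    for row in rows:
        acc[row["x"]] = acc.get(row["x"], 0) + row["y"]
    return list(acc.keys()), list(acc.values())

def fixed_split(rows):
    # New behavior: plain lists preserve every (x, y) pair as-is.
    return [r["x"] for r in rows], [r["y"] for r in rows]

rows = [{"x": "a", "y": 1}, {"x": "a", "y": 2}, {"x": "b", "y": 3}]
print(broken_collapse(rows))  # (['a', 'b'], [3, 3])
print(fixed_split(rows))      # (['a', 'a', 'b'], [1, 2, 3])
```

The summed variant is why scatter, line, and bubble charts showed wrong y-values whenever x-values repeated.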


@@ -21,7 +21,7 @@ function prepareXAxis(axisOptions: any, additionalOptions: any) {
title: getAxisTitle(axisOptions),
type: getAxisScaleType(axisOptions),
automargin: true,
-tickformat: axisOptions.tickFormat,
+tickformat: axisOptions.tickFormat ?? null,
};
if (additionalOptions.sortX && axis.type === "category") {
@@ -49,7 +49,7 @@ function prepareYAxis(axisOptions: any) {
automargin: true,
autorange: true,
range: null,
-tickformat: axisOptions.tickFormat,
+tickformat: axisOptions.tickFormat ?? null,
};
}
@@ -109,7 +109,7 @@ function prepareBoxLayout(layout: any, options: any, data: any) {
}
export default function prepareLayout(element: any, options: any, data: any) {
-const layout = {
+const layout: any = {
margin: { l: 10, r: 10, b: 5, t: 20, pad: 4 },
// plot size should be at least 5x5px
width: Math.max(5, Math.floor(element.offsetWidth)),
@@ -124,6 +124,10 @@ export default function prepareLayout(element: any, options: any, data: any) {
},
};
if (["line", "area", "column"].includes(options.globalSeriesType)) {
layout.hovermode = options.swappedAxes ? 'y' : 'x';
}
switch (options.globalSeriesType) {
case "pie":
return preparePieLayout(layout, options, data);


@@ -1,7 +1,7 @@
import { isString, each, extend, includes, map, reduce } from "lodash";
import d3 from "d3";
import chooseTextColorForBackground from "@/lib/chooseTextColorForBackground";
-import { ColorPaletteArray } from "@/visualizations/ColorPalette";
+import { AllColorPaletteArrays, ColorPaletteTypes } from "@/visualizations/ColorPalette";
import { cleanNumber, normalizeValue } from "./utils";
@@ -35,7 +35,6 @@ function prepareSeries(series: any, options: any, additionalOptions: any) {
hoverInfoPattern,
getValueColor,
} = additionalOptions;
const seriesOptions = extend({ type: options.globalSeriesType, yAxis: 0 }, options.seriesOptions[series.name]);
const xPosition = (index % cellsInRow) * cellWidth;
@@ -102,17 +101,35 @@ function prepareSeries(series: any, options: any, additionalOptions: any) {
},
sourceData,
sort: options.piesort,
color_scheme: options.color_scheme,
};
}
export default function preparePieData(seriesList: any, options: any) {
-// we will use this to assign colors for values that have no explicitly set color
-// @ts-expect-error ts-migrate(2339) FIXME: Property 'scale' does not exist on type 'typeof im... Remove this comment to see the full error message
+// @ts-expect-error ts-migrate(7053) FIXME: Element implicitly has an 'any' type because expre... Remove this comment to see the full error message
+const palette = AllColorPaletteArrays[options.color_scheme];
const getDefaultColor = d3.scale
.ordinal()
.domain([])
.range(ColorPaletteArray);
const valuesColors = {};
let getDefaultColor : Function;
// @ts-expect-error ts-migrate(7053) FIXME: Element implicitly has an 'any' type because expre... Remove this comment to see the full error message
if (typeof(seriesList[0]) !== 'undefined' && ColorPaletteTypes[options.color_scheme] === 'continuous') {
const uniqueXValues =[... new Set(seriesList[0].data.map((d: any) => d.x))];
const step = (palette.length - 1) / (uniqueXValues.length - 1 || 1);
const colorIndices = d3.range(uniqueXValues.length).map(function(i) {
return Math.round(step * i);
});
// @ts-expect-error ts-migrate(2339) FIXME: Property 'scale' does not exist on type 'typeof im... Remove this comment to see the full error message
getDefaultColor = d3.scale.ordinal()
.domain(uniqueXValues) // Set domain as the unique x-values
.range(colorIndices.map(index => palette[index]));
} else {
// @ts-expect-error ts-migrate(2339) FIXME: Property 'scale' does not exist on type 'typeof im... Remove this comment to see the full error message
getDefaultColor = d3.scale
.ordinal()
.domain([])
.range(palette);
};
each(options.valuesOptions, (item, key) => { each(options.valuesOptions, (item, key) => {
if (isString(item.color) && item.color !== "") { if (isString(item.color) && item.color !== "") {
// @ts-expect-error ts-migrate(7053) FIXME: Element implicitly has an 'any' type because expre... Remove this comment to see the full error message // @ts-expect-error ts-migrate(7053) FIXME: Element implicitly has an 'any' type because expre... Remove this comment to see the full error message
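For context, the even-spacing math the new `continuous` branch relies on can be sketched without d3. This is an illustrative stand-alone version, not a Redash API; `pickEvenlySpaced`, `values`, and `palette` are hypothetical names:

```typescript
// Minimal sketch: spread a discrete palette evenly across n distinct values.
// Same arithmetic as the hunk above, including the `|| 1` guard for a single value.
function pickEvenlySpaced(values: string[], palette: string[]): Record<string, string> {
  const step = (palette.length - 1) / (values.length - 1 || 1);
  const result: Record<string, string> = {};
  values.forEach((value, i) => {
    // Round to the nearest palette index so the first and last values
    // land exactly on the palette's endpoints.
    result[value] = palette[Math.round(step * i)];
  });
  return result;
}
```

With three values and a five-color palette, the computed indices are 0, 2, and 4, so the middle value takes the middle color.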


@@ -1,51 +1,28 @@
 import { map } from "lodash";
-import React, { useState } from "react";
+import React from "react";
 import { Section, Select } from "@/components/visualizations/editor";
 import { EditorPropTypes } from "@/visualizations/prop-types";
 
 const ALLOWED_ITEM_PER_PAGE = [5, 10, 15, 20, 25, 50, 100, 150, 200, 250, 500];
-const ALLOWED_COLS_TO_FIX = [0, 1, 2, 3, 4]
 
 export default function GridSettings({ options, onOptionsChange }: any) {
-  const numCols = options.columns.length;
-  const maxColsToFix = Math.min(4, numCols - 1);
   return (
-    <React.Fragment>
-      {/* @ts-expect-error ts-migrate(2745) FIXME: This JSX tag's 'children' prop expects type 'never' but its value is 'Element'. */}
-      <Section>
-        <Select
-          label="Items per page"
-          data-test="Table.ItemsPerPage"
-          defaultValue={options.itemsPerPage}
-          onChange={(itemsPerPage: any) => onOptionsChange({ itemsPerPage })}>
-          {map(ALLOWED_ITEM_PER_PAGE, value => (
-            // @ts-expect-error ts-migrate(2339) FIXME: Property 'Option' does not exist on type '({ class... Remove this comment to see the full error message
-            <Select.Option key={`ipp${value}`} value={value} data-test={`Table.ItemsPerPage.${value}`}>
-              {value}
-              {/* @ts-expect-error ts-migrate(2339) FIXME: Property 'Option' does not exist on type '({ class... Remove this comment to see the full error message */}
-            </Select.Option>
-          ))}
-        </Select>
-      </Section>
-      {/* @ts-expect-error ts-migrate(2745) FIXME: This JSX tag's 'children' prop expects type 'never' but its value is 'Element'. */}
-      <Section>
-        <Select
-          label="Number of Columns to Fix in Place"
-          data-test="FixedColumns"
-          defaultValue={options.fixedColumns}
-          onChange={(fixedColumns: number) => {onOptionsChange({ fixedColumns })}}>
-          {map(ALLOWED_COLS_TO_FIX.slice(0, maxColsToFix + 1), value => (
-            // @ts-expect-error ts-migrate(2339) FIXME: Property 'Option' does not exist on type '({ class... Remove this comment to see the full error message
-            <Select.Option key={`fc${value}`} value={value}>
-              {value}
-              {/* @ts-expect-error ts-migrate(2339) FIXME: Property 'Option' does not exist on type '({ class... Remove this comment to see the full error message */}
-            </Select.Option>
-          ))}
-        </Select>
-      </Section>
-    </React.Fragment>
+    // @ts-expect-error ts-migrate(2745) FIXME: This JSX tag's 'children' prop expects type 'never... Remove this comment to see the full error message
+    <Section>
+      <Select
+        label="Items per page"
+        data-test="Table.ItemsPerPage"
+        defaultValue={options.itemsPerPage}
+        onChange={(itemsPerPage: any) => onOptionsChange({ itemsPerPage })}>
+        {map(ALLOWED_ITEM_PER_PAGE, value => (
+          // @ts-expect-error ts-migrate(2339) FIXME: Property 'Option' does not exist on type '({ class... Remove this comment to see the full error message
+          <Select.Option key={`ipp${value}`} value={value} data-test={`Table.ItemsPerPage.${value}`}>
+            {value}
+            {/* @ts-expect-error ts-migrate(2339) FIXME: Property 'Option' does not exist on type '({ class... Remove this comment to see the full error message */}
+          </Select.Option>
+        ))}
+      </Select>
+    </Section>
   );
 }
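The editor section removed above clamped the selectable fixed-column counts so at least one column always stays scrollable. The clamp, isolated as a sketch (`selectableFixedCounts` is an illustrative name, not part of the codebase):

```typescript
const ALLOWED_COLS_TO_FIX = [0, 1, 2, 3, 4];

// Offer at most `numCols - 1` fixed columns, capped at 4, mirroring
// `maxColsToFix = Math.min(4, numCols - 1)` from the removed code.
function selectableFixedCounts(numCols: number): number[] {
  const maxColsToFix = Math.min(4, numCols - 1);
  return ALLOWED_COLS_TO_FIX.slice(0, maxColsToFix + 1);
}
```

A three-column table therefore offers 0, 1, or 2 fixed columns; a one-column table offers only 0.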


@@ -5,7 +5,7 @@ Object {
   "columns": Array [
     Object {
       "alignContent": "right",
-      "allowHTML": true,
+      "allowHTML": false,
       "allowSearch": false,
       "booleanValues": Array [
         "false",
@@ -13,7 +13,6 @@ Object {
       ],
       "dateTimeFormat": undefined,
       "displayAs": "string",
-      "fixed": false,
       "highlightLinks": false,
       "imageHeight": "",
       "imageTitleTemplate": "{{ @ }}",
@@ -39,7 +38,7 @@ Object {
   "columns": Array [
     Object {
       "alignContent": "left",
-      "allowHTML": true,
+      "allowHTML": false,
       "allowSearch": false,
       "booleanValues": Array [
         "false",
@@ -47,7 +46,6 @@ Object {
       ],
       "dateTimeFormat": undefined,
       "displayAs": "number",
-      "fixed": false,
       "highlightLinks": false,
       "imageHeight": "",
       "imageTitleTemplate": "{{ @ }}",
@@ -73,7 +71,7 @@ Object {
   "columns": Array [
     Object {
       "alignContent": "left",
-      "allowHTML": true,
+      "allowHTML": false,
       "allowSearch": false,
       "booleanValues": Array [
         "false",
@@ -81,7 +79,6 @@ Object {
       ],
       "dateTimeFormat": undefined,
       "displayAs": "string",
-      "fixed": false,
       "highlightLinks": false,
       "imageHeight": "",
       "imageTitleTemplate": "{{ @ }}",
@@ -107,7 +104,7 @@ Object {
   "columns": Array [
     Object {
       "alignContent": "left",
-      "allowHTML": true,
+      "allowHTML": false,
       "allowSearch": true,
       "booleanValues": Array [
         "false",
@@ -115,7 +112,6 @@ Object {
      ],
       "dateTimeFormat": undefined,
       "displayAs": "string",
-      "fixed": false,
       "highlightLinks": false,
       "imageHeight": "",
       "imageTitleTemplate": "{{ @ }}",
@@ -141,7 +137,7 @@ Object {
   "columns": Array [
     Object {
       "alignContent": "left",
-      "allowHTML": true,
+      "allowHTML": false,
       "allowSearch": false,
       "booleanValues": Array [
         "false",
@@ -149,7 +145,6 @@ Object {
       ],
       "dateTimeFormat": undefined,
       "displayAs": "string",
-      "fixed": false,
       "highlightLinks": false,
       "imageHeight": "",
       "imageTitleTemplate": "{{ @ }}",


@@ -84,13 +84,6 @@ export default function Renderer({ options, data }: any) {
   const [searchTerm, setSearchTerm] = useState("");
   const [orderBy, setOrderBy] = useState([]);
 
-  const columnsToFix = new Set<string>();
-  for (let i = 0; i < options.fixedColumns; i++) {
-    if (options.columns[i]) {
-      columnsToFix.add(options.columns[i].name);
-    }
-  }
-
   const searchColumns = useMemo(() => filter(options.columns, "allowSearch"), [options.columns]);
 
   const tableColumns = useMemo(() => {
@@ -104,7 +97,7 @@ export default function Renderer({ options, data }: any) {
       // Remove text selection - may occur accidentally
       // @ts-expect-error ts-migrate(2531) FIXME: Object is possibly 'null'.
       document.getSelection().removeAllRanges();
-    }, columnsToFix);
+    });
   }, [options.columns, searchColumns, orderBy]);
 
   const preparedRows = useMemo(() => sortRows(filterRows(initRows(data.rows), searchTerm, searchColumns), orderBy), [
@@ -141,7 +134,6 @@ export default function Renderer({ options, data }: any) {
           showSizeChanger: false,
         }}
         showSorterTooltip={false}
-        scroll = {{x : 'max-content'}}
       />
     </div>
   );
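The deleted `columnsToFix` loop collected the names of the first `options.fixedColumns` columns. The same idea as a standalone sketch (the function name and the minimal column type are assumptions for illustration):

```typescript
// Collect the names of the first `fixedColumns` columns, skipping any holes,
// as the removed Renderer code did before passing the set to prepareColumns.
function collectFixedColumnNames(columns: Array<{ name: string }>, fixedColumns: number): Set<string> {
  const columnsToFix = new Set<string>();
  for (let i = 0; i < fixedColumns; i++) {
    if (columns[i]) {
      columnsToFix.add(columns[i].name);
    }
  }
  return columnsToFix;
}
```

Requesting more fixed columns than exist simply yields all column names, since the `columns[i]` check skips out-of-range indices.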


@@ -4,7 +4,6 @@ import { visualizationsSettings } from "@/visualizations/visualizationsSettings"
 const DEFAULT_OPTIONS = {
   itemsPerPage: 25,
   paginationSize: "default", // not editable through Editor
-  fixedColumns: 0,
 };
 
 const filterTypes = ["filter", "multi-filter", "multiFilter"];
@@ -55,9 +54,8 @@ function getDefaultColumnsOptions(columns: any) {
     allowSearch: false,
     alignContent: getColumnContentAlignment(col.type),
     // `string` cell options
-    allowHTML: true,
+    allowHTML: false,
     highlightLinks: false,
-    fixed: false,
   }));
 }


@@ -21,6 +21,7 @@
       left: 0;
       top: 0;
       border-top: 0;
+      z-index: 1;
       background: #fafafa !important;
     }
   }
@@ -156,11 +157,3 @@
     color: @text-color-secondary;
   }
 }
-
-.ant-table-cell-fix-left {
-  background-color: #fff !important;
-}
-
-.ant-table-tbody > tr.ant-table-row:hover > .ant-table-cell-fix-left {
-  background-color: rgb(248, 249, 250) !important;
-}


@@ -50,7 +50,7 @@ function getOrderByInfo(orderBy: any) {
   return result;
 }
 
-export function prepareColumns(columns: any, searchInput: any, orderBy: any, onOrderByChange: any, columnsToFix: Set<string>) {
+export function prepareColumns(columns: any, searchInput: any, orderBy: any, onOrderByChange: any) {
   columns = filter(columns, "visible");
   columns = sortBy(columns, "order");
@@ -96,7 +96,6 @@ export function prepareColumns(columns: any, searchInput: any, orderBy: any, onO
       }),
       onClick: (event: any) => onOrderByChange(toggleOrderBy(column.name, orderBy, event.shiftKey)),
     }),
-    fixed: columnsToFix.has(column.name) ? 'left' : false
   };
 
   // @ts-expect-error ts-migrate(7053) FIXME: Element implicitly has an 'any' type because expre... Remove this comment to see the full error message

File diff suppressed because it is too large.

yarn.lock (2131 changes): file diff suppressed because it is too large.