Compare commits

...

126 Commits

Author SHA1 Message Date
dependabot[bot]
af71e0ec13 Bump serialize-javascript from 6.0.1 to 6.0.2 in /viz-lib
Bumps [serialize-javascript](https://github.com/yahoo/serialize-javascript) from 6.0.1 to 6.0.2.
- [Release notes](https://github.com/yahoo/serialize-javascript/releases)
- [Commits](https://github.com/yahoo/serialize-javascript/compare/v6.0.1...v6.0.2)

---
updated-dependencies:
- dependency-name: serialize-javascript
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-05 14:32:00 +00:00
Tsuneo Yoshioka
594e2f24ef Upgrade plotly.js to version 2 to fix the UI crashing issue (#7359)
* Upgrade plotly.js to version 2

* Fix styling error reported by styled
2025-03-05 14:30:28 +00:00
github-actions[bot]
3275a9e459 Snapshot: 25.03.0-dev 2025-03-01 00:35:44 +00:00
Shunki
3bad8c8e8c TiDB: Exclude INFORMATION_SCHEMA (#7352)
Co-authored-by: snickerjp <snickerjp@gmail.com>
2025-02-28 11:09:46 +09:00
Tsuneo Yoshioka
d0af4499d6 Sanitize NaN, Infinite, -Infinite causing error when saving as PostgreSQL JSON #7339 (2nd try) (#7348)
* Sanitize NaN, Infinite, -Infinite causing error when saving as PostgreSQL JSON #7339 (2nd try)

* Move JSON sanitization to the top of json_dumps

* Fix comment
2025-02-27 01:40:43 -08:00
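A minimal Python sketch of the sanitization idea described in the commit above, assuming a recursive helper (the name and structure are illustrative, not Redash's actual code):

```python
import json
import math


def sanitize(value):
    """Illustrative helper: replace NaN/Infinity floats with None so the
    serialized result is valid JSON for PostgreSQL json/jsonb columns."""
    if isinstance(value, float) and (math.isnan(value) or math.isinf(value)):
        return None
    if isinstance(value, dict):
        return {k: sanitize(v) for k, v in value.items()}
    if isinstance(value, list):
        return [sanitize(v) for v in value]
    return value


# json.dumps(float("nan")) emits the non-standard token NaN, which PostgreSQL
# rejects; sanitizing before dumping keeps the payload strictly standard JSON.
payload = json.dumps(sanitize({"rows": [{"x": float("nan"), "y": 1.0}]}))
```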
Ran Benita
4357ea56ae Fix UnboundLocalError when checking alerts for query (#7346)
This fixes the following exception:

```
UnboundLocalError: local variable 'value_is_number' referenced before assignment
  File "rq/worker.py", line 1431, in perform_job
    rv = job.perform()
  File "rq/job.py", line 1280, in perform
    self._result = self._execute()
  File "rq/job.py", line 1317, in _execute
    result = self.func(*self.args, **self.kwargs)
  File "redash/tasks/alerts.py", line 36, in check_alerts_for_query
    new_state = alert.evaluate()
  File "redash/models/__init__.py", line 1002, in evaluate
    new_state = next_state(op, value, threshold)
  File "redash/models/__init__.py", line 928, in next_state
    elif not value_is_number and op not in [OPERATORS.get("!="), OPERATORS.get("=="), OPERATORS.get("equals")]:
```
2025-02-25 09:15:20 -05:00
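The error above comes from a variable that is only bound on some code paths; a minimal sketch of the pattern and the usual fix (illustrative, not the exact Redash code):

```python
def evaluate(raw_value, threshold):
    # Buggy shape: value_is_number was only assigned when the conversion
    # succeeded, so the later check could raise UnboundLocalError.
    # Fix: bind it unconditionally before branching.
    value_is_number = False
    try:
        value = float(raw_value)
        value_is_number = True
    except (TypeError, ValueError):
        value = raw_value
    if not value_is_number:
        return "unknown"
    return "triggered" if value > threshold else "ok"
```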
Tsuneo Yoshioka
5df5ca87a2 add NULLS LAST option for Query order (#7341) 2025-02-25 10:58:48 +08:00
Tsuneo Yoshioka
8387fe6fcb Fix the issue that chart(scatter, line, bubble...) having same x-value have wrong y-value (#7330) 2025-02-18 20:04:12 +00:00
snickerjp
e95de2ee4c Update oracledb package to version 2.5.1 and adjust Python version compatibility (#7316) 2025-02-18 23:00:09 +09:00
Lee2532
71902e5933 FIX : redash docker image TAG (#7280)
Co-authored-by: snickerjp <snickerjp@gmail.com>
2025-02-15 01:38:23 +09:00
Tsuneo Yoshioka
53eab14cef Make autocomplete always available (#7326) 2025-02-13 15:25:39 -05:00
Eric Radman
925bb91d8e Use absolute path for image resources (#7322)
When MULTI_ORG is enabled, 'static/' resolves to '<org>/static/'
2025-02-12 08:37:40 -05:00
Tsuneo Yoshioka
ec2ca6f986 BigQuery: show column type on Schema Browser (#7257) 2025-02-05 18:25:39 +00:00
Matt Nelson
96ea0194e8 Fix errors in webex alert destination. Add formatting support for QUERY_RESULT_TABLE. (#7296)
* prevent text values in payload being detected as 'set' on send.
Webex send ERROR:: Object of type set is not JSON serializable

Signed-off-by: Matt Nelson <metheos@gmail.com>

* add support for formatted QUERY_RESULT_TABLE in webex card

Signed-off-by: Matt Nelson <metheos@gmail.com>

* don't try to send to blank destinations

Signed-off-by: Matt Nelson <metheos@gmail.com>

* fix handling of the encoded QUERY_RESULTS_TABLE text

Signed-off-by: Matt Nelson <metheos@gmail.com>

* re-sort imports for ruff

Signed-off-by: Matt Nelson <metheos@gmail.com>

* change formatter to black

Signed-off-by: Matt Nelson <metheos@gmail.com>

* Add additional tests for Webex notification handling

ensure blank entries are handled for room IDs and person emails.
ensure that the API is not called when no valid destinations are provided.
ensure proper attachment formatting for alerts containing 2D arrays.

Signed-off-by: Matt Nelson <metheos@gmail.com>

* Add test for Webex notification with 1D array handling

This commit introduces a new test case to verify that the Webex
notification function correctly handles a 1D array input in the alert body.
The test ensures that the expected payload is constructed properly and that
the requests.post method is called with the correct parameters.

Signed-off-by: Matt Nelson <metheos@gmail.com>

---------

Signed-off-by: Matt Nelson <metheos@gmail.com>
2025-02-04 11:05:13 +00:00
github-actions[bot]
2776992101 Snapshot: 25.02.0-dev 2025-02-01 00:33:52 +00:00
Arik Fraimovich
85f001982e GitHub Actions Workflow updates (#7298)
* Split out secrets requiring workflows

* Update target

* Update Cypress run command
2025-01-31 10:20:04 +02:00
Motoi Washida
d03a2c4096 Fix error in rehash DB migration with Elasticsearch queries (#7292)
Fixes #7272
2025-01-22 21:19:59 -05:00
SeongTae Jeong
8c5890482a Use ARM64 runners instead of virtualization for ARM64 image builds (#7291) 2025-01-19 16:00:19 +10:00
Ezra Odio
10ce280a96 Default to not allow HTML content in tables (#7064)
Co-authored-by: Ezra Odio <eodio@starfishstorage.com>
2025-01-15 10:09:24 -05:00
dependabot[bot]
0dd7ac3d2e Bump virtualenv from 20.25.0 to 20.26.6 (#7276) 2025-01-14 01:45:58 +00:00
github-actions[bot]
4ee53a9445 Snapshot: 25.01.0-dev 2025-01-01 00:35:12 +00:00
SeongTae Jeong
c08292d90e Use Codecov token (#7265) 2024-12-30 21:06:09 +00:00
SeongTae Jeong
3142131cdd Bump actions/upload-artifact from v3 to v4 (#7266)
Related: https://github.blog/changelog/2024-04-16-deprecation-notice-v3-of-the-artifact-actions/
2024-12-30 15:31:03 -05:00
Daisuke Taniwaki
530c1a0734 Support result reuse in Athena data sources (#7202)
* Support result reuse

* Update pyathena to 2.25.2

* Separate options

* Regenerate the Poetry lock file

---------

Co-authored-by: SeongTae Jeong <seongtaejg@gmail.com>
2024-12-28 05:50:16 +09:00
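Athena's query result reuse returns a cached result set for repeated identical queries instead of re-scanning data. A rough PyAthena sketch; the keyword argument names below are assumptions based on PyAthena's documented result-reuse support, not taken from this PR:

```python
from pyathena import connect

# Assumed keyword names for PyAthena's result-reuse support; verify against
# the installed PyAthena version's documentation before relying on them.
conn = connect(
    s3_staging_dir="s3://example-bucket/athena-results/",  # hypothetical bucket
    region_name="us-east-1",
    result_reuse_enable=True,
    result_reuse_minutes=60,
)
rows = conn.cursor().execute("SELECT 1").fetchall()
```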
dependabot[bot]
52dc1769a1 Bump jinja2 from 3.1.4 to 3.1.5 (#7262)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-27 13:55:21 +10:00
Eric Radman
b9583c0b48 Create workflow trigger for publishing release image (#7259)
Co-authored-by: Justin Clift <justin@postgresql.org>
2024-12-27 12:19:32 +10:00
Arik Fraimovich
89d7f54e90 Handle the case when query runner configuration is an empty dict. (#7258) 2024-12-24 09:42:39 -05:00
Tsuneo Yoshioka
d884da2b0b BigQuery: add date, datetime type mapping (#7252) 2024-12-18 14:24:45 +02:00
dependabot[bot]
f7d485082c Bump nanoid from 3.3.6 to 3.3.8 (#7249)
Bumps [nanoid](https://github.com/ai/nanoid) from 3.3.6 to 3.3.8.
- [Release notes](https://github.com/ai/nanoid/releases)
- [Changelog](https://github.com/ai/nanoid/blob/main/CHANGELOG.md)
- [Commits](https://github.com/ai/nanoid/compare/3.3.6...3.3.8)

---
updated-dependencies:
- dependency-name: nanoid
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-13 17:57:05 +09:00
Eric Radman
130ab1fe1a Update to paramiko-3.4.1 (#7240)
Solves the deprecation warning for TripleDES
Related: https://github.com/paramiko/paramiko/issues/2419
2024-12-07 11:23:45 +09:00
github-actions[bot]
2ff83679fe Snapshot: 24.12.0-dev 2024-12-01 00:40:40 +00:00
Eric Radman
de49b73855 Replace ptvsd with debugpy to match modern VS Code (#7234) 2024-11-27 08:19:05 +10:00
thiagogds
c12e68f5d1 Only evaluate the next state if there's a value (#7222)
I've experienced this on my Redash instance in production. I'm not sure what can cause the value to exist but be None; I guess it depends on the SQL query.

I followed the same idea of returning self.UNKNOWN_STATE for cases where we can't know what's happening.
2024-11-26 12:57:34 -05:00
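A minimal sketch of the guard described above, assuming a simplified model (the real method lives on Redash's Alert model and handles more operators and states):

```python
UNKNOWN_STATE = "unknown"


class Alert:
    def __init__(self, value, threshold):
        self.value = value
        self.threshold = threshold

    def evaluate(self):
        # Bail out to UNKNOWN_STATE when the query returned a row but the
        # selected column's value is None, instead of comparing None > x.
        if self.value is None:
            return UNKNOWN_STATE
        return "triggered" if self.value > self.threshold else "ok"


assert Alert(None, 10).evaluate() == UNKNOWN_STATE
```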
Eric Radman
baa9bbd505 Use head.sha for restyled checkout (#7227) 2024-11-22 10:34:16 +10:00
Arik Fraimovich
349cd5d031 Bring back version check & beacon reporting (#7211)
Co-authored-by: Restyled.io <commits@restyled.io>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-11-06 01:21:03 +00:00
github-actions[bot]
49277d27f8 Snapshot: 24.11.0-dev 2024-11-01 00:35:04 +00:00
Yeger
2aae5705c9 don't crash when there is no data (#7208)
* don't crash when there is no data

* Add test
2024-10-31 08:49:57 +00:00
dependabot[bot]
38d0579660 Bump elliptic from 6.5.7 to 6.6.0 (#7214)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-31 02:41:33 +00:00
Ezra Odio
673ba769c7 Fix issue with scheduled queries (#7111)
Co-authored-by: Ezra Odio <eodio@starfishstorage.com>
Co-authored-by: Arik Fraimovich <arik@arikfr.com>
2024-10-29 10:36:05 +00:00
Eric Radman
b922730482 Docker build: use heredoc for multi-line actions (#7210) 2024-10-29 10:23:15 +10:00
Arik Fraimovich
ba973eb1fe Fixes #6767: correctly rehash queries in a migration (#7184) 2024-10-25 01:00:29 +00:00
dependabot[bot]
d8dde6c544 Bump cryptography from 42.0.8 to 43.0.1 (#7205)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-25 00:27:23 +00:00
dependabot[bot]
d359a716a7 Bump http-proxy-middleware from 2.0.6 to 2.0.7 (#7204)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-24 23:56:09 +00:00
dependabot[bot]
ba4293912b Bump snowflake-connector-python from 3.12.0 to 3.12.3 (#7203)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-24 23:23:28 +00:00
thiagogds
ee359120ee Use correct redis connection (#7077) 2024-10-24 17:54:09 +10:00
thiagogds
04a25f4327 Fix RQ wrongly moving jobs to FailedJobRegistry (#7186)
Something changed in python-rq, and the old code behaved in a way that if a job ran for longer than 2 minutes it would automatically be marked as failed, even though it kept running.

This caused a problem in the UI, which showed the job as stopped when it actually hadn't.
2024-10-17 13:30:02 -04:00
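One knob involved in this behaviour is RQ's per-job timeout; a minimal sketch of enqueueing with an explicit job_timeout, shown to illustrate the mechanism rather than the actual change in this PR:

```python
from redis import Redis
from rq import Queue


def run_query(query_id):
    """Hypothetical long-running task."""


queue = Queue("queries", connection=Redis())
# An explicit job_timeout keeps RQ from declaring a long-running job failed
# while the worker subprocess is still executing it.
job = queue.enqueue(run_query, 42, job_timeout=3600, result_ttl=86400)
```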
Eric Radman
7c22756e66 Move restyled to a github action (#7191) 2024-10-16 09:45:25 +03:00
dependabot[bot]
a03668f5b2 Bump restrictedpython from 6.2 to 7.3 (#7181)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-01 13:22:44 +10:00
github-actions[bot]
e4a841a0c5 Snapshot: 24.10.0-dev 2024-10-01 00:34:37 +00:00
Zach Liu
38dc31a49b Get rid of the strange looking 0 following "Running..." and "runtime" (#7099)
* Snapshot: 24.08.0-dev

* no more Running...0 or runtime0

* also missing a space

* Restyled by prettier

* check if data_scanned is defined

otherwise we could get "Data Scanned ?" if it's not supported
by some data sources

---------

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: Restyled.io <commits@restyled.io>
2024-09-19 13:25:05 +03:00
Justin Clift
c42b15125c Automatically remove orphans when running make up (#7164) 2024-09-17 05:11:51 +00:00
dependabot[bot]
590d39bc8d Bump dompurify from 2.0.17 to 2.5.4 in /viz-lib (#7163)
Bumps [dompurify](https://github.com/cure53/DOMPurify) from 2.0.17 to 2.5.4.
- [Release notes](https://github.com/cure53/DOMPurify/releases)
- [Commits](https://github.com/cure53/DOMPurify/compare/2.0.17...2.5.4)

---
updated-dependencies:
- dependency-name: dompurify
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-17 13:54:32 +10:00
Justin Clift
79bbb248bb Update make up to automatically initialise the db (#7161)
It does this by (very quickly) checking if the organization table
is present, running `make create_database` if not.
2024-09-14 16:29:04 +08:00
Zach Liu
5cf0b7b038 Better error msg for token validation (#7159)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-09-14 12:03:20 +10:00
Justin Clift
fb1a056561 Add REDASH_HOST to the docker compose file (#7157)
This ensures emails generated in the development environment have the port number included in their URLs.
2024-09-12 10:06:52 +00:00
dependabot[bot]
75e1ce4c9c Bump body-parser from 1.20.1 to 1.20.3 (#7156)
Bumps [body-parser](https://github.com/expressjs/body-parser) from 1.20.1 to 1.20.3.
- [Release notes](https://github.com/expressjs/body-parser/releases)
- [Changelog](https://github.com/expressjs/body-parser/blob/master/HISTORY.md)
- [Commits](https://github.com/expressjs/body-parser/compare/1.20.1...1.20.3)

---
updated-dependencies:
- dependency-name: body-parser
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-12 15:33:59 +10:00
dependabot[bot]
d6c6e3bb7a Bump express from 4.19.2 to 4.21.0 (#7155)
Bumps [express](https://github.com/expressjs/express) from 4.19.2 to 4.21.0.
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/4.21.0/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.19.2...4.21.0)

---
updated-dependencies:
- dependency-name: express
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-12 14:46:25 +10:00
dependabot[bot]
821c1a9488 Bump path-to-regexp from 3.2.0 to 3.3.0 (#7154)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-12 12:42:27 +10:00
Zach Liu
76eeea1f64 Make schema refresh timeout configurable via env var (#7114)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-09-09 01:40:35 +00:00
Justin Clift
2ab07f9fc3 Remove left over compose.base.yaml file (#7142) 2024-09-06 17:47:56 +10:00
Justin Clift
a85b9d7801 Update pymssql to fix some problems with macOS ARM64 (2.3.1) (#7140)
Related: https://github.com/pymssql/pymssql/blob/master/ChangeLog.rst
2024-09-04 17:27:02 +09:00
github-actions[bot]
3330815081 Snapshot: 24.09.0-dev 2024-09-01 00:35:07 +00:00
dependabot[bot]
c25c65bc04 Bump webpack from 5.88.2 to 5.94.0 in /viz-lib (#7135)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-31 13:48:45 +10:00
Eric Radman
79a4c4c9c9 Revert "Adding ability to fix table columns in place (#7019)" (#7131) 2024-08-26 22:57:47 +10:00
Justin Clift
58a7438cc8 Bump python-rapidjson to 1.20 (#7126) 2024-08-20 08:35:54 +00:00
Zach Liu
c073c1e154 Fix mismatched poetry version (#7122)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-08-16 16:01:53 +10:00
Justin Clift
159a329e26 Bump elliptic to version 6.5.7 to fix a Dependabot warning (#7120) 2024-08-14 14:11:38 +10:00
Ezra Odio
9de135c0bd Add option to choose color scheme for charts (#7062) 2024-08-08 13:08:49 -04:00
Zach Liu
285c2b6e56 Add data type to athena query runner (#7112)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-08-07 03:36:58 +00:00
dependabot[bot]
b1fe2d4162 Bump sentry-sdk from 1.28.1 to 2.8.0 (#7069)
The Dependabot alert for sentry-sdk says that the security fix has
been backported to the 1.x series as well, in version 1.45.1.

So, let's use that, as it should be more compatible than jumping to
a new major series version.
2024-08-06 10:05:21 +10:00
Zach Liu
a4f92a8fb5 Add data type to redshift query runner (#7109)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-08-06 08:43:13 +10:00
Ezra Odio
51ef625a30 Fix alert evaluation logic and issue with calculating min and max of columns without numbers (#7103) 2024-08-05 15:20:26 +00:00
Masayuki Takahashi
a2611b89a3 Fix a display order bug in MongoDB Query Runner (#7106) 2024-08-04 04:22:59 +10:00
SeongTae Jeong
a531597016 Add the option to take new custom version for Snapshot (#7096) 2024-08-02 06:08:16 +00:00
Justin Clift
e59c02f497 Bump bootstrap to 3.4.1
Related:
- https://blog.getbootstrap.com/2018/12/13/bootstrap-3-4-0/
- https://blog.getbootstrap.com/2019/02/13/bootstrap-4-3-1-and-3-4-1/
2024-08-02 13:37:17 +09:00
github-actions[bot]
c1a60bf6d2 Snapshot: 24.08.1-dev 2024-08-02 02:49:08 +00:00
Ariel Richtman
72203655ec update rds trust (#7100) 2024-08-02 11:09:11 +10:00
Eric Radman
5257e39282 Revert "Removed unused configuration class (#6682)" (#7071) 2024-08-01 23:08:09 +00:00
Justin Clift
ec70ff4408 Bump cryptography to 42.0.x & snowflake-connector-python to 3.12.0 (#7097) 2024-08-02 08:35:36 +10:00
Masayuki Takahashi
ed8c05f634 Fix columns duplication on MongoDB Query Runner #6640 (#6641)
Co-authored-by: Konstantin Smirnov <46676677+konnectr@users.noreply.github.com>
2024-08-01 22:34:50 +00:00
Zach Liu
86b75db82e get data size in memory for better logs (#7090) 2024-08-01 12:17:54 -04:00
Ezra Odio
660d04b0f1 Adding Evaluate button for alerts to test them (#7032) 2024-08-01 15:02:12 +00:00
Ezra Odio
fc1e1f7a01 Add min/max/first selector for alerts (#7076) 2024-08-01 10:30:57 -04:00
SeongTae Jeong
8725fa4737 Add support for 'linux/arm64' platforms (#7094)
Co-authored-by: Justin Clift <justin@postgresql.org>
2024-08-01 08:52:45 +00:00
Justin Clift
ea0b3cbe3a Add the asdf .tool-versions file to .gitignore (#7095) 2024-08-01 08:19:11 +00:00
Justin Clift
714b950fde Match FROM and AS capitalisation in Dockerfile (#7093) 2024-08-01 17:48:57 +10:00
github-actions[bot]
a9c9f085af Snapshot: 24.08.0-dev 2024-08-01 00:30:32 +00:00
Ezra Odio
a69f7fb2fe Add new text pattern parameter (#7025)
Co-authored-by: Ezra Odio <eodio@starfishstorage.com>
Co-authored-by: Restyled.io <commits@restyled.io>
2024-07-24 13:20:33 -04:00
Daisuke Taniwaki
c244e75352 Support Arbitrary Catalog IDs on Athena Data Source (#7059)
Co-authored-by: SeongTae Jeong <seongtaejg@gmail.com>
2024-07-24 16:57:27 +10:00
Ezra Odio
80f7ba1b91 Added option to toggle sort on pie charts (#7055)
Co-authored-by: Ezra Odio <eodio@starfishstorage.com>
Co-authored-by: Eric Radman <eradman@starfishstorage.com>
2024-07-22 15:13:00 +00:00
SeongTae Jeong
d2745e5acc Add a label for Restyler's PR and Bump component version (#7037) 2024-07-18 22:00:29 +00:00
Ezra Odio
4114227471 Remove defaults set during schema upgrade/downgrade (#7068) 2024-07-18 16:05:34 -04:00
Ezra Odio
8fc4ce1494 Conditionally render tooltip for Edit alert button (#7054)
* Made Edit alert tooltip render conditionally
2024-07-18 14:31:31 +00:00
Ezra Odio
ebb0e2c9ad Adding ability to fix table columns in place (#7019)
This change involved adding an extra option to the GridSettings editor,
adding the "fixed" option to columns, and adding styling for the fixed
columns. In order to change the number of fixed columns, which will
default to 0, one has to go to Edit visualization -> Grid -> Choose
number of columns to fix -> Save.
2024-07-17 13:59:47 +00:00
dependabot[bot]
57a79bc96b Bump setuptools from 69.0.3 to 70.0.0 (#7060)
Bumps [setuptools](https://github.com/pypa/setuptools) from 69.0.3 to 70.0.0.
- [Release notes](https://github.com/pypa/setuptools/releases)
- [Changelog](https://github.com/pypa/setuptools/blob/main/NEWS.rst)
- [Commits](https://github.com/pypa/setuptools/compare/v69.0.3...v70.0.0)

---
updated-dependencies:
- dependency-name: setuptools
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-16 05:45:28 +10:00
Justin Clift
77f108dd09 Bump requests to 2.32.3 (#7057) 2024-07-15 09:39:14 +09:00
dependabot[bot]
dd1a9b96da Bump zipp from 3.17.0 to 3.19.1 (#7051)
Bumps [zipp](https://github.com/jaraco/zipp) from 3.17.0 to 3.19.1.
- [Release notes](https://github.com/jaraco/zipp/releases)
- [Changelog](https://github.com/jaraco/zipp/blob/main/NEWS.rst)
- [Commits](https://github.com/jaraco/zipp/compare/v3.17.0...v3.19.1)

---
updated-dependencies:
- dependency-name: zipp
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Justin Clift <justin@postgresql.org>
2024-07-12 09:25:12 +00:00
Ezra Odio
d9282b2688 Add usedforsecurity=False flag to md5 hashes (#7049)
Co-authored-by: Ezra Odio <eodio@starfishstorage.com>
Co-authored-by: Justin Clift <justin@postgresql.org>
2024-07-12 03:34:53 +10:00
Eric Radman
28c39219af Update requests module to 2.32.2 (#7053)
2.32.0 was yanked
2024-07-11 16:24:18 +00:00
Ezra Odio
a37ef3b235 Fixed frontend test deprecation warnings (#7013)
Created Moment in ISO 8601 format instead of using
the default Date() constructor.

Co-authored-by: Ezra Odio <eodio@starfishstorage.com>
2024-07-08 14:29:41 +00:00
dependabot[bot]
0056aa68f8 Bump certifi from 2023.11.17 to 2024.7.4 (#7047)
Bumps [certifi](https://github.com/certifi/python-certifi) from 2023.11.17 to 2024.7.4.
- [Commits](https://github.com/certifi/python-certifi/compare/2023.11.17...2024.07.04)

---
updated-dependencies:
- dependency-name: certifi
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-08 22:57:01 +10:00
dependabot[bot]
76b5a30fd9 Bump ws from 5.2.3 to 5.2.4 in /viz-lib (#7040)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-01 01:19:54 +00:00
github-actions[bot]
db4fdd003e Snapshot: 24.07.0-dev 2024-07-01 00:31:06 +00:00
Ezra Odio
4cb32fc1c3 Map() implementation fix for chart labels (#7022)
Co-authored-by: Ezra Odio <eodio@starfishstorage.com>
Co-authored-by: Eric Radman <eradman@starfishstorage.com>
2024-06-18 18:05:37 +00:00
dependabot[bot]
a6c728b99c Bump ws from 5.2.3 to 5.2.4 (#7021)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 02:25:45 +00:00
dependabot[bot]
01e036d0a9 Bump urllib3 from 1.26.18 to 1.26.19 (#7020)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 01:55:21 +00:00
Eric Radman
17fe69f551 PG: Only list tables where schema has USAGE permission (#7000)
This covers cases where partitioned tables are part of a schema that is
not accessible by the current user.

CREATE SCHEMA xyz;

CREATE TABLE xyz.tab (
   id bigint GENERATED ALWAYS AS IDENTITY,
   ts timestamp NOT NULL
) PARTITION BY LIST ((ts::date));

CREATE TABLE xyz.tab_default PARTITION OF xyz.tab DEFAULT;
2024-06-06 10:49:00 +10:00
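The schema check can be expressed with PostgreSQL's has_schema_privilege(); a minimal sketch of that kind of filter, assuming a plain psycopg2 connection (the DSN and column list are illustrative, not the exact query in the PR):

```python
import psycopg2

# Only list tables whose schema the current role has USAGE on, mirroring the
# behaviour described in the commit above.
SCHEMA_QUERY = """
SELECT table_schema, table_name
FROM information_schema.tables
WHERE table_schema NOT IN ('pg_catalog', 'information_schema')
  AND has_schema_privilege(table_schema, 'USAGE');
"""

with psycopg2.connect("dbname=postgres") as conn:  # hypothetical DSN
    with conn.cursor() as cur:
        cur.execute(SCHEMA_QUERY)
        tables = cur.fetchall()
```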
Ezra Odio
bceaab0496 Update to Python 3.10 (#6991)
Updated from Python 3.8 to 3.10. Python 3.10 is the default for Ubuntu 22. This change necessitated upgrading to
SQLAlchemy_Utils 0.38.3, and importing the sort_query function from an older version of SQLAlchemy_Utils because it was dropped in newer versions.

Co-authored-by: Ezra Odio <eodio@starfishstorage.com>
2024-06-05 17:41:49 +10:00
Lucas Fernando Cardoso Nunes
70dd05916f ci: bot identity correction (#6997)
Signed-off-by: Lucas Fernando Cardoso Nunes <lucasfc.nunes@gmail.com>
2024-06-02 06:54:08 +10:00
github-actions
60a12e906e Snapshot: 24.06.0-dev 2024-06-01 00:27:53 +00:00
dependabot[bot]
ec051a8939 --- (#6981)
updated-dependencies:
- dependency-name: requests
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-21 06:50:52 +00:00
Arik Fraimovich
60d3c66a8b Merge pull request from GHSA-32fw-wc7f-7qg9 2024-05-18 07:36:29 -07:00
Justin Clift
bd4ba96c43 Typo fix in message (#6979) 2024-05-18 08:18:37 +00:00
Eric Radman
10a46fd33c Revert "show pg and athena column comments and table descriptions as antd tooltip if they are defined (#6582)" (#6971)
This reverts commit c12d45077a.

The reverted commit did not sort tables properly by schema, then name.
2024-05-16 11:28:42 +08:00
Eric Radman
c874eb6b11 Revert changes to job status (#6969)
"Query in queue" should switch to "Executing query", but does not.

Commands:

git revert --no-commit bd17662005
git revert --no-commit 5ac5d86f5e
vim tests/handlers/test_query_results.py
git add tests/handlers/test_query_results.py

Co-authored-by: Justin Clift <justin@postgresql.org>
2024-05-14 22:06:45 -04:00
Taehyung Lim
f3a323695f Bump pyodbc from 4.0.28 to 5.1.0 (#6962) 2024-05-14 16:26:28 +00:00
Eric Radman
408ba78bd0 Update MSSQL ODBC driver to v18 (#6968) 2024-05-15 01:55:32 +10:00
Eric Radman
58cc49bc88 Revert build (2 of 2) (#6967) 2024-05-14 12:30:48 +00:00
Eric Radman
753ea846ff Revert CI workflow (1 of 2) (#6965) 2024-05-14 21:54:51 +10:00
Taehyung Lim
1b946b59ec sync .nvmrc with workflow (#6958) 2024-05-10 21:32:52 +10:00
dependabot[bot]
4569191113 Bump jinja2 from 3.1.3 to 3.1.4 (#6951)
Bumps [jinja2](https://github.com/pallets/jinja) from 3.1.3 to 3.1.4.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/jinja/compare/3.1.3...3.1.4)

---
updated-dependencies:
- dependency-name: jinja2
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-07 07:14:20 +10:00
Justin Clift
62890c3ec4 Revert "Remove deprecated advocate package (#6944)"
This reverts commit bd115e7f5f, as
it turns out to be a useful security feature.

In order to remove this in a better way, we'll need to replace it
with something that provides equivalent functionality.
2024-05-07 03:20:05 +10:00
Andrii Chubatiuk
bd115e7f5f Remove deprecated advocate package (#6944) 2024-05-06 23:14:13 +10:00
Andrii Chubatiuk
bd17662005 Fixed error serialization (#6937)
* serialize errors

* lint fix

* cover successful case
2024-05-02 13:52:35 +00:00
Jason Cowley
b7f22b1896 Fix 'str' object has no attribute 'pop' error when parsing query (#6941) 2024-05-02 21:31:23 +10:00
Justin Clift
897c683980 pgautoupgrade now does multi-arch builds (#6939)
Thanks to substantial efforts by @andyundso, the Docker Hub
images for pgautoupgrade are now multi-arch (x86_64 and ARM64). :)
2024-05-01 23:03:06 +10:00
151 changed files with 7677 additions and 5117 deletions

.ci/compose.ci.yaml (new file, 25 lines)

@@ -0,0 +1,25 @@
services:
  redash:
    build: ../
    command: manage version
    depends_on:
      - postgres
      - redis
    ports:
      - "5000:5000"
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: "INFO"
      REDASH_REDIS_URL: "redis://redis:6379/0"
      POSTGRES_PASSWORD: "FmTKs5vX52ufKR1rd8tn4MoSP7zvCJwb"
      REDASH_DATABASE_URL: "postgresql://postgres:FmTKs5vX52ufKR1rd8tn4MoSP7zvCJwb@postgres/postgres"
      REDASH_COOKIE_SECRET: "2H9gNG9obnAQ9qnR9BDTQUph6CbXKCzF"
  redis:
    image: redis:7-alpine
    restart: unless-stopped
  postgres:
    image: pgautoupgrade/pgautoupgrade:latest
    command: "postgres -c fsync=off -c full_page_writes=off -c synchronous_commit=OFF"
    restart: unless-stopped
    environment:
      POSTGRES_HOST_AUTH_METHOD: "trust"

.ci/compose.cypress.yaml (new file, 73 lines)

@@ -0,0 +1,73 @@
x-redash-service: &redash-service
  build:
    context: ../
    args:
      install_groups: "main"
      code_coverage: ${CODE_COVERAGE}
x-redash-environment: &redash-environment
  REDASH_LOG_LEVEL: "INFO"
  REDASH_REDIS_URL: "redis://redis:6379/0"
  POSTGRES_PASSWORD: "FmTKs5vX52ufKR1rd8tn4MoSP7zvCJwb"
  REDASH_DATABASE_URL: "postgresql://postgres:FmTKs5vX52ufKR1rd8tn4MoSP7zvCJwb@postgres/postgres"
  REDASH_RATELIMIT_ENABLED: "false"
  REDASH_ENFORCE_CSRF: "true"
  REDASH_COOKIE_SECRET: "2H9gNG9obnAQ9qnR9BDTQUph6CbXKCzF"
services:
  server:
    <<: *redash-service
    command: server
    depends_on:
      - postgres
      - redis
    ports:
      - "5000:5000"
    environment:
      <<: *redash-environment
      PYTHONUNBUFFERED: 0
  scheduler:
    <<: *redash-service
    command: scheduler
    depends_on:
      - server
    environment:
      <<: *redash-environment
  worker:
    <<: *redash-service
    command: worker
    depends_on:
      - server
    environment:
      <<: *redash-environment
      PYTHONUNBUFFERED: 0
  cypress:
    ipc: host
    build:
      context: ../
      dockerfile: .ci/Dockerfile.cypress
    depends_on:
      - server
      - worker
      - scheduler
    environment:
      CYPRESS_baseUrl: "http://server:5000"
      CYPRESS_coverage: ${CODE_COVERAGE}
      PERCY_TOKEN: ${PERCY_TOKEN}
      PERCY_BRANCH: ${CIRCLE_BRANCH}
      PERCY_COMMIT: ${CIRCLE_SHA1}
      PERCY_PULL_REQUEST: ${CIRCLE_PR_NUMBER}
      COMMIT_INFO_BRANCH: ${CIRCLE_BRANCH}
      COMMIT_INFO_MESSAGE: ${COMMIT_INFO_MESSAGE}
      COMMIT_INFO_AUTHOR: ${CIRCLE_USERNAME}
      COMMIT_INFO_SHA: ${CIRCLE_SHA1}
      COMMIT_INFO_REMOTE: ${CIRCLE_REPOSITORY_URL}
      CYPRESS_PROJECT_ID: ${CYPRESS_PROJECT_ID}
      CYPRESS_RECORD_KEY: ${CYPRESS_RECORD_KEY}
  redis:
    image: redis:7-alpine
    restart: unless-stopped
  postgres:
    image: pgautoupgrade/pgautoupgrade:latest
    command: "postgres -c fsync=off -c full_page_writes=off -c synchronous_commit=OFF"
    restart: unless-stopped
    environment:
      POSTGRES_HOST_AUTH_METHOD: "trust"

.ci/docker_build (new executable file, 39 lines)

@@ -0,0 +1,39 @@
#!/bin/bash
# This script only needs to run on the main Redash repo
if [ "${GITHUB_REPOSITORY}" != "getredash/redash" ]; then
echo "Skipping image build for Docker Hub, as this isn't the main Redash repository"
exit 0
fi
if [ "${GITHUB_REF_NAME}" != "master" ] && [ "${GITHUB_REF_NAME}" != "preview-image" ]; then
echo "Skipping image build for Docker Hub, as this isn't the 'master' nor 'preview-image' branch"
exit 0
fi
if [ "x${DOCKER_USER}" = "x" ] || [ "x${DOCKER_PASS}" = "x" ]; then
echo "Skipping image build for Docker Hub, as the login details aren't available"
exit 0
fi
set -e
VERSION=$(jq -r .version package.json)
VERSION_TAG="$VERSION.b${GITHUB_RUN_ID}.${GITHUB_RUN_NUMBER}"
export DOCKER_BUILDKIT=1
export COMPOSE_DOCKER_CLI_BUILD=1
docker login -u "${DOCKER_USER}" -p "${DOCKER_PASS}"
DOCKERHUB_REPO="redash/redash"
DOCKER_TAGS="-t redash/redash:preview -t redash/preview:${VERSION_TAG}"
# Build the docker container
docker build --build-arg install_groups="main,all_ds,dev" ${DOCKER_TAGS} .
# Push the container to the preview build locations
docker push "${DOCKERHUB_REPO}:preview"
docker push "redash/preview:${VERSION_TAG}"
echo "Built: ${VERSION_TAG}"

.ci/pack (new executable file, 9 lines)

@@ -0,0 +1,9 @@
#!/bin/bash
NAME=redash
VERSION=$(jq -r .version package.json)
FULL_VERSION=$VERSION+b$CIRCLE_BUILD_NUM
FILENAME=$NAME.$FULL_VERSION.tar.gz
mkdir -p /tmp/artifacts/
tar -zcv -f /tmp/artifacts/$FILENAME --exclude=".git" --exclude="optipng*" --exclude="cypress" --exclude="*.pyc" --exclude="*.pyo" --exclude="venv" *

.ci/update_version (new executable file, 6 lines)

@@ -0,0 +1,6 @@
#!/bin/bash
VERSION=$(jq -r .version package.json)
FULL_VERSION=${VERSION}+b${GITHUB_RUN_ID}.${GITHUB_RUN_NUMBER}
sed -ri "s/^__version__ = '([A-Za-z0-9.-]*)'/__version__ = '${FULL_VERSION}'/" redash/__init__.py
sed -i "s/dev/${GITHUB_SHA}/" client/app/version.json

.github/workflows/ci.yml (modified)

@@ -3,29 +3,12 @@ on:
push:
branches:
- master
tags:
- '*'
pull_request_target:
pull_request:
branches:
- master
env:
CYPRESS_COVERAGE: "true"
NODE_VERSION: 18
YARN_VERSION: 1.22.22
REDASH_COOKIE_SECRET: 2H9gNG9obnAQ9qnR9BDTQUph6CbXKCzF
REDASH_SECRET_KEY: 2H9gNG9obnAQ9qnR9BDTQUph6CbXKCzF
COMPOSE_DOCKER_CLI_BUILD: 1
DOCKER_BUILDKIT: 1
FRONTEND_BUILD_MODE: 1
INSTALL_GROUPS: main,all_ds,dev
PERCY_BRANCH: ${{github.head_ref || github.ref_name}}
PERCY_COMMIT: ${{github.sha}}
PERCY_PULL_REQUEST: ${{github.event.number}}
COMMIT_INFO_BRANCH: ${{github.head_ref || github.ref_name}}
COMMIT_INFO_MESSAGE: ${{github.event.head_commit.message}}
COMMIT_INFO_AUTHOR: ${{github.event.pull_request.user.login}}
COMMIT_INFO_SHA: ${{github.sha}}
COMMIT_INFO_REMOTE: ${{github.server_url}}/${{github.repository}}
jobs:
backend-lint:
runs-on: ubuntu-22.04
@@ -40,7 +23,7 @@ jobs:
- uses: actions/setup-python@v5
with:
python-version: '3.8'
- run: sudo pip install black==24.3.0 ruff==0.1.9
- run: sudo pip install black==23.1.0 ruff==0.0.287
- run: ruff check .
- run: black --check .
@@ -48,7 +31,10 @@ jobs:
runs-on: ubuntu-22.04
needs: backend-lint
env:
FRONTEND_BUILD_MODE: 0
COMPOSE_FILE: .ci/compose.ci.yaml
COMPOSE_PROJECT_NAME: redash
COMPOSE_DOCKER_CLI_BUILD: 1
DOCKER_BUILDKIT: 1
steps:
- if: github.event.pull_request.mergeable == 'false'
name: Exit if PR is not mergeable
@@ -60,25 +46,24 @@ jobs:
- name: Build Docker Images
run: |
set -x
touch .env
docker compose build
docker compose build --build-arg install_groups="main,all_ds,dev" --build-arg skip_frontend_build=true
docker compose up -d
sleep 10
- name: Create Test Database
run: docker compose run --rm postgres psql -h postgres -U postgres -c "create database tests;"
run: docker compose -p redash run --rm postgres psql -h postgres -U postgres -c "create database tests;"
- name: List Enabled Query Runners
run: docker compose run --rm server manage ds list_types
run: docker compose -p redash run --rm redash manage ds list_types
- name: Run Tests
run: docker compose run --name tests server tests --junitxml=junit.xml --cov-report=xml --cov=redash --cov-config=.coveragerc tests/
run: docker compose -p redash run --name tests redash tests --junitxml=junit.xml --cov-report=xml --cov=redash --cov-config=.coveragerc tests/
- name: Copy Test Results
run: |
mkdir -p /tmp/test-results/unit-tests
docker cp tests:/app/coverage.xml ./coverage.xml
docker cp tests:/app/junit.xml /tmp/test-results/unit-tests/results.xml
- name: Upload coverage reports to Codecov
uses: codecov/codecov-action@v3
with:
token: ${{ secrets.CODECOV_TOKEN }}
# - name: Upload coverage reports to Codecov
# uses: codecov/codecov-action@v3
# with:
# token: ${{ secrets.CODECOV_TOKEN }}
- name: Store Test Results
uses: actions/upload-artifact@v4
with:
@@ -87,7 +72,7 @@ jobs:
- name: Store Coverage Results
uses: actions/upload-artifact@v4
with:
name: backend-coverage
name: coverage
path: coverage.xml
frontend-lint:
@@ -107,8 +92,7 @@ jobs:
- name: Install Dependencies
run: |
npm install --global --force yarn@$YARN_VERSION
yarn cache clean
yarn --frozen-lockfile --network-concurrency 1
yarn cache clean && yarn --frozen-lockfile --network-concurrency 1
- name: Run Lint
run: yarn lint:ci
- name: Store Test Results
@@ -135,27 +119,24 @@ jobs:
- name: Install Dependencies
run: |
npm install --global --force yarn@$YARN_VERSION
yarn cache clean
yarn --frozen-lockfile --network-concurrency 1
yarn cache clean && yarn --frozen-lockfile --network-concurrency 1
- name: Run App Tests
run: yarn test
- name: Run Visualizations Tests
run: |
cd viz-lib
yarn test
run: cd viz-lib && yarn test
- run: yarn lint
frontend-e2e-tests:
runs-on: ubuntu-22.04
needs: frontend-lint
env:
COMPOSE_FILE: .ci/compose.cypress.yaml
COMPOSE_PROJECT_NAME: cypress
CYPRESS_INSTALL_BINARY: 0
PUPPETEER_SKIP_CHROMIUM_DOWNLOAD: 1
INSTALL_GROUPS: main
COMPOSE_PROFILES: e2e
PERCY_TOKEN: ${{ secrets.PERCY_TOKEN }}
CYPRESS_PROJECT_ID: ${{ secrets.CYPRESS_PROJECT_ID }}
CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}
# PERCY_TOKEN: ${{ secrets.PERCY_TOKEN }}
# CYPRESS_PROJECT_ID: ${{ secrets.CYPRESS_PROJECT_ID }}
# CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}
steps:
- if: github.event.pull_request.mergeable == 'false'
name: Exit if PR is not mergeable
@@ -168,16 +149,17 @@ jobs:
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'yarn'
- name: Enable Code Coverage Report For Master Branch
if: endsWith(github.ref, '/master')
run: |
echo "CODE_COVERAGE=true" >> "$GITHUB_ENV"
- name: Install Dependencies
run: |
npm install --global --force yarn@$YARN_VERSION
yarn cache clean
yarn --frozen-lockfile --network-concurrency 1
yarn cache clean && yarn --frozen-lockfile --network-concurrency 1
- name: Setup Redash Server
run: |
set -x
touch .env
yarn build
yarn cypress build
yarn cypress start -- --skip-db-seed
docker compose run cypress yarn cypress db-seed
@@ -191,10 +173,5 @@ jobs:
- name: Store Coverage Results
uses: actions/upload-artifact@v4
with:
name: frontend-coverage
name: coverage
path: coverage
- uses: actions/upload-artifact@v4
with:
name: frontend
path: client/dist
retention-days: 1

.github/workflows/periodic-snapshot.yml (modified)

@@ -1,11 +1,24 @@
name: Periodic Snapshot
# 10 minutes after midnight on the first of every month
on:
schedule:
- cron: "10 0 1 * *"
- cron: '10 0 1 * *' # 10 minutes after midnight on the first of every month
workflow_dispatch:
inputs:
bump:
description: 'Bump the last digit of the version'
required: false
type: boolean
version:
description: 'Specific version to set'
required: false
default: ''
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
permissions:
actions: write
contents: write
jobs:
@@ -15,14 +28,58 @@ jobs:
- uses: actions/checkout@v4
with:
ssh-key: ${{ secrets.ACTION_PUSH_KEY }}
- run: |
date="$(date +%y.%m).0-dev"
gawk -i inplace -F: -v q=\" -v tag=$date '/^ "version": / { print $1 FS, q tag q ","; next} { print }' package.json
gawk -i inplace -F= -v q=\" -v tag=$date '/^__version__ =/ { print $1 FS, q tag q; next} { print }' redash/__init__.py
gawk -i inplace -F= -v q=\" -v tag=$date '/^version =/ { print $1 FS, q tag q; next} { print }' pyproject.toml
git config user.name github-actions
git config user.email github-actions@github.com
git config user.name 'github-actions[bot]'
git config user.email '41898282+github-actions[bot]@users.noreply.github.com'
# Function to bump the version
bump_version() {
local version="$1"
local IFS=.
read -r major minor patch <<< "$version"
patch=$((patch + 1))
echo "$major.$minor.$patch-dev"
}
# Determine the new version tag
if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
BUMP_INPUT="${{ github.event.inputs.bump }}"
SPECIFIC_VERSION="${{ github.event.inputs.version }}"
# Check if both bump and specific version are provided
if [ "$BUMP_INPUT" = "true" ] && [ -n "$SPECIFIC_VERSION" ]; then
echo "::error::Error: Cannot specify both bump and specific version."
exit 1
fi
if [ -n "$SPECIFIC_VERSION" ]; then
TAG_NAME="$SPECIFIC_VERSION-dev"
elif [ "$BUMP_INPUT" = "true" ]; then
CURRENT_VERSION=$(grep '"version":' package.json | awk -F\" '{print $4}')
TAG_NAME=$(bump_version "$CURRENT_VERSION")
else
echo "No version bump or specific version provided for manual dispatch."
exit 1
fi
else
TAG_NAME="$(date +%y.%m).0-dev"
fi
echo "New version tag: $TAG_NAME"
# Update version in files
gawk -i inplace -F: -v q=\" -v tag=${TAG_NAME} '/^ "version": / { print $1 FS, q tag q ","; next} { print }' package.json
gawk -i inplace -F= -v q=\" -v tag=${TAG_NAME} '/^__version__ =/ { print $1 FS, q tag q; next} { print }' redash/__init__.py
gawk -i inplace -F= -v q=\" -v tag=${TAG_NAME} '/^version =/ { print $1 FS, q tag q; next} { print }' pyproject.toml
git add package.json redash/__init__.py pyproject.toml
git commit -m "Snapshot: ${date}"
git tag $date
git push --atomic origin master refs/tags/$date
git commit -m "Snapshot: ${TAG_NAME}"
git tag ${TAG_NAME}
git push --atomic origin master refs/tags/${TAG_NAME}
# Run the 'preview-image' workflow if run this workflow manually
# For more information, please see the: https://docs.github.com/en/actions/security-guides/automatic-token-authentication
if [ "$BUMP_INPUT" = "true" ] || [ -n "$SPECIFIC_VERSION" ]; then
gh workflow run preview-image.yml --ref $TAG_NAME
fi

.github/workflows/preview-image.yml (modified)

@@ -1,20 +1,25 @@
name: Preview Image
on:
workflow_run:
workflows:
- Tests
types:
- completed
branches:
- master
push:
tags:
- '*-dev'
workflow_dispatch:
inputs:
dockerRepository:
description: 'Docker repository'
required: true
default: 'preview'
type: choice
options:
- preview
- redash
env:
DOCKER_REPO: redash
NODE_VERSION: 18
jobs:
build-skip-check:
runs-on: ubuntu-22.04
if: ${{ github.event.workflow_run.conclusion == 'success' }}
outputs:
skip: ${{ steps.skip-check.outputs.skip }}
steps:
@@ -34,121 +39,144 @@ jobs:
fi
build-docker-image:
runs-on: ubuntu-22.04
needs:
- build-skip-check
outputs:
version: ${{ steps.version.outputs.VERSION_TAG }}
repo: ${{ steps.version.outputs.DOCKER_REPO }}
if: needs.build-skip-check.outputs.skip == 'false'
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
matrix:
platform:
- linux/amd64
- linux/arm64
arch:
- amd64
- arm64
include:
- arch: amd64
os: ubuntu-22.04
- arch: arm64
os: ubuntu-22.04-arm
outputs:
VERSION_TAG: ${{ steps.version.outputs.VERSION_TAG }}
needs:
- build-skip-check
if: needs.build-skip-check.outputs.skip == 'false'
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 1
ref: ${{ github.event.push.after }}
- uses: dawidd6/action-download-artifact@v3
- uses: actions/setup-node@v4
with:
name: frontend
workflow: ci.yml
github_token: ${{ secrets.GITHUB_TOKEN }}
run_id: ${{ github.event.workflow_run.id }}
path: client/dist
node-version: ${{ env.NODE_VERSION }}
cache: 'yarn'
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Login to DockerHub
uses: docker/login-action@v3
with:
username: ${{ vars.DOCKER_USER }}
password: ${{ secrets.DOCKER_PASS }}
- name: Install Dependencies
env:
PUPPETEER_SKIP_CHROMIUM_DOWNLOAD: true
run: |
npm install --global --force yarn@1.22.22
yarn cache clean && yarn --frozen-lockfile --network-concurrency 1
- name: Set version
id: version
run: |
set -x
VERSION=$(jq -r .version package.json)
FULL_VERSION=${VERSION}-b${GITHUB_RUN_ID}.${GITHUB_RUN_NUMBER}
sed -ri "s/^__version__ = ([A-Za-z0-9.-]*)'/__version__ = '${FULL_VERSION}'/" redash/__init__.py
sed -i "s/dev/${GITHUB_SHA}/" client/app/version.json
echo "VERSION_TAG=$FULL_VERSION" >> "$GITHUB_OUTPUT"
platform=${{ matrix.platform }}
echo "PLATFORM_PAIR=${platform//\//-}" >> $GITHUB_ENV
echo "SCOPE=${platform//\//-}" >> $GITHUB_ENV
if [[ "${{ vars.DOCKER_REPO }}" != "" ]]; then
echo "DOCKER_REPO=${{ vars.DOCKER_REPO }}" >> $GITHUB_ENV
echo "DOCKER_REPO=${{ vars.DOCKER_REPO }}" >> $GITHUB_OUTPUT
else
echo "DOCKER_REPO=${DOCKER_REPO}" >> $GITHUB_ENV
echo "DOCKER_REPO=${DOCKER_REPO}" >> $GITHUB_OUTPUT
fi
.ci/update_version
VERSION_TAG=$(jq -r .version package.json)
echo "VERSION_TAG=$VERSION_TAG" >> "$GITHUB_OUTPUT"
- name: Build and push preview image to Docker Hub
uses: docker/build-push-action@v5
id: build
id: build-preview
uses: docker/build-push-action@v4
if: ${{ github.event.inputs.dockerRepository == 'preview' || !github.event.workflow_run }}
with:
push: true
tags: |
${{ vars.DOCKER_USER }}/redash
${{ vars.DOCKER_USER }}/preview
context: .
cache-from: type=gha,scope=${{ env.SCOPE }}
cache-to: type=gha,mode=max,scope=${{ env.SCOPE }}
platforms: ${{ matrix.platform }}
outputs: type=image,name=${{ env.DOCKER_REPO }}/redash,push-by-digest=true,name-canonical=true,push=true
build-args: |
FRONTEND_BUILD_MODE=1
test_all_deps=true
outputs: type=image,push-by-digest=true,push=true
cache-from: type=gha,scope=${{ matrix.arch }}
cache-to: type=gha,mode=max,scope=${{ matrix.arch }}
env:
DOCKER_CONTENT_TRUST: true
- name: Build and push release image to Docker Hub
id: build-release
uses: docker/build-push-action@v4
if: ${{ github.event.inputs.dockerRepository == 'redash' }}
with:
tags: |
${{ vars.DOCKER_USER }}/redash:${{ steps.version.outputs.VERSION_TAG }}
context: .
build-args: |
test_all_deps=true
outputs: type=image,push-by-digest=true,push=true
cache-from: type=gha,scope=${{ matrix.arch }}
cache-to: type=gha,mode=max,scope=${{ matrix.arch }}
env:
DOCKER_CONTENT_TRUST: true
- name: "Failure: output container logs to console"
if: failure()
run: docker compose logs
- name: Export digest
run: |
mkdir -p /tmp/digests
digest="${{ steps.build.outputs.digest }}"
touch "/tmp/digests/${digest#sha256:}"
mkdir -p ${{ runner.temp }}/digests
if [[ "${{ github.event.inputs.dockerRepository }}" == 'preview' || !github.event.workflow_run ]]; then
digest="${{ steps.build-preview.outputs.digest}}"
else
digest="${{ steps.build-release.outputs.digest}}"
fi
touch "${{ runner.temp }}/digests/${digest#sha256:}"
- name: Upload digest
uses: actions/upload-artifact@v4
with:
name: digests-${{ env.PLATFORM_PAIR }}
path: /tmp/digests/*
name: digests-${{ matrix.arch }}
path: ${{ runner.temp }}/digests/*
if-no-files-found: error
retention-days: 1
publish-docker-manifest:
merge-docker-image:
runs-on: ubuntu-22.04
needs:
- build-skip-check
- build-docker-image
if: needs.build-skip-check.outputs.skip == 'false'
needs: build-docker-image
steps:
- name: Download digests
uses: actions/download-artifact@v4
with:
pattern: digests-*
path: /tmp/digests
merge-multiple: true
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Docker meta
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ needs.build-docker-image.outputs.repo }}/redash
tags: preview
- name: Login to DockerHub
uses: docker/login-action@v3
with:
username: ${{ vars.DOCKER_USER }}
password: ${{ secrets.DOCKER_PASS }}
- name: Create manifest list and push
working-directory: /tmp/digests
- name: Download digests
uses: actions/download-artifact@v4
with:
path: ${{ runner.temp }}/digests
pattern: digests-*
merge-multiple: true
- name: Create and push manifest for the preview image
if: ${{ github.event.inputs.dockerRepository == 'preview' || !github.event.workflow_run }}
working-directory: ${{ runner.temp }}/digests
run: |
docker buildx imagetools create $(jq -cr '.tags | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \
$(printf '${{ needs.build-docker-image.outputs.repo }}/redash@sha256:%s ' *)
- name: Inspect image
docker buildx imagetools create -t ${{ vars.DOCKER_USER }}/redash:preview \
$(printf '${{ vars.DOCKER_USER }}/redash:preview@sha256:%s ' *)
docker buildx imagetools create -t ${{ vars.DOCKER_USER }}/preview:${{ needs.build-docker-image.outputs.VERSION_TAG }} \
$(printf '${{ vars.DOCKER_USER }}/preview:${{ needs.build-docker-image.outputs.VERSION_TAG }}@sha256:%s ' *)
- name: Create and push manifest for the release image
if: ${{ github.event.inputs.dockerRepository == 'redash' }}
working-directory: ${{ runner.temp }}/digests
run: |
REDASH_IMAGE="${{ needs.build-docker-image.outputs.repo }}/redash:${{ steps.meta.outputs.version }}"
docker buildx imagetools inspect $REDASH_IMAGE
- name: Push image ${{ needs.build-docker-image.outputs.repo }}/preview image
run: |
REDASH_IMAGE="${{ needs.build-docker-image.outputs.repo }}/redash:preview"
PREVIEW_IMAGE="${{ needs.build-docker-image.outputs.repo }}/preview:${{ needs.build-docker-image.outputs.version }}"
docker buildx imagetools create --tag $PREVIEW_IMAGE $REDASH_IMAGE
docker buildx imagetools create -t ${{ vars.DOCKER_USER }}/redash:${{ needs.build-docker-image.outputs.VERSION_TAG }} \
$(printf '${{ vars.DOCKER_USER }}/redash:${{ needs.build-docker-image.outputs.VERSION_TAG }}@sha256:%s ' *)

.github/workflows/restyled.yml (new file, 36 lines)

@@ -0,0 +1,36 @@
name: Restyled
on:
pull_request:
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
restyled:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
- uses: restyled-io/actions/setup@v4
- id: restyler
uses: restyled-io/actions/run@v4
with:
fail-on-differences: true
- if: |
!cancelled() &&
steps.restyler.outputs.success == 'true' &&
github.event.pull_request.head.repo.full_name == github.repository
uses: peter-evans/create-pull-request@v6
with:
base: ${{ steps.restyler.outputs.restyled-base }}
branch: ${{ steps.restyler.outputs.restyled-head }}
title: ${{ steps.restyler.outputs.restyled-title }}
body: ${{ steps.restyler.outputs.restyled-body }}
labels: "restyled"
reviewers: ${{ github.event.pull_request.user.login }}
delete-branch: true

.gitignore (1 line changed)

@@ -17,6 +17,7 @@ client/dist
_build
.vscode
.env
.tool-versions
dump.rdb

.nvmrc (2 lines changed)

@@ -1 +1 @@
v16.20.1
v18

.restyled.yaml (modified)

@@ -38,7 +38,9 @@ request_review: author
#
# These can be used to tell other automation to avoid our PRs.
#
labels: ["Skip CI"]
labels:
- restyled
- "Skip CI"
# Labels to ignore
#
@@ -50,13 +52,13 @@ labels: ["Skip CI"]
# Restylers to run, and how
restylers:
- name: black
image: restyled/restyler-black:v19.10b0
image: restyled/restyler-black:v24.4.2
include:
- redash
- tests
- migrations/versions
- name: prettier
image: restyled/restyler-prettier:v1.19.1-2
image: restyled/restyler-prettier:v3.3.2-2
command:
- prettier
- --write

Dockerfile (modified)

@@ -1,37 +1,43 @@
# Controls whether to build the frontend assets
ARG FRONTEND_BUILD_MODE=0
FROM node:18-bookworm AS frontend-builder
# MODE 0: create empty files. useful for backend tests
FROM alpine:3.19 as frontend-builder-0
RUN \
mkdir -p /frontend/client/dist && \
touch /frontend/client/dist/multi_org.html && \
touch /frontend/client/dist/index.html
# MODE 1: copy static frontend from host, useful for CI to ignore building static content multiple times
FROM alpine:3.19 as frontend-builder-1
COPY client/dist /frontend/client/dist
# MODE 2: build static content in docker, can be used for a local development
FROM node:18-bookworm as frontend-builder-2
RUN npm install --global --force yarn@1.22.22
# Controls whether to build the frontend assets
ARG skip_frontend_build
ENV CYPRESS_INSTALL_BINARY=0
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=1
RUN useradd -m -d /frontend redash
USER redash
WORKDIR /frontend
COPY --chown=redash package.json yarn.lock .yarnrc /frontend/
COPY --chown=redash viz-lib /frontend/viz-lib
COPY --chown=redash scripts /frontend/scripts
RUN yarn --frozen-lockfile --network-concurrency 1;
# Controls whether to instrument code for coverage information
ARG code_coverage
ENV BABEL_ENV=${code_coverage:+test}
# Avoid issues caused by lags in disk and network I/O speeds when working on top of QEMU emulation for multi-platform image building.
RUN yarn config set network-timeout 300000
RUN if [ "x$skip_frontend_build" = "x" ] ; then yarn --frozen-lockfile --network-concurrency 1; fi
COPY --chown=redash client /frontend/client
COPY --chown=redash webpack.config.js /frontend/
RUN yarn build
RUN <<EOF
if [ "x$skip_frontend_build" = "x" ]; then
yarn build
else
mkdir -p /frontend/client/dist
touch /frontend/client/dist/multi_org.html
touch /frontend/client/dist/index.html
fi
EOF
FROM frontend-builder-${FRONTEND_BUILD_MODE} as frontend-builder
FROM python:3.8-slim-bookworm
FROM python:3.10-slim-bookworm
EXPOSE 5000
@@ -66,39 +72,44 @@ RUN apt-get update && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*
RUN \
curl https://packages.microsoft.com/config/debian/12/prod.list > /etc/apt/sources.list.d/mssql-release.list && \
curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor -o /usr/share/keyrings/microsoft-prod.gpg && \
apt update && \
ACCEPT_EULA=Y apt install -y --no-install-recommends msodbcsql18 && \
apt clean && \
rm -rf /var/lib/apt/lists/*
ARG TARGETPLATFORM
ARG databricks_odbc_driver_url=https://databricks-bi-artifacts.s3.us-east-2.amazonaws.com/simbaspark-drivers/odbc/2.6.26/SimbaSparkODBC-2.6.26.1045-Debian-64bit.zip
RUN if [ "$TARGETPLATFORM" = "linux/amd64" ]; then \
curl "$databricks_odbc_driver_url" --location --output /tmp/simba_odbc.zip \
&& chmod 600 /tmp/simba_odbc.zip \
&& unzip /tmp/simba_odbc.zip -d /tmp/simba \
&& dpkg -i /tmp/simba/*.deb \
&& printf "[Simba]\nDriver = /opt/simba/spark/lib/64/libsparkodbc_sb64.so" >> /etc/odbcinst.ini \
&& rm /tmp/simba_odbc.zip \
&& rm -rf /tmp/simba; fi
RUN <<EOF
if [ "$TARGETPLATFORM" = "linux/amd64" ]; then
curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor -o /usr/share/keyrings/microsoft-prod.gpg
curl https://packages.microsoft.com/config/debian/12/prod.list > /etc/apt/sources.list.d/mssql-release.list
apt-get update
ACCEPT_EULA=Y apt-get install -y --no-install-recommends msodbcsql18
apt-get clean
rm -rf /var/lib/apt/lists/*
curl "$databricks_odbc_driver_url" --location --output /tmp/simba_odbc.zip
chmod 600 /tmp/simba_odbc.zip
unzip /tmp/simba_odbc.zip -d /tmp/simba
dpkg -i /tmp/simba/*.deb
printf "[Simba]\nDriver = /opt/simba/spark/lib/64/libsparkodbc_sb64.so" >> /etc/odbcinst.ini
rm /tmp/simba_odbc.zip
rm -rf /tmp/simba
fi
EOF
WORKDIR /app
ENV POETRY_VERSION=1.6.1
ENV POETRY_VERSION=1.8.3
ENV POETRY_HOME=/etc/poetry
ENV POETRY_VIRTUALENVS_CREATE=false
RUN curl -sSL https://install.python-poetry.org | python3 -
# Avoid crashes, including corrupted cache artifacts, when building multi-platform images with GitHub Actions.
RUN /etc/poetry/bin/poetry cache clear pypi --all
COPY pyproject.toml poetry.lock ./
ARG POETRY_OPTIONS="--no-root --no-interaction --no-ansi"
# for LDAP authentication, install with `ldap3` group
# disabled by default due to GPL license conflict
ARG INSTALL_GROUPS="main,all_ds,dev"
RUN /etc/poetry/bin/poetry install --only $INSTALL_GROUPS $POETRY_OPTIONS
ARG install_groups="main,all_ds,dev"
RUN /etc/poetry/bin/poetry install --only $install_groups $POETRY_OPTIONS
COPY --chown=redash . /app
COPY --from=frontend-builder --chown=redash /frontend/client/dist /app/client/dist

Makefile (modified)

@@ -1,18 +1,14 @@
.PHONY: compose_build up test_db create_database create_db clean clean-all down tests lint backend-unit-tests frontend-unit-tests pydeps test build watch start redis-cli bash
export COMPOSE_DOCKER_CLI_BUILD=1
export DOCKER_BUILDKIT=1
export COMPOSE_PROFILES=local
.PHONY: compose_build up test_db create_database clean clean-all down tests lint backend-unit-tests frontend-unit-tests test build watch start redis-cli bash
compose_build: .env
docker compose build
COMPOSE_DOCKER_CLI_BUILD=1 DOCKER_BUILDKIT=1 docker compose build
up:
docker compose up -d redis postgres
docker compose up -d redis postgres --remove-orphans
docker compose exec -u postgres postgres psql postgres --csv \
-1tqc "SELECT table_name FROM information_schema.tables WHERE table_name = 'organizations'" 2> /dev/null \
| grep -q "organizations" || make create_database
docker compose up -d --build
COMPOSE_DOCKER_CLI_BUILD=1 DOCKER_BUILDKIT=1 docker compose up -d --build --remove-orphans
test_db:
@for i in `seq 1 5`; do \
@@ -21,11 +17,9 @@ test_db:
done
docker compose exec postgres sh -c 'psql -U postgres -c "drop database if exists tests;" && psql -U postgres -c "create database tests;"'
create_db: .env
create_database: .env
docker compose run server create_db
create_database: create_db
clean:
docker compose down
docker compose --project-name cypress down
@@ -40,7 +34,7 @@ clean:
clean-all: clean
docker image rm --force \
redash/redash:10.1.0.b50633 redis:7-alpine maildev/maildev:latest \
redash/redash:latest redis:7-alpine maildev/maildev:latest \
pgautoupgrade/pgautoupgrade:15-alpine3.8 pgautoupgrade/pgautoupgrade:latest
down:
@@ -54,12 +48,6 @@ env: .env
format:
pre-commit run --all-files
pydeps:
pip3 install wheel
pip3 install --upgrade black ruff launchpadlib pip setuptools
pip3 install poetry
poetry install --only main,all_ds,dev
tests:
docker compose run server tests

bin/docker-entrypoint (modified)

@@ -1,48 +1,25 @@
#!/bin/bash
set -e
if [ -z $REDASH_REDIS_URL ]; then
export REDASH_REDIS_URL=redis://:${REDASH_REDIS_PASSWORD}@${REDASH_REDIS_HOSTNAME}:${REDASH_REDIS_PORT}/${REDASH_REDIS_NAME}
fi
if [ -z $REDASH_DATABASE_URL ]; then
export REDASH_DATABASE_URL=postgresql://${REDASH_DATABASE_USER}:${REDASH_DATABASE_PASSWORD}@${REDASH_DATABASE_HOSTNAME}:${REDASH_DATABASE_PORT}/${REDASH_DATABASE_NAME}
fi
scheduler() {
echo "Starting RQ scheduler..."
case $REDASH_PRODUCTION in
true)
echo "Starting RQ scheduler in production mode"
exec ./manage.py rq scheduler
;;
*)
echo "Starting RQ scheduler in dev mode"
exec watchmedo auto-restart \
--directory=./redash/ \
--pattern=*.py \
--recursive -- ./manage.py rq scheduler $QUEUES
;;
esac
exec /app/manage.py rq scheduler
}
dev_scheduler() {
echo "Starting dev RQ scheduler..."
exec watchmedo auto-restart --directory=./redash/ --pattern=*.py --recursive -- ./manage.py rq scheduler
}
worker() {
echo "Starting RQ worker..."
export WORKERS_COUNT=${WORKERS_COUNT:-2}
export QUEUES=${QUEUES:-}
case $REDASH_PRODUCTION in
true)
echo "Starting RQ worker in production mode"
exec supervisord -c worker.conf
;;
*)
echo "Starting RQ worker in dev mode"
exec watchmedo auto-restart \
--directory=./redash/ \
--pattern=*.py \
--recursive -- ./manage.py rq worker $QUEUES
;;
esac
}
workers_healthcheck() {
@@ -58,63 +35,22 @@ workers_healthcheck() {
fi
}
dev_worker() {
echo "Starting dev RQ worker..."
exec watchmedo auto-restart --directory=./redash/ --pattern=*.py --recursive -- ./manage.py rq worker $QUEUES
}
server() {
# Recycle gunicorn workers every n-th request. See http://docs.gunicorn.org/en/stable/settings.html#max-requests for more details.
case $REDASH_PRODUCTION in
true)
echo "Starting Redash Server in production mode"
MAX_REQUESTS=${MAX_REQUESTS:-1000}
MAX_REQUESTS_JITTER=${MAX_REQUESTS_JITTER:-100}
TIMEOUT=${REDASH_GUNICORN_TIMEOUT:-60}
exec /usr/local/bin/gunicorn \
-b 0.0.0.0:5000 \
--name redash \
-w${REDASH_WEB_WORKERS:-4} redash.wsgi:app \
--max-requests $MAX_REQUESTS \
--max-requests-jitter $MAX_REQUESTS_JITTER \
--timeout $TIMEOUT
;;
*)
echo "Starting Redash Server in a dev mode"
export FLASK_DEBUG=1
exec /app/manage.py runserver --debugger --reload -h 0.0.0.0
;;
esac
exec /usr/local/bin/gunicorn -b 0.0.0.0:5000 --name redash -w${REDASH_WEB_WORKERS:-4} redash.wsgi:app --max-requests $MAX_REQUESTS --max-requests-jitter $MAX_REQUESTS_JITTER --timeout $TIMEOUT
}
create_db() {
REDASH_DATABASE_MIGRATE_TIMEOUT=${REDASH_DATABASE_UPGRADE_TIMEOUT:-600}
REDASH_DATABASE_MIGRATE_MAX_ATTEMPTS=${REDASH_DATABASE_MIGRATE_MAX_ATTEMPTS:-5}
REDASH_DATABASE_MIGRATE_RETRY_WAIT=${REDASH_DATABASE_MIGRATE_RETRY_WAIT:-10}
ATTEMPTS=1
while ((ATTEMPTS <= REDASH_DATABASE_MIGRATE_MAX_ATTEMPTS)); do
echo "Creating or updating Redash database, attempt ${ATTEMPTS} of ${REDASH_DATABASE_MIGRATE_MAX_ATTEMPTS}"
ATTEMPTS=$((ATTEMPTS+1))
timeout $REDASH_DATABASE_MIGRATE_TIMEOUT /app/manage.py database create_tables
timeout $REDASH_DATABASE_MIGRATE_TIMEOUT /app/manage.py db upgrade
STATUS=$(timeout $REDASH_DATABASE_MIGRATE_TIMEOUT /app/manage.py status 2>&1)
RETCODE=$?
case "$RETCODE" in
0)
exit 0
;;
124)
echo "Status command timed out after ${REDASH_DATABASE_MIGRATE_TIMEOUT} seconds."
;;
esac
case "$STATUS" in
*sqlalchemy.exc.OperationalError*)
echo "Database not yet functional, waiting."
;;
*sqlalchemy.exc.ProgrammingError*)
echo "Database does not appear to be installed."
;;
esac
echo "Waiting ${REDASH_DATABASE_MIGRATE_RETRY_WAIT} seconds before retrying."
sleep ${REDASH_DATABASE_MIGRATE_RETRY_WAIT}
done
echo "Reached ${REDASH_DATABASE_MIGRATE_MAX_ATTEMPTS} attempts, giving up."
exit 1
exec /app/manage.py database create_tables
}
help() {
@@ -125,16 +61,21 @@ help() {
echo "server -- start Redash server (with gunicorn)"
echo "worker -- start a single RQ worker"
echo "dev_worker -- start a single RQ worker with code reloading"
echo "scheduler -- start an rq-scheduler instance"
echo "dev_scheduler -- start an rq-scheduler instance with code reloading"
echo ""
echo "shell -- open shell"
echo "debug -- start Flask development server with remote debugger via ptvsd"
echo "create_db -- create database tables and run migrations"
echo "dev_server -- start Flask development server with debugger and auto reload"
echo "debug -- start Flask development server with remote debugger via debugpy"
echo "create_db -- create database tables"
echo "manage -- CLI to manage redash"
echo "tests -- run tests"
}
tests() {
export REDASH_DATABASE_URL="postgresql://postgres@postgres/tests"
if [ $# -eq 0 ]; then
TEST_ARGS=tests/
else
@@ -160,10 +101,22 @@ case "$1" in
shift
scheduler
;;
dev_scheduler)
shift
dev_scheduler
;;
dev_worker)
shift
dev_worker
;;
celery_healthcheck)
shift
echo "DEPRECATED: Celery has been replaced with RQ and now performs healthchecks autonomously as part of the 'worker' entrypoint."
;;
dev_server)
export FLASK_DEBUG=1
exec /app/manage.py runserver --debugger --reload -h 0.0.0.0
;;
debug)
export FLASK_DEBUG=1
export REMOTE_DEBUG=1

View File

@@ -1,5 +1,6 @@
import React from "react";
import { clientConfig } from "@/services/auth";
import Link from "@/components/Link";
import { clientConfig, currentUser } from "@/services/auth";
import frontendVersion from "@/version.json";
export default function VersionInfo() {
@@ -9,6 +10,15 @@ export default function VersionInfo() {
Version: {clientConfig.version}
{frontendVersion !== clientConfig.version && ` (${frontendVersion.substring(0, 8)})`}
</div>
{clientConfig.newVersionAvailable && currentUser.hasPermission("super_admin") && (
<div className="m-t-10">
{/* eslint-disable react/jsx-no-target-blank */}
<Link href="https://version.redash.io/" className="update-available" target="_blank" rel="noopener">
Update Available <i className="fa fa-external-link m-l-5" aria-hidden="true" />
<span className="sr-only">(opens in a new tab)</span>
</Link>
</div>
)}
</React.Fragment>
);
}

View File

@@ -0,0 +1,79 @@
import React, { useState } from "react";
import Card from "antd/lib/card";
import Button from "antd/lib/button";
import Typography from "antd/lib/typography";
import { clientConfig } from "@/services/auth";
import Link from "@/components/Link";
import HelpTrigger from "@/components/HelpTrigger";
import DynamicComponent from "@/components/DynamicComponent";
import OrgSettings from "@/services/organizationSettings";
const Text = Typography.Text;
function BeaconConsent() {
const [hide, setHide] = useState(false);
if (!clientConfig.showBeaconConsentMessage || hide) {
return null;
}
const hideConsentCard = () => {
clientConfig.showBeaconConsentMessage = false;
setHide(true);
};
const confirmConsent = (confirm) => {
let message = "🙏 Thank you.";
if (!confirm) {
message = "Settings Saved.";
}
OrgSettings.save({ beacon_consent: confirm }, message)
// .then(() => {
// // const settings = get(response, 'settings');
// // this.setState({ settings, formValues: { ...settings } });
// })
.finally(hideConsentCard);
};
return (
<DynamicComponent name="BeaconConsent">
<div className="m-t-10 tiled">
<Card
title={
<>
Would you be ok with sharing anonymous usage data with the Redash team?{" "}
<HelpTrigger type="USAGE_DATA_SHARING" />
</>
}
bordered={false}
>
<Text>Help Redash improve by automatically sending anonymous usage data:</Text>
<div className="m-t-5">
<ul>
<li> Number of users, queries, dashboards, alerts, widgets and visualizations.</li>
<li> Types of data sources, alert destinations and visualizations.</li>
</ul>
</div>
<Text>All data is aggregated and will never include any sensitive or private data.</Text>
<div className="m-t-5">
<Button type="primary" className="m-r-5" onClick={() => confirmConsent(true)}>
Yes
</Button>
<Button type="default" onClick={() => confirmConsent(false)}>
No
</Button>
</div>
<div className="m-t-15">
<Text type="secondary">
You can change this setting anytime from the <Link href="settings/general">Settings</Link> page.
</Text>
</div>
</Card>
</div>
</DynamicComponent>
);
}
export default BeaconConsent;

View File

@@ -12,6 +12,7 @@ import { wrap as wrapDialog, DialogPropType } from "@/components/DialogWrapper";
import QuerySelector from "@/components/QuerySelector";
import { Query } from "@/services/query";
import { useUniqueId } from "@/lib/hooks/useUniqueId";
import "./EditParameterSettingsDialog.less";
const { Option } = Select;
const formItemProps = { labelCol: { span: 6 }, wrapperCol: { span: 16 } };
@@ -26,7 +27,7 @@ function isTypeDateRange(type) {
function joinExampleList(multiValuesOptions) {
const { prefix, suffix } = multiValuesOptions;
return ["value1", "value2", "value3"].map(value => `${prefix}${value}${suffix}`).join(",");
return ["value1", "value2", "value3"].map((value) => `${prefix}${value}${suffix}`).join(",");
}
function NameInput({ name, type, onChange, existingNames, setValidation }) {
@@ -54,7 +55,7 @@ function NameInput({ name, type, onChange, existingNames, setValidation }) {
return (
<Form.Item required label="Keyword" help={helpText} validateStatus={validateStatus} {...formItemProps}>
<Input onChange={e => onChange(e.target.value)} autoFocus />
<Input onChange={(e) => onChange(e.target.value)} autoFocus />
</Form.Item>
);
}
@@ -71,6 +72,8 @@ function EditParameterSettingsDialog(props) {
const [param, setParam] = useState(clone(props.parameter));
const [isNameValid, setIsNameValid] = useState(true);
const [initialQuery, setInitialQuery] = useState();
const [userInput, setUserInput] = useState(param.regex || "");
const [isValidRegex, setIsValidRegex] = useState(true);
const isNew = !props.parameter.name;
@@ -114,6 +117,17 @@ function EditParameterSettingsDialog(props) {
const paramFormId = useUniqueId("paramForm");
const handleRegexChange = (e) => {
setUserInput(e.target.value);
try {
new RegExp(e.target.value);
setParam({ ...param, regex: e.target.value });
setIsValidRegex(true);
} catch (error) {
setIsValidRegex(false);
}
};
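// Illustrative behaviour of the handler above (example patterns only): a pattern such as "\d+"
// compiles, is written to param.regex and keeps isValidRegex true, while an unterminated group
// like "(abc" throws in the RegExp constructor, so the text stays in userInput only and
// isValidRegex flips to false.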
return (
<Modal
{...props.dialog.props}
@@ -129,15 +143,17 @@ function EditParameterSettingsDialog(props) {
disabled={!isFulfilled()}
type="primary"
form={paramFormId}
data-test="SaveParameterSettings">
data-test="SaveParameterSettings"
>
{isNew ? "Add Parameter" : "OK"}
</Button>,
]}>
]}
>
<Form layout="horizontal" onFinish={onConfirm} id={paramFormId}>
{isNew && (
<NameInput
name={param.name}
onChange={name => setParam({ ...param, name })}
onChange={(name) => setParam({ ...param, name })}
setValidation={setIsNameValid}
existingNames={props.existingParams}
type={param.type}
@@ -146,15 +162,16 @@ function EditParameterSettingsDialog(props) {
<Form.Item required label="Title" {...formItemProps}>
<Input
value={isNull(param.title) ? getDefaultTitle(param.name) : param.title}
onChange={e => setParam({ ...param, title: e.target.value })}
onChange={(e) => setParam({ ...param, title: e.target.value })}
data-test="ParameterTitleInput"
/>
</Form.Item>
<Form.Item label="Type" {...formItemProps}>
<Select value={param.type} onChange={type => setParam({ ...param, type })} data-test="ParameterTypeSelect">
<Select value={param.type} onChange={(type) => setParam({ ...param, type })} data-test="ParameterTypeSelect">
<Option value="text" data-test="TextParameterTypeOption">
Text
</Option>
<Option value="text-pattern">Text Pattern</Option>
<Option value="number" data-test="NumberParameterTypeOption">
Number
</Option>
@@ -180,12 +197,26 @@ function EditParameterSettingsDialog(props) {
<Option value="datetime-range-with-seconds">Date and Time Range (with seconds)</Option>
</Select>
</Form.Item>
{param.type === "text-pattern" && (
<Form.Item
label="Regex"
help={!isValidRegex ? "Invalid Regex Pattern" : "Valid Regex Pattern"}
{...formItemProps}
>
<Input
value={userInput}
onChange={handleRegexChange}
className={!isValidRegex ? "input-error" : ""}
data-test="RegexPatternInput"
/>
</Form.Item>
)}
{param.type === "enum" && (
<Form.Item label="Values" help="Dropdown list values (newline delimited)" {...formItemProps}>
<Input.TextArea
rows={3}
value={param.enumOptions}
onChange={e => setParam({ ...param, enumOptions: e.target.value })}
onChange={(e) => setParam({ ...param, enumOptions: e.target.value })}
/>
</Form.Item>
)}
@@ -193,7 +224,7 @@ function EditParameterSettingsDialog(props) {
<Form.Item label="Query" help="Select query to load dropdown values from" {...formItemProps}>
<QuerySelector
selectedQuery={initialQuery}
onChange={q => setParam({ ...param, queryId: q && q.id })}
onChange={(q) => setParam({ ...param, queryId: q && q.id })}
type="select"
/>
</Form.Item>
@@ -202,7 +233,7 @@ function EditParameterSettingsDialog(props) {
<Form.Item className="m-b-0" label=" " colon={false} {...formItemProps}>
<Checkbox
defaultChecked={!!param.multiValuesOptions}
onChange={e =>
onChange={(e) =>
setParam({
...param,
multiValuesOptions: e.target.checked
@@ -214,7 +245,8 @@ function EditParameterSettingsDialog(props) {
: null,
})
}
data-test="AllowMultipleValuesCheckbox">
data-test="AllowMultipleValuesCheckbox"
>
Allow multiple values
</Checkbox>
</Form.Item>
@@ -227,10 +259,11 @@ function EditParameterSettingsDialog(props) {
Placed in query as: <code>{joinExampleList(param.multiValuesOptions)}</code>
</React.Fragment>
}
{...formItemProps}>
{...formItemProps}
>
<Select
value={param.multiValuesOptions.prefix}
onChange={quoteOption =>
onChange={(quoteOption) =>
setParam({
...param,
multiValuesOptions: {
@@ -240,7 +273,8 @@ function EditParameterSettingsDialog(props) {
},
})
}
data-test="QuotationSelect">
data-test="QuotationSelect"
>
<Option value="">None (default)</Option>
<Option value="'">Single Quotation Mark</Option>
<Option value={'"'} data-test="DoubleQuotationMarkOption">

View File

@@ -0,0 +1,3 @@
.input-error {
border-color: red !important;
}

View File

@@ -23,6 +23,7 @@ export const TYPES = mapValues(
VALUE_SOURCE_OPTIONS: ["/user-guide/querying/query-parameters#Value-Source-Options", "Guide: Value Source Options"],
SHARE_DASHBOARD: ["/user-guide/dashboards/sharing-dashboards", "Guide: Sharing and Embedding Dashboards"],
AUTHENTICATION_OPTIONS: ["/user-guide/users/authentication-options", "Guide: Authentication Options"],
USAGE_DATA_SHARING: ["/open-source/admin-guide/usage-data", "Help: Anonymous Usage Data Sharing"],
DS_ATHENA: ["/data-sources/amazon-athena-setup", "Guide: Help Setting up Amazon Athena"],
DS_BIGQUERY: ["/data-sources/bigquery-setup", "Guide: Help Setting up BigQuery"],
DS_URL: ["/data-sources/querying-urls", "Guide: Help Setting up URL"],
@@ -100,7 +101,7 @@ export function helpTriggerWithTypes(types, allowedDomains = [], drawerClassName
clearTimeout(this.iframeLoadingTimeout);
}
loadIframe = url => {
loadIframe = (url) => {
clearTimeout(this.iframeLoadingTimeout);
this.setState({ loading: true, error: false });
@@ -115,8 +116,8 @@ export function helpTriggerWithTypes(types, allowedDomains = [], drawerClassName
clearTimeout(this.iframeLoadingTimeout);
};
onPostMessageReceived = event => {
if (!some(allowedDomains, domain => startsWith(event.origin, domain))) {
onPostMessageReceived = (event) => {
if (!some(allowedDomains, (domain) => startsWith(event.origin, domain))) {
return;
}
@@ -133,7 +134,7 @@ export function helpTriggerWithTypes(types, allowedDomains = [], drawerClassName
return helpTriggerType ? helpTriggerType[0] : this.props.href;
};
openDrawer = e => {
openDrawer = (e) => {
// keep "open in new tab" behavior
if (!e.shiftKey && !e.ctrlKey && !e.metaKey) {
e.preventDefault();
@@ -143,7 +144,7 @@ export function helpTriggerWithTypes(types, allowedDomains = [], drawerClassName
}
};
closeDrawer = event => {
closeDrawer = (event) => {
if (event) {
event.preventDefault();
}
@@ -160,7 +161,7 @@ export function helpTriggerWithTypes(types, allowedDomains = [], drawerClassName
const tooltip = get(types, `${this.props.type}[1]`, this.props.title);
const className = cx("help-trigger", this.props.className);
const url = this.state.currentUrl;
const isAllowedDomain = some(allowedDomains, domain => startsWith(url || targetUrl, domain));
const isAllowedDomain = some(allowedDomains, (domain) => startsWith(url || targetUrl, domain));
const shouldRenderAsLink = this.props.renderAsLink || !isAllowedDomain;
return (
@@ -179,13 +180,15 @@ export function helpTriggerWithTypes(types, allowedDomains = [], drawerClassName
)}
</>
) : null
}>
}
>
<Link
href={url || this.getUrl()}
className={className}
rel="noopener noreferrer"
target="_blank"
onClick={shouldRenderAsLink ? () => {} : this.openDrawer}>
onClick={shouldRenderAsLink ? () => {} : this.openDrawer}
>
{this.props.children}
</Link>
</Tooltip>
@@ -196,7 +199,8 @@ export function helpTriggerWithTypes(types, allowedDomains = [], drawerClassName
visible={this.state.visible}
className={cx("help-drawer", drawerClassName)}
destroyOnClose
width={400}>
width={400}
>
<div className="drawer-wrapper">
<div className="drawer-menu">
{url && (

View File

@@ -33,10 +33,10 @@ export const MappingType = {
};
export function parameterMappingsToEditableMappings(mappings, parameters, existingParameterNames = []) {
return map(mappings, mapping => {
return map(mappings, (mapping) => {
const result = extend({}, mapping);
const alreadyExists = includes(existingParameterNames, mapping.mapTo);
result.param = find(parameters, p => p.name === mapping.name);
result.param = find(parameters, (p) => p.name === mapping.name);
switch (mapping.type) {
case ParameterMappingType.DashboardLevel:
result.type = alreadyExists ? MappingType.DashboardMapToExisting : MappingType.DashboardAddNew;
@@ -62,7 +62,7 @@ export function editableMappingsToParameterMappings(mappings) {
map(
// convert to map
mappings,
mapping => {
(mapping) => {
const result = extend({}, mapping);
switch (mapping.type) {
case MappingType.DashboardAddNew:
@@ -95,11 +95,11 @@ export function editableMappingsToParameterMappings(mappings) {
export function synchronizeWidgetTitles(sourceMappings, widgets) {
const affectedWidgets = [];
each(sourceMappings, sourceMapping => {
each(sourceMappings, (sourceMapping) => {
if (sourceMapping.type === ParameterMappingType.DashboardLevel) {
each(widgets, widget => {
each(widgets, (widget) => {
const widgetMappings = widget.options.parameterMappings;
each(widgetMappings, widgetMapping => {
each(widgetMappings, (widgetMapping) => {
// check if mapped to the same dashboard-level parameter
if (
widgetMapping.type === ParameterMappingType.DashboardLevel &&
@@ -140,7 +140,7 @@ export class ParameterMappingInput extends React.Component {
className: "form-item",
};
updateSourceType = type => {
updateSourceType = (type) => {
let {
mapping: { mapTo },
} = this.props;
@@ -155,7 +155,7 @@ export class ParameterMappingInput extends React.Component {
this.updateParamMapping({ type, mapTo });
};
updateParamMapping = update => {
updateParamMapping = (update) => {
const { onChange, mapping } = this.props;
const newMapping = extend({}, mapping, update);
if (newMapping.value !== mapping.value) {
@@ -175,7 +175,7 @@ export class ParameterMappingInput extends React.Component {
renderMappingTypeSelector() {
const noExisting = isEmpty(this.props.existingParamNames);
return (
<Radio.Group value={this.props.mapping.type} onChange={e => this.updateSourceType(e.target.value)}>
<Radio.Group value={this.props.mapping.type} onChange={(e) => this.updateSourceType(e.target.value)}>
<Radio className="radio" value={MappingType.DashboardAddNew} data-test="NewDashboardParameterOption">
New dashboard parameter
</Radio>
@@ -205,16 +205,16 @@ export class ParameterMappingInput extends React.Component {
<Input
value={mapTo}
aria-label="Parameter name (key)"
onChange={e => this.updateParamMapping({ mapTo: e.target.value })}
onChange={(e) => this.updateParamMapping({ mapTo: e.target.value })}
/>
);
}
renderDashboardMapToExisting() {
const { mapping, existingParamNames } = this.props;
const options = map(existingParamNames, paramName => ({ label: paramName, value: paramName }));
const options = map(existingParamNames, (paramName) => ({ label: paramName, value: paramName }));
return <Select value={mapping.mapTo} onChange={mapTo => this.updateParamMapping({ mapTo })} options={options} />;
return <Select value={mapping.mapTo} onChange={(mapTo) => this.updateParamMapping({ mapTo })} options={options} />;
}
renderStaticValue() {
@@ -226,7 +226,8 @@ export class ParameterMappingInput extends React.Component {
enumOptions={mapping.param.enumOptions}
queryId={mapping.param.queryId}
parameter={mapping.param}
onSelect={value => this.updateParamMapping({ value })}
onSelect={(value) => this.updateParamMapping({ value })}
regex={mapping.param.regex}
/>
);
}
@@ -284,12 +285,12 @@ class MappingEditor extends React.Component {
};
}
onVisibleChange = visible => {
onVisibleChange = (visible) => {
if (visible) this.show();
else this.hide();
};
onChange = mapping => {
onChange = (mapping) => {
let inputError = null;
if (mapping.type === MappingType.DashboardAddNew) {
@@ -351,7 +352,8 @@ class MappingEditor extends React.Component {
trigger="click"
content={this.renderContent()}
visible={visible}
onVisibleChange={this.onVisibleChange}>
onVisibleChange={this.onVisibleChange}
>
<Button size="small" type="dashed" data-test={`EditParamMappingButton-${mapping.param.name}`}>
<EditOutlinedIcon />
</Button>
@@ -376,14 +378,14 @@ class TitleEditor extends React.Component {
title: "", // will be set on editing
};
onPopupVisibleChange = showPopup => {
onPopupVisibleChange = (showPopup) => {
this.setState({
showPopup,
title: showPopup ? this.getMappingTitle() : "",
});
};
onEditingTitleChange = event => {
onEditingTitleChange = (event) => {
this.setState({ title: event.target.value });
};
@@ -460,7 +462,8 @@ class TitleEditor extends React.Component {
trigger="click"
content={this.renderPopover()}
visible={this.state.showPopup}
onVisibleChange={this.onPopupVisibleChange}>
onVisibleChange={this.onPopupVisibleChange}
>
<Button size="small" type="dashed">
<EditOutlinedIcon />
</Button>
@@ -508,7 +511,7 @@ export class ParameterMappingListInput extends React.Component {
// just to be safe, array or object
if (typeof value === "object") {
return map(value, v => this.getStringValue(v)).join(", ");
return map(value, (v) => this.getStringValue(v)).join(", ");
}
// rest
@@ -574,7 +577,7 @@ export class ParameterMappingListInput extends React.Component {
render() {
const { existingParams } = this.props; // eslint-disable-line react/prop-types
const dataSource = this.props.mappings.map(mapping => ({ mapping }));
const dataSource = this.props.mappings.map((mapping) => ({ mapping }));
return (
<div className="parameters-mapping-list">
@@ -583,11 +586,11 @@ export class ParameterMappingListInput extends React.Component {
title="Title"
dataIndex="mapping"
key="title"
render={mapping => (
render={(mapping) => (
<TitleEditor
existingParams={existingParams}
mapping={mapping}
onChange={newMapping => this.updateParamMapping(mapping, newMapping)}
onChange={(newMapping) => this.updateParamMapping(mapping, newMapping)}
/>
)}
/>
@@ -596,19 +599,19 @@ export class ParameterMappingListInput extends React.Component {
dataIndex="mapping"
key="keyword"
className="keyword"
render={mapping => <code>{`{{ ${mapping.name} }}`}</code>}
render={(mapping) => <code>{`{{ ${mapping.name} }}`}</code>}
/>
<Table.Column
title="Default Value"
dataIndex="mapping"
key="value"
render={mapping => this.constructor.getDefaultValue(mapping, this.props.existingParams)}
render={(mapping) => this.constructor.getDefaultValue(mapping, this.props.existingParams)}
/>
<Table.Column
title="Value Source"
dataIndex="mapping"
key="source"
render={mapping => {
render={(mapping) => {
const existingParamsNames = existingParams
.filter(({ type }) => type === mapping.param.type) // exclude mismatching param types
.map(({ name }) => name); // keep names only

View File

@@ -9,11 +9,12 @@ import DateRangeParameter from "@/components/dynamic-parameters/DateRangeParamet
import QueryBasedParameterInput from "./QueryBasedParameterInput";
import "./ParameterValueInput.less";
import Tooltip from "./Tooltip";
const multipleValuesProps = {
maxTagCount: 3,
maxTagTextLength: 10,
maxTagPlaceholder: num => `+${num.length} more`,
maxTagPlaceholder: (num) => `+${num.length} more`,
};
class ParameterValueInput extends React.Component {
@@ -25,6 +26,7 @@ class ParameterValueInput extends React.Component {
parameter: PropTypes.any, // eslint-disable-line react/forbid-prop-types
onSelect: PropTypes.func,
className: PropTypes.string,
regex: PropTypes.string,
};
static defaultProps = {
@@ -35,6 +37,7 @@ class ParameterValueInput extends React.Component {
parameter: null,
onSelect: () => {},
className: "",
regex: "",
};
constructor(props) {
@@ -45,7 +48,7 @@ class ParameterValueInput extends React.Component {
};
}
componentDidUpdate = prevProps => {
componentDidUpdate = (prevProps) => {
const { value, parameter } = this.props;
// if value prop updated, reset dirty state
if (prevProps.value !== value || prevProps.parameter !== parameter) {
@@ -56,7 +59,7 @@ class ParameterValueInput extends React.Component {
}
};
onSelect = value => {
onSelect = (value) => {
const isDirty = !isEqual(value, this.props.value);
this.setState({ value, isDirty });
this.props.onSelect(value, isDirty);
@@ -93,9 +96,9 @@ class ParameterValueInput extends React.Component {
renderEnumInput() {
const { enumOptions, parameter } = this.props;
const { value } = this.state;
const enumOptionsArray = enumOptions.split("\n").filter(v => v !== "");
const enumOptionsArray = enumOptions.split("\n").filter((v) => v !== "");
// Antd Select doesn't handle null in multiple mode
const normalize = val => (parameter.multiValuesOptions && val === null ? [] : val);
const normalize = (val) => (parameter.multiValuesOptions && val === null ? [] : val);
return (
<SelectWithVirtualScroll
@@ -103,7 +106,7 @@ class ParameterValueInput extends React.Component {
mode={parameter.multiValuesOptions ? "multiple" : "default"}
value={normalize(value)}
onChange={this.onSelect}
options={map(enumOptionsArray, opt => ({ label: String(opt), value: opt }))}
options={map(enumOptionsArray, (opt) => ({ label: String(opt), value: opt }))}
showSearch
showArrow
notFoundContent={isEmpty(enumOptionsArray) ? "No options available" : null}
@@ -133,18 +136,36 @@ class ParameterValueInput extends React.Component {
const { className } = this.props;
const { value } = this.state;
const normalize = val => (isNaN(val) ? undefined : val);
const normalize = (val) => (isNaN(val) ? undefined : val);
return (
<InputNumber
className={className}
value={normalize(value)}
aria-label="Parameter number value"
onChange={val => this.onSelect(normalize(val))}
onChange={(val) => this.onSelect(normalize(val))}
/>
);
}
renderTextPatternInput() {
const { className } = this.props;
const { value } = this.state;
return (
<React.Fragment>
<Tooltip title={`Regex to match: ${this.props.regex}`} placement="right">
<Input
className={className}
value={value}
aria-label="Parameter text pattern value"
onChange={(e) => this.onSelect(e.target.value)}
/>
</Tooltip>
</React.Fragment>
);
}
renderTextInput() {
const { className } = this.props;
const { value } = this.state;
@@ -155,7 +176,7 @@ class ParameterValueInput extends React.Component {
value={value}
aria-label="Parameter text value"
data-test="TextParamInput"
onChange={e => this.onSelect(e.target.value)}
onChange={(e) => this.onSelect(e.target.value)}
/>
);
}
@@ -177,6 +198,8 @@ class ParameterValueInput extends React.Component {
return this.renderQueryBasedInput();
case "number":
return this.renderNumberInput();
case "text-pattern":
return this.renderTextPatternInput();
default:
return this.renderTextInput();
}

View File

@@ -14,7 +14,7 @@ import "./Parameters.less";
function updateUrl(parameters) {
const params = extend({}, location.search);
parameters.forEach(param => {
parameters.forEach((param) => {
extend(params, param.toUrlParams());
});
location.setSearch(params, true);
@@ -43,7 +43,7 @@ export default class Parameters extends React.Component {
appendSortableToParent: true,
};
toCamelCase = str => {
toCamelCase = (str) => {
if (isEmpty(str)) {
return "";
}
@@ -59,10 +59,10 @@ export default class Parameters extends React.Component {
}
const hideRegex = /hide_filter=([^&]+)/g;
const matches = window.location.search.matchAll(hideRegex);
this.hideValues = Array.from(matches, match => match[1]);
this.hideValues = Array.from(matches, (match) => match[1]);
}
componentDidUpdate = prevProps => {
componentDidUpdate = (prevProps) => {
const { parameters, disableUrlUpdate } = this.props;
const parametersChanged = prevProps.parameters !== parameters;
const disableUrlUpdateChanged = prevProps.disableUrlUpdate !== disableUrlUpdate;
@@ -74,7 +74,7 @@ export default class Parameters extends React.Component {
}
};
handleKeyDown = e => {
handleKeyDown = (e) => {
// Cmd/Ctrl/Alt + Enter
if (e.keyCode === 13 && (e.ctrlKey || e.metaKey || e.altKey)) {
e.stopPropagation();
@@ -109,8 +109,8 @@ export default class Parameters extends React.Component {
applyChanges = () => {
const { onValuesChange, disableUrlUpdate } = this.props;
this.setState(({ parameters }) => {
const parametersWithPendingValues = parameters.filter(p => p.hasPendingValue);
forEach(parameters, p => p.applyPendingValue());
const parametersWithPendingValues = parameters.filter((p) => p.hasPendingValue);
forEach(parameters, (p) => p.applyPendingValue());
if (!disableUrlUpdate) {
updateUrl(parameters);
}
@@ -121,7 +121,7 @@ export default class Parameters extends React.Component {
showParameterSettings = (parameter, index) => {
const { onParametersEdit } = this.props;
EditParameterSettingsDialog.showModal({ parameter }).onClose(updated => {
EditParameterSettingsDialog.showModal({ parameter }).onClose((updated) => {
this.setState(({ parameters }) => {
const updatedParameter = extend(parameter, updated);
parameters[index] = createParameter(updatedParameter, updatedParameter.parentQueryId);
@@ -132,7 +132,7 @@ export default class Parameters extends React.Component {
};
renderParameter(param, index) {
if (this.hideValues.some(value => this.toCamelCase(value) === this.toCamelCase(param.name))) {
if (this.hideValues.some((value) => this.toCamelCase(value) === this.toCamelCase(param.name))) {
return null;
}
const { editable } = this.props;
@@ -149,7 +149,8 @@ export default class Parameters extends React.Component {
aria-label="Edit"
onClick={() => this.showParameterSettings(param, index)}
data-test={`ParameterSettings-${param.name}`}
type="button">
type="button"
>
<i className="fa fa-cog" aria-hidden="true" />
</PlainButton>
)}
@@ -162,6 +163,7 @@ export default class Parameters extends React.Component {
enumOptions={param.enumOptions}
queryId={param.queryId}
onSelect={(value, isDirty) => this.setPendingValue(param, value, isDirty)}
regex={param.regex}
/>
</div>
);
@@ -178,20 +180,22 @@ export default class Parameters extends React.Component {
useDragHandle
lockToContainerEdges
helperClass="parameter-dragged"
helperContainer={containerEl => (appendSortableToParent ? containerEl : document.body)}
helperContainer={(containerEl) => (appendSortableToParent ? containerEl : document.body)}
updateBeforeSortStart={this.onBeforeSortStart}
onSortEnd={this.moveParameter}
containerProps={{
className: "parameter-container",
onKeyDown: dirtyParamCount ? this.handleKeyDown : null,
}}>
}}
>
{parameters &&
parameters.map((param, index) => (
<SortableElement key={param.name} index={index}>
<div
className="parameter-block"
data-editable={sortable || null}
data-test={`ParameterBlock-${param.name}`}>
data-test={`ParameterBlock-${param.name}`}
>
{sortable && <DragHandle data-test={`DragHandle-${param.name}`} />}
{this.renderParameter(param, index)}
</div>

View File

@@ -69,7 +69,7 @@ UserPreviewCard.defaultProps = {
// DataSourcePreviewCard
export function DataSourcePreviewCard({ dataSource, withLink, children, ...props }) {
const imageUrl = `static/images/db-logos/${dataSource.type}.png`;
const imageUrl = `/static/images/db-logos/${dataSource.type}.png`;
const title = withLink ? <Link href={"data_sources/" + dataSource.id}>{dataSource.name}</Link> : dataSource.name;
return (
<PreviewCard {...props} imageUrl={imageUrl} title={title}>

View File

@@ -19,7 +19,6 @@ import PlainButton from "@/components/PlainButton";
import ExpandedWidgetDialog from "@/components/dashboards/ExpandedWidgetDialog";
import EditParameterMappingsDialog from "@/components/dashboards/EditParameterMappingsDialog";
import VisualizationRenderer from "@/components/visualizations/VisualizationRenderer";
import { ExecutionStatus } from "@/services/query-result";
import Widget from "./Widget";
@@ -279,7 +278,7 @@ class VisualizationWidget extends React.Component {
const widgetQueryResult = widget.getQueryResult();
const widgetStatus = widgetQueryResult && widgetQueryResult.getStatus();
switch (widgetStatus) {
case ExecutionStatus.FAILED:
case "failed":
return (
<div className="body-row-auto scrollbox">
{widgetQueryResult.getError() && (
@@ -289,7 +288,7 @@ class VisualizationWidget extends React.Component {
)}
</div>
);
case ExecutionStatus.FINISHED:
case "done":
return (
<div className="body-row-auto scrollbox">
<VisualizationRenderer

View File

@@ -96,7 +96,7 @@ function EmptyState({
}, []);
// Show if `onboardingMode=false` or any requested step not completed
const shouldShow = !onboardingMode || some(keys(isAvailable), step => isAvailable[step] && !isCompleted[step]);
const shouldShow = !onboardingMode || some(keys(isAvailable), (step) => isAvailable[step] && !isCompleted[step]);
if (!shouldShow) {
return null;
@@ -181,7 +181,7 @@ function EmptyState({
];
const stepsItems = getStepsItems ? getStepsItems(defaultStepsItems) : defaultStepsItems;
const imageSource = illustrationPath ? illustrationPath : "static/images/illustrations/" + illustration + ".svg";
const imageSource = illustrationPath ? illustrationPath : "/static/images/illustrations/" + illustration + ".svg";
return (
<div className="empty-state-wrapper">
@@ -196,7 +196,7 @@ function EmptyState({
</div>
<div className="empty-state__steps">
<h4>Let&apos;s get started</h4>
<ol>{stepsItems.map(item => item.node)}</ol>
<ol>{stepsItems.map((item) => item.node)}</ol>
{helpMessage}
</div>
</div>

View File

@@ -65,6 +65,7 @@ export const Query = PropTypes.shape({
export const AlertOptions = PropTypes.shape({
column: PropTypes.string,
selector: PropTypes.oneOf(["first", "min", "max"]),
op: PropTypes.oneOf([">", ">=", "<", "<=", "==", "!="]),
value: PropTypes.oneOfType([PropTypes.string, PropTypes.number]),
custom_subject: PropTypes.string,
@@ -83,6 +84,7 @@ export const Alert = PropTypes.shape({
query: Query,
options: PropTypes.shape({
column: PropTypes.string,
selector: PropTypes.string,
op: PropTypes.string,
value: PropTypes.oneOfType([PropTypes.string, PropTypes.number]),
}).isRequired,

View File

@@ -16,7 +16,6 @@ import LoadingState from "../items-list/components/LoadingState";
const SchemaItemColumnType = PropTypes.shape({
name: PropTypes.string.isRequired,
type: PropTypes.string,
comment: PropTypes.string,
});
export const SchemaItemType = PropTypes.shape({
@@ -48,13 +47,6 @@ function SchemaItem({ item, expanded, onToggle, onSelect, ...props }) {
return (
<div {...props}>
<div className="schema-list-item">
{item.description ? (
<Tooltip
title={item.description}
mouseEnterDelay={0}
mouseLeaveDelay={0}
placement="right"
arrowPointAtCenter>
<PlainButton className="table-name" onClick={onToggle}>
<i className="fa fa-table m-r-5" aria-hidden="true" />
<strong>
@@ -62,16 +54,6 @@ function SchemaItem({ item, expanded, onToggle, onSelect, ...props }) {
{!isNil(item.size) && <span> ({item.size})</span>}
</strong>
</PlainButton>
</Tooltip>
) : (
<PlainButton className="table-name" onClick={onToggle}>
<i className="fa fa-table m-r-5" aria-hidden="true" />
<strong>
<span title={item.name}>{tableDisplayName}</span>
{!isNil(item.size) && <span> ({item.size})</span>}
</strong>
</PlainButton>
)}
<Tooltip
title="Insert table name into query text"
mouseEnterDelay={0}
@@ -91,14 +73,13 @@ function SchemaItem({ item, expanded, onToggle, onSelect, ...props }) {
map(item.columns, column => {
const columnName = get(column, "name");
const columnType = get(column, "type");
const columnComment = get(column, "comment");
if (columnComment) {
return (
<Tooltip title={columnComment} mouseEnterDelay={0} mouseLeaveDelay={0} placement="rightTop">
<PlainButton
key={columnName}
className="table-open-item"
onClick={e => handleSelect(e, columnName)}>
<Tooltip
title="Insert column name into query text"
mouseEnterDelay={0}
mouseLeaveDelay={0}
placement="rightTop">
<PlainButton key={columnName} className="table-open-item" onClick={e => handleSelect(e, columnName)}>
<div>
{columnName} {columnType && <span className="column-type">{columnType}</span>}
</div>
@@ -109,17 +90,6 @@ function SchemaItem({ item, expanded, onToggle, onSelect, ...props }) {
</PlainButton>
</Tooltip>
);
}
return (
<PlainButton key={columnName} className="table-open-item" onClick={e => handleSelect(e, columnName)}>
<div>
{columnName} {columnType && <span className="column-type">{columnType}</span>}
</div>
<div className="copy-to-editor">
<i className="fa fa-angle-double-right" aria-hidden="true" />
</div>
</PlainButton>
);
})
)}
</div>

View File

@@ -16,6 +16,7 @@ import MenuButton from "./components/MenuButton";
import AlertView from "./AlertView";
import AlertEdit from "./AlertEdit";
import AlertNew from "./AlertNew";
import notifications from "@/services/notifications";
const MODES = {
NEW: 0,
@@ -64,6 +65,7 @@ class Alert extends React.Component {
this.setState({
alert: {
options: {
selector: "first",
op: ">",
value: 1,
muted: false,
@@ -75,7 +77,7 @@ class Alert extends React.Component {
} else {
const { alertId } = this.props;
AlertService.get({ id: alertId })
.then(alert => {
.then((alert) => {
if (this._isMounted) {
const canEdit = currentUser.canEdit(alert);
@@ -93,7 +95,7 @@ class Alert extends React.Component {
this.onQuerySelected(alert.query);
}
})
.catch(error => {
.catch((error) => {
if (this._isMounted) {
this.props.onError(error);
}
@@ -112,7 +114,7 @@ class Alert extends React.Component {
alert.rearm = pendingRearm || null;
return AlertService.save(alert)
.then(alert => {
.then((alert) => {
notification.success("Saved.");
navigateTo(`alerts/${alert.id}`, true);
this.setState({ alert, mode: MODES.VIEW });
@@ -122,7 +124,7 @@ class Alert extends React.Component {
});
};
onQuerySelected = query => {
onQuerySelected = (query) => {
this.setState(({ alert }) => ({
alert: Object.assign(alert, { query }),
queryResult: null,
@@ -130,7 +132,7 @@ class Alert extends React.Component {
if (query) {
// get cached result for column names and values
new QueryService(query).getQueryResultPromise().then(queryResult => {
new QueryService(query).getQueryResultPromise().then((queryResult) => {
if (this._isMounted) {
this.setState({ queryResult });
let { column } = this.state.alert.options;
@@ -146,18 +148,18 @@ class Alert extends React.Component {
}
};
onNameChange = name => {
onNameChange = (name) => {
const { alert } = this.state;
this.setState({
alert: Object.assign(alert, { name }),
});
};
onRearmChange = pendingRearm => {
onRearmChange = (pendingRearm) => {
this.setState({ pendingRearm });
};
setAlertOptions = obj => {
setAlertOptions = (obj) => {
const { alert } = this.state;
const options = { ...alert.options, ...obj };
this.setState({
@@ -177,6 +179,17 @@ class Alert extends React.Component {
});
};
evaluate = () => {
const { alert } = this.state;
return AlertService.evaluate(alert)
.then(() => {
notification.success("Alert evaluated. Refresh page for updated status.");
})
.catch(() => {
notifications.error("Failed to evaluate alert.");
});
};
mute = () => {
const { alert } = this.state;
return AlertService.mute(alert)
@@ -223,7 +236,14 @@ class Alert extends React.Component {
const { queryResult, mode, canEdit, pendingRearm } = this.state;
const menuButton = (
<MenuButton doDelete={this.delete} muted={muted} mute={this.mute} unmute={this.unmute} canEdit={canEdit} />
<MenuButton
doDelete={this.delete}
muted={muted}
mute={this.mute}
unmute={this.unmute}
canEdit={canEdit}
evaluate={this.evaluate}
/>
);
const commonProps = {
@@ -258,7 +278,7 @@ routes.register(
routeWithUserSession({
path: "/alerts/new",
title: "New Alert",
render: pageProps => <Alert {...pageProps} mode={MODES.NEW} />,
render: (pageProps) => <Alert {...pageProps} mode={MODES.NEW} />,
})
);
routes.register(
@@ -266,7 +286,7 @@ routes.register(
routeWithUserSession({
path: "/alerts/:alertId",
title: "Alert",
render: pageProps => <Alert {...pageProps} mode={MODES.VIEW} />,
render: (pageProps) => <Alert {...pageProps} mode={MODES.VIEW} />,
})
);
routes.register(
@@ -274,6 +294,6 @@ routes.register(
routeWithUserSession({
path: "/alerts/:alertId/edit",
title: "Alert",
render: pageProps => <Alert {...pageProps} mode={MODES.EDIT} />,
render: (pageProps) => <Alert {...pageProps} mode={MODES.EDIT} />,
})
);

View File

@@ -68,13 +68,23 @@ export default class AlertView extends React.Component {
<>
<Title name={name} alert={alert}>
<DynamicComponent name="AlertView.HeaderExtra" alert={alert} />
<Tooltip title={canEdit ? "" : "You do not have sufficient permissions to edit this alert"}>
{canEdit ? (
<>
<Button type="default" onClick={canEdit ? onEdit : null} className={cx({ disabled: !canEdit })}>
<i className="fa fa-edit m-r-5" aria-hidden="true" />
Edit
</Button>
{menuButton}
</>
) : (
<Tooltip title="You do not have sufficient permissions to edit this alert">
<Button type="default" onClick={canEdit ? onEdit : null} className={cx({ disabled: !canEdit })}>
<i className="fa fa-edit m-r-5" aria-hidden="true" />
Edit
</Button>
{menuButton}
</Tooltip>
)}
</Title>
<div className="bg-white tiled p-20">
<Grid.Row type="flex" gutter={16}>

View File

@@ -54,23 +54,74 @@ export default function Criteria({ columnNames, resultValues, alertOptions, onCh
return null;
})();
const columnHint = (
let columnHint;
if (alertOptions.selector === "first") {
columnHint = (
<small className="alert-criteria-hint">
Top row value is <code className="p-0">{toString(columnValue) || "unknown"}</code>
</small>
);
} else if (alertOptions.selector === "max") {
columnHint = (
<small className="alert-criteria-hint">
Max column value is{" "}
<code className="p-0">
{toString(
Math.max(...resultValues.map((o) => Number(o[alertOptions.column])).filter((value) => !isNaN(value)))
) || "unknown"}
</code>
</small>
);
} else if (alertOptions.selector === "min") {
columnHint = (
<small className="alert-criteria-hint">
Min column value is{" "}
<code className="p-0">
{toString(
Math.min(...resultValues.map((o) => Number(o[alertOptions.column])).filter((value) => !isNaN(value)))
) || "unknown"}
</code>
</small>
);
}
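// Illustrative (hypothetical rows): with resultValues = [{ count: 3 }, { count: "7" }, { count: "n/a" }]
// and alertOptions.column === "count", the "max" selector hint shows 7 and the "min" hint shows 3;
// rows whose value does not coerce to a number are filtered out before Math.min/Math.max.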
return (
<div data-test="Criteria">
<div className="input-title">
<span className="input-label">Selector</span>
{editMode ? (
<Select
value={alertOptions.selector}
onChange={(selector) => onChange({ selector })}
optionLabelProp="label"
dropdownMatchSelectWidth={false}
style={{ width: 80 }}
>
<Select.Option value="first" label="first">
first
</Select.Option>
<Select.Option value="min" label="min">
min
</Select.Option>
<Select.Option value="max" label="max">
max
</Select.Option>
</Select>
) : (
<DisabledInput minWidth={60}>{alertOptions.selector}</DisabledInput>
)}
</div>
<div className="input-title">
<span className="input-label">Value column</span>
{editMode ? (
<Select
value={alertOptions.column}
onChange={column => onChange({ column })}
onChange={(column) => onChange({ column })}
dropdownMatchSelectWidth={false}
style={{ minWidth: 100 }}>
{columnNames.map(name => (
style={{ minWidth: 100 }}
>
{columnNames.map((name) => (
<Select.Option key={name}>{name}</Select.Option>
))}
</Select>
@@ -83,10 +134,11 @@ export default function Criteria({ columnNames, resultValues, alertOptions, onCh
{editMode ? (
<Select
value={alertOptions.op}
onChange={op => onChange({ op })}
onChange={(op) => onChange({ op })}
optionLabelProp="label"
dropdownMatchSelectWidth={false}
style={{ width: 55 }}>
style={{ width: 55 }}
>
<Select.Option value=">" label={CONDITIONS[">"]}>
{CONDITIONS[">"]} greater than
</Select.Option>
@@ -125,7 +177,7 @@ export default function Criteria({ columnNames, resultValues, alertOptions, onCh
id="threshold-criterion"
style={{ width: 90 }}
value={alertOptions.value}
onChange={e => onChange({ value: e.target.value })}
onChange={(e) => onChange({ value: e.target.value })}
/>
) : (
<DisabledInput minWidth={50}>{alertOptions.value}</DisabledInput>

View File

@@ -11,7 +11,7 @@ import LoadingOutlinedIcon from "@ant-design/icons/LoadingOutlined";
import EllipsisOutlinedIcon from "@ant-design/icons/EllipsisOutlined";
import PlainButton from "@/components/PlainButton";
export default function MenuButton({ doDelete, canEdit, mute, unmute, muted }) {
export default function MenuButton({ doDelete, canEdit, mute, unmute, evaluate, muted }) {
const [loading, setLoading] = useState(false);
const execute = useCallback(action => {
@@ -55,6 +55,9 @@ export default function MenuButton({ doDelete, canEdit, mute, unmute, muted }) {
<Menu.Item>
<PlainButton onClick={confirmDelete}>Delete</PlainButton>
</Menu.Item>
<Menu.Item>
<PlainButton onClick={() => execute(evaluate)}>Evaluate</PlainButton>
</Menu.Item>
</Menu>
}>
<Button aria-label="More actions">
@@ -69,6 +72,7 @@ MenuButton.propTypes = {
canEdit: PropTypes.bool.isRequired,
mute: PropTypes.func.isRequired,
unmute: PropTypes.func.isRequired,
evaluate: PropTypes.func.isRequired,
muted: PropTypes.bool,
};

View File

@@ -6,6 +6,7 @@ import Link from "@/components/Link";
import routeWithUserSession from "@/components/ApplicationArea/routeWithUserSession";
import EmptyState, { EmptyStateHelpMessage } from "@/components/empty-state/EmptyState";
import DynamicComponent from "@/components/DynamicComponent";
import BeaconConsent from "@/components/BeaconConsent";
import PlainButton from "@/components/PlainButton";
import { axios } from "@/services/axios";
@@ -30,7 +31,8 @@ function DeprecatedEmbedFeatureAlert() {
<Link
href="https://discuss.redash.io/t/support-for-parameters-in-embedded-visualizations/3337"
target="_blank"
rel="noopener noreferrer">
rel="noopener noreferrer"
>
Read more
</Link>
.
@@ -42,7 +44,7 @@ function DeprecatedEmbedFeatureAlert() {
function EmailNotVerifiedAlert() {
const verifyEmail = () => {
axios.post("verification_email/").then(data => {
axios.post("verification_email/").then((data) => {
notification.success(data.message);
});
};
@@ -88,6 +90,7 @@ export default function Home() {
</DynamicComponent>
<DynamicComponent name="HomeExtra" />
<DashboardAndQueryFavoritesList />
<BeaconConsent />
</div>
</div>
);
@@ -98,6 +101,6 @@ routes.register(
routeWithUserSession({
path: "/",
title: "Redash",
render: pageProps => <Home {...pageProps} />,
render: (pageProps) => <Home {...pageProps} />,
})
);

View File

@@ -380,9 +380,7 @@ function QuerySource(props) {
<QueryVisualizationTabs
queryResult={queryResult}
visualizations={query.visualizations}
showNewVisualizationButton={
queryFlags.canEdit && queryResultData.status === ExecutionStatus.FINISHED
}
showNewVisualizationButton={queryFlags.canEdit && queryResultData.status === ExecutionStatus.DONE}
canDeleteVisualizations={queryFlags.canEdit}
selectedTab={selectedVisualization}
onChangeTab={setSelectedVisualization}

View File

@@ -165,7 +165,7 @@ function QueryView(props) {
<QueryVisualizationTabs
queryResult={queryResult}
visualizations={query.visualizations}
showNewVisualizationButton={queryFlags.canEdit && queryResultData.status === ExecutionStatus.FINISHED}
showNewVisualizationButton={queryFlags.canEdit && queryResultData.status === ExecutionStatus.DONE}
canDeleteVisualizations={queryFlags.canEdit}
selectedTab={selectedVisualization}
onChangeTab={setSelectedVisualization}

View File

@@ -9,6 +9,7 @@ import QueryControlDropdown from "@/components/EditVisualizationButton/QueryCont
import EditVisualizationButton from "@/components/EditVisualizationButton";
import useQueryResultData from "@/lib/useQueryResultData";
import { durationHumanize, pluralize, prettySize } from "@/lib/utils";
import { isUndefined } from "lodash";
import "./QueryExecutionMetadata.less";
@@ -51,7 +52,8 @@ export default function QueryExecutionMetadata({
"Result truncated to " +
queryResultData.rows.length +
" rows. Databricks may truncate query results that are unstably large."
}>
}
>
<WarningTwoTone twoToneColor="#FF9800" />
</Tooltip>
</span>
@@ -67,10 +69,9 @@ export default function QueryExecutionMetadata({
)}
{isQueryExecuting && <span>Running&hellip;</span>}
</span>
{queryResultData.metadata.data_scanned && (
{!isUndefined(queryResultData.metadata.data_scanned) && !isQueryExecuting && (
<span className="m-l-5">
Data Scanned
<strong>{prettySize(queryResultData.metadata.data_scanned)}</strong>
Data Scanned <strong>{prettySize(queryResultData.metadata.data_scanned)}</strong>
</span>
)}
</span>

View File

@@ -1,45 +1,37 @@
import { includes } from "lodash";
import React from "react";
import PropTypes from "prop-types";
import Alert from "antd/lib/alert";
import Button from "antd/lib/button";
import Timer from "@/components/Timer";
import { ExecutionStatus } from "@/services/query-result";
export default function QueryExecutionStatus({ status, updatedAt, error, isCancelling, onCancel }) {
const alertType = status === ExecutionStatus.FAILED ? "error" : "info";
const showTimer = status !== ExecutionStatus.FAILED && updatedAt;
const isCancelButtonAvailable = [
ExecutionStatus.SCHEDULED,
ExecutionStatus.QUEUED,
ExecutionStatus.STARTED,
ExecutionStatus.DEFERRED,
].includes(status);
const alertType = status === "failed" ? "error" : "info";
const showTimer = status !== "failed" && updatedAt;
const isCancelButtonAvailable = includes(["waiting", "processing"], status);
let message = isCancelling ? <React.Fragment>Cancelling&hellip;</React.Fragment> : null;
switch (status) {
case ExecutionStatus.QUEUED:
case "waiting":
if (!isCancelling) {
message = <React.Fragment>Query in queue&hellip;</React.Fragment>;
}
break;
case ExecutionStatus.STARTED:
case "processing":
if (!isCancelling) {
message = <React.Fragment>Executing query&hellip;</React.Fragment>;
}
break;
case ExecutionStatus.LOADING_RESULT:
case "loading-result":
message = <React.Fragment>Loading results&hellip;</React.Fragment>;
break;
case ExecutionStatus.FAILED:
case "failed":
message = (
<React.Fragment>
Error running query: <strong>{error}</strong>
</React.Fragment>
);
break;
case ExecutionStatus.CANCELED:
message = <React.Fragment>Query was canceled</React.Fragment>;
break;
// no default
}
@@ -74,7 +66,7 @@ QueryExecutionStatus.propTypes = {
};
QueryExecutionStatus.defaultProps = {
status: ExecutionStatus.QUEUED,
status: "waiting",
updatedAt: null,
error: null,
isCancelling: true,

View File

@@ -2,7 +2,7 @@ import PropTypes from "prop-types";
import React from "react";
export function QuerySourceTypeIcon(props) {
return <img src={`static/images/db-logos/${props.type}.png`} width="20" alt={props.alt} />;
return <img src={`/static/images/db-logos/${props.type}.png`} width="20" alt={props.alt} />;
}
QuerySourceTypeIcon.propTypes = {

View File

@@ -18,7 +18,7 @@ function EmptyState({ title, message, refreshButton }) {
<div className="query-results-empty-state">
<div className="empty-state-content">
<div>
<img src="static/images/illustrations/no-query-results.svg" alt="No Query Results Illustration" />
<img src="/static/images/illustrations/no-query-results.svg" alt="No Query Results Illustration" />
</div>
<h3>{title}</h3>
<div className="m-b-20">{message}</div>
@@ -40,7 +40,7 @@ EmptyState.defaultProps = {
function TabWithDeleteButton({ visualizationName, canDelete, onDelete, ...props }) {
const handleDelete = useCallback(
e => {
(e) => {
e.stopPropagation();
Modal.confirm({
title: "Delete Visualization",
@@ -111,7 +111,8 @@ export default function QueryVisualizationTabs({
className="add-visualization-button"
data-test="NewVisualization"
type="link"
onClick={() => onAddVisualization()}>
onClick={() => onAddVisualization()}
>
<i className="fa fa-plus" aria-hidden="true" />
<span className="m-l-5 hidden-xs">Add Visualization</span>
</Button>
@@ -119,7 +120,7 @@ export default function QueryVisualizationTabs({
}
const orderedVisualizations = useMemo(() => orderBy(visualizations, ["id"]), [visualizations]);
const isFirstVisualization = useCallback(visId => visId === orderedVisualizations[0].id, [orderedVisualizations]);
const isFirstVisualization = useCallback((visId) => visId === orderedVisualizations[0].id, [orderedVisualizations]);
const isMobile = useMedia({ maxWidth: 768 });
const [filters, setFilters] = useState([]);
@@ -132,9 +133,10 @@ export default function QueryVisualizationTabs({
data-test="QueryPageVisualizationTabs"
animated={false}
tabBarGutter={0}
onChange={activeKey => onChangeTab(+activeKey)}
destroyInactiveTabPane>
{orderedVisualizations.map(visualization => (
onChange={(activeKey) => onChangeTab(+activeKey)}
destroyInactiveTabPane
>
{orderedVisualizations.map((visualization) => (
<TabPane
key={`${visualization.id}`}
tab={
@@ -144,7 +146,8 @@ export default function QueryVisualizationTabs({
visualizationName={visualization.name}
onDelete={() => onDeleteVisualization(visualization.id)}
/>
}>
}
>
{queryResult ? (
<VisualizationRenderer
visualization={visualization}

View File

@@ -1,16 +1,11 @@
import { useCallback, useMemo, useState } from "react";
import { reduce } from "lodash";
import localOptions from "@/lib/localOptions";
function calculateTokensCount(schema) {
return reduce(schema, (totalLength, table) => totalLength + table.columns.length, 0);
}
export default function useAutocompleteFlags(schema) {
const isAvailable = useMemo(() => calculateTokensCount(schema) <= 5000, [schema]);
const isAvailable = true;
const [isEnabled, setIsEnabled] = useState(localOptions.get("liveAutocomplete", true));
const toggleAutocomplete = useCallback(state => {
const toggleAutocomplete = useCallback((state) => {
setIsEnabled(state);
localOptions.set("liveAutocomplete", state);
}, []);

View File

@@ -0,0 +1,40 @@
import React from "react";
import Form from "antd/lib/form";
import Checkbox from "antd/lib/checkbox";
import Skeleton from "antd/lib/skeleton";
import HelpTrigger from "@/components/HelpTrigger";
import DynamicComponent from "@/components/DynamicComponent";
import { SettingsEditorPropTypes, SettingsEditorDefaultProps } from "../prop-types";
export default function BeaconConsentSettings(props) {
const { values, onChange, loading } = props;
return (
<DynamicComponent name="OrganizationSettings.BeaconConsentSettings" {...props}>
<Form.Item
label={
<span>
Anonymous Usage Data Sharing
<HelpTrigger className="m-l-5 m-r-5" type="USAGE_DATA_SHARING" />
</span>
}
>
{loading ? (
<Skeleton title={{ width: 300 }} paragraph={false} active />
) : (
<Checkbox
name="beacon_consent"
checked={values.beacon_consent}
onChange={(e) => onChange({ beacon_consent: e.target.checked })}
>
Help Redash improve by automatically sending anonymous usage data
</Checkbox>
)}
</Form.Item>
</DynamicComponent>
);
}
BeaconConsentSettings.propTypes = SettingsEditorPropTypes;
BeaconConsentSettings.defaultProps = SettingsEditorDefaultProps;

View File

@@ -4,6 +4,7 @@ import DynamicComponent from "@/components/DynamicComponent";
import FormatSettings from "./FormatSettings";
import PlotlySettings from "./PlotlySettings";
import FeatureFlagsSettings from "./FeatureFlagsSettings";
import BeaconConsentSettings from "./BeaconConsentSettings";
export default function GeneralSettings(props) {
return (
@@ -13,6 +14,7 @@ export default function GeneralSettings(props) {
<FormatSettings {...props} />
<PlotlySettings {...props} />
<FeatureFlagsSettings {...props} />
<BeaconConsentSettings {...props} />
</DynamicComponent>
);
}

View File

@@ -36,6 +36,7 @@ const Alert = {
delete: data => axios.delete(`api/alerts/${data.id}`),
mute: data => axios.post(`api/alerts/${data.id}/mute`),
unmute: data => axios.delete(`api/alerts/${data.id}/mute`),
evaluate: data => axios.post(`api/alerts/${data.id}/eval`),
};
export default Alert;

View File

@@ -4,19 +4,19 @@ import { fetchDataFromJob } from "@/services/query-result";
export const SCHEMA_NOT_SUPPORTED = 1;
export const SCHEMA_LOAD_ERROR = 2;
export const IMG_ROOT = "static/images/db-logos";
export const IMG_ROOT = "/static/images/db-logos";
function mapSchemaColumnsToObject(columns) {
return map(columns, column => (isObject(column) ? column : { name: column }));
return map(columns, (column) => (isObject(column) ? column : { name: column }));
}
const DataSource = {
query: () => axios.get("api/data_sources"),
get: ({ id }) => axios.get(`api/data_sources/${id}`),
types: () => axios.get("api/data_sources/types"),
create: data => axios.post(`api/data_sources`, data),
save: data => axios.post(`api/data_sources/${data.id}`, data),
test: data => axios.post(`api/data_sources/${data.id}/test`),
create: (data) => axios.post(`api/data_sources`, data),
save: (data) => axios.post(`api/data_sources/${data.id}`, data),
test: (data) => axios.post(`api/data_sources/${data.id}/test`),
delete: ({ id }) => axios.delete(`api/data_sources/${id}`),
fetchSchema: (data, refresh = false) => {
const params = {};
@@ -27,15 +27,15 @@ const DataSource = {
return axios
.get(`api/data_sources/${data.id}/schema`, { params })
.then(data => {
.then((data) => {
if (has(data, "job")) {
return fetchDataFromJob(data.job.id).catch(error =>
return fetchDataFromJob(data.job.id).catch((error) =>
error.code === SCHEMA_NOT_SUPPORTED ? [] : Promise.reject(new Error(data.job.error))
);
}
return has(data, "schema") ? data.schema : Promise.reject();
})
.then(tables => map(tables, table => ({ ...table, columns: mapSchemaColumnsToObject(table.columns) })));
.then((tables) => map(tables, (table) => ({ ...table, columns: mapSchemaColumnsToObject(table.columns) })));
},
};

View File

@@ -61,7 +61,7 @@ class DateParameter extends Parameter {
return value;
}
const normalizedValue = moment(value);
const normalizedValue = moment(value, moment.ISO_8601, true);
return normalizedValue.isValid() ? normalizedValue : null;
}
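The change above switches DateParameter to strict ISO-8601 parsing. A minimal sketch of the difference, with illustrative values only (behaviour of the loose fallback depends on the JS engine's Date parser):

```
import moment from "moment";

// Strict ISO-8601 parsing accepts only ISO-formatted input...
moment("2024-01-15T10:00:00", moment.ISO_8601, true).isValid(); // true
moment("Jan 15 2024", moment.ISO_8601, true).isValid();         // false
// ...whereas the previous loose call falls back to Date parsing and may still accept it.
moment("Jan 15 2024").isValid();                                 // typically true
```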

View File

@@ -0,0 +1,29 @@
import { toString, isNull } from "lodash";
import Parameter from "./Parameter";
class TextPatternParameter extends Parameter {
constructor(parameter, parentQueryId) {
super(parameter, parentQueryId);
this.regex = parameter.regex;
this.setValue(parameter.value);
}
// eslint-disable-next-line class-methods-use-this
normalizeValue(value) {
const normalizedValue = toString(value);
if (isNull(normalizedValue)) {
return null;
}
var re = new RegExp(this.regex);
if (re !== null) {
if (re.test(normalizedValue)) {
return normalizedValue;
}
}
return null;
}
}
export default TextPatternParameter;

View File

@@ -5,6 +5,7 @@ import EnumParameter from "./EnumParameter";
import QueryBasedDropdownParameter from "./QueryBasedDropdownParameter";
import DateParameter from "./DateParameter";
import DateRangeParameter from "./DateRangeParameter";
import TextPatternParameter from "./TextPatternParameter";
function createParameter(param, parentQueryId) {
switch (param.type) {
@@ -22,6 +23,8 @@ function createParameter(param, parentQueryId) {
case "datetime-range":
case "datetime-range-with-seconds":
return new DateRangeParameter(param, parentQueryId);
case "text-pattern":
return new TextPatternParameter({ ...param, type: "text-pattern" }, parentQueryId);
default:
return new TextParameter({ ...param, type: "text" }, parentQueryId);
}
@@ -34,6 +37,7 @@ function cloneParameter(param) {
export {
Parameter,
TextParameter,
TextPatternParameter,
NumberParameter,
EnumParameter,
QueryBasedDropdownParameter,

View File

@@ -1,6 +1,7 @@
import {
createParameter,
TextParameter,
TextPatternParameter,
NumberParameter,
EnumParameter,
QueryBasedDropdownParameter,
@@ -12,6 +13,7 @@ describe("Parameter", () => {
describe("create", () => {
const parameterTypes = [
["text", TextParameter],
["text-pattern", TextPatternParameter],
["number", NumberParameter],
["enum", EnumParameter],
["query", QueryBasedDropdownParameter],

View File

@@ -0,0 +1,21 @@
import { createParameter } from "..";
describe("TextPatternParameter", () => {
let param;
beforeEach(() => {
param = createParameter({ name: "param", title: "Param", type: "text-pattern", regex: "a+" });
});
describe("noramlizeValue", () => {
test("converts matching strings", () => {
const normalizedValue = param.normalizeValue("art");
expect(normalizedValue).toBe("art");
});
test("returns null when string does not match pattern", () => {
const normalizedValue = param.normalizeValue("brt");
expect(normalizedValue).toBeNull();
});
});
});

View File

@@ -50,15 +50,18 @@ const QueryResultResource = {
};
export const ExecutionStatus = {
QUEUED: "queued",
STARTED: "started",
FINISHED: "finished",
WAITING: "waiting",
PROCESSING: "processing",
DONE: "done",
FAILED: "failed",
LOADING_RESULT: "loading-result",
CANCELED: "canceled",
DEFERRED: "deferred",
SCHEDULED: "scheduled",
STOPPED: "stopped",
};
const statuses = {
1: ExecutionStatus.WAITING,
2: ExecutionStatus.PROCESSING,
3: ExecutionStatus.DONE,
4: ExecutionStatus.FAILED,
};
function handleErrorResponse(queryResult, error) {
@@ -77,7 +80,7 @@ function handleErrorResponse(queryResult, error) {
queryResult.update({
job: {
error: "cached query result unavailable, please execute again.",
status: ExecutionStatus.FAILED,
status: 4,
},
});
return;
@@ -88,7 +91,7 @@ function handleErrorResponse(queryResult, error) {
queryResult.update({
job: {
error: get(error, "response.data.message", "Unknown error occurred. Please try again later."),
status: ExecutionStatus.FAILED,
status: 4,
},
});
}
@@ -99,19 +102,11 @@ function sleep(ms) {
export function fetchDataFromJob(jobId, interval = 1000) {
return axios.get(`api/jobs/${jobId}`).then(data => {
const status = data.job.status;
if (
[ExecutionStatus.QUEUED, ExecutionStatus.STARTED, ExecutionStatus.SCHEDULED, ExecutionStatus.DEFERRED].includes(
status
)
) {
const status = statuses[data.job.status];
if (status === ExecutionStatus.WAITING || status === ExecutionStatus.PROCESSING) {
return sleep(interval).then(() => fetchDataFromJob(data.job.id));
} else if (status === ExecutionStatus.FINISHED) {
return data.job.result_id;
} else if (status === ExecutionStatus.CANCELED) {
return Promise.reject("Job was canceled");
} else if (status === ExecutionStatus.STOPPED) {
return Promise.reject("Job was stopped");
} else if (status === ExecutionStatus.DONE) {
return data.job.result;
} else if (status === ExecutionStatus.FAILED) {
return Promise.reject(data.job.error);
}
@@ -119,7 +114,7 @@ export function fetchDataFromJob(jobId, interval = 1000) {
}
export function isDateTime(v) {
return isString(v) && moment(v).isValid() && /^\d{4}-\d{2}-\d{2}T/.test(v);
return isString(v) && moment(v, moment.ISO_8601, true).isValid() && /^\d{4}-\d{2}-\d{2}T/.test(v);
}
class QueryResult {
@@ -127,7 +122,7 @@ class QueryResult {
this.deferred = defer();
this.job = {};
this.query_result = {};
this.status = ExecutionStatus.QUEUED;
this.status = "waiting";
this.updatedAt = moment();
@@ -143,8 +138,8 @@ class QueryResult {
extend(this, props);
if ("query_result" in props) {
this.status = ExecutionStatus.FINISHED;
this.deferred.onStatusChange(ExecutionStatus.FINISHED);
this.status = ExecutionStatus.DONE;
this.deferred.onStatusChange(ExecutionStatus.DONE);
const columnTypes = {};
@@ -188,10 +183,11 @@ class QueryResult {
});
this.deferred.resolve(this);
} else if (this.job.status === ExecutionStatus.STARTED || this.job.status === ExecutionStatus.FINISHED) {
this.status = ExecutionStatus.STARTED;
} else if (this.job.status === ExecutionStatus.FAILED) {
this.status = this.job.status;
} else if (this.job.status === 3 || this.job.status === 2) {
this.deferred.onStatusChange(ExecutionStatus.PROCESSING);
this.status = "processing";
} else if (this.job.status === 4) {
this.status = statuses[this.job.status];
this.deferred.reject(new QueryResultError(this.job.error));
} else {
this.deferred.onStatusChange(undefined);
@@ -215,7 +211,7 @@ class QueryResult {
if (this.isLoadingResult) {
return ExecutionStatus.LOADING_RESULT;
}
return this.status || this.job.status;
return this.status || statuses[this.job.status];
}
getError() {
@@ -378,7 +374,7 @@ class QueryResult {
this.isLoadingResult = true;
this.deferred.onStatusChange(ExecutionStatus.LOADING_RESULT);
QueryResultResource.get({ id: this.job.result_id })
QueryResultResource.get({ id: this.job.query_result_id })
.then(response => {
this.update(response);
this.isLoadingResult = false;
@@ -393,7 +389,7 @@ class QueryResult {
this.update({
job: {
error: "failed communicating with server. Please check your Internet connection and try again.",
status: ExecutionStatus.FAILED,
status: 4,
},
});
this.isLoadingResult = false;
@@ -417,9 +413,9 @@ class QueryResult {
.then(jobResponse => {
this.update(jobResponse);
if (this.getStatus() === ExecutionStatus.STARTED && this.job.result_id && this.job.result_id !== "None") {
if (this.getStatus() === "processing" && this.job.query_result_id && this.job.query_result_id !== "None") {
loadResult();
} else if (this.getStatus() !== ExecutionStatus.FAILED) {
} else if (this.getStatus() !== "failed") {
const waitTime = tryNumber > 10 ? 3000 : 500;
setTimeout(() => {
this.refreshStatus(query, parameters, tryNumber + 1);
@@ -432,7 +428,7 @@ class QueryResult {
this.update({
job: {
error: "failed communicating with server. Please check your Internet connection and try again.",
status: ExecutionStatus.FAILED,
status: 4,
},
});
});
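For orientation, a compact sketch (illustrative only, reusing the numeric `statuses` map shown above) of the polling behaviour `fetchDataFromJob` relies on:

```
// Illustrative only: jobs are polled again while waiting (1) or
// processing (2), yield their result when done (3), and surface the
// job error when failed (4).
const statuses = { 1: "waiting", 2: "processing", 3: "done", 4: "failed" };

function nextAction(job) {
  const status = statuses[job.status];
  if (status === "waiting" || status === "processing") return "poll again";
  if (status === "done") return job.result;
  if (status === "failed") throw new Error(job.error);
  return "unknown status";
}

nextAction({ status: 2 });                       // "poll again"
nextAction({ status: 3, result: { rows: [] } }); // { rows: [] }
```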

View File

@@ -2,7 +2,6 @@ import moment from "moment";
import debug from "debug";
import Mustache from "mustache";
import { axios } from "@/services/axios";
import { ExecutionStatus } from "@/services/query-result";
import {
zipObject,
isEmpty,
@@ -104,7 +103,7 @@ export class Query {
return new QueryResult({
job: {
error: `missing ${valuesWord} for ${missingParams.join(", ")} ${paramsWord}.`,
status: ExecutionStatus.FAILED,
status: 4,
},
});
}
@@ -361,7 +360,7 @@ export class QueryResultError {
// eslint-disable-next-line class-methods-use-this
getStatus() {
return ExecutionStatus.FAILED;
return "failed";
}
// eslint-disable-next-line class-methods-use-this

View File

@@ -43,18 +43,18 @@ function seedDatabase(seedValues) {
function buildServer() {
console.log("Building the server...");
execSync("docker compose build", { stdio: "inherit" });
execSync("docker compose -p cypress build", { stdio: "inherit" });
}
function startServer() {
console.log("Starting the server...");
execSync("docker compose up -d", { stdio: "inherit" });
execSync("docker compose run server create_db", { stdio: "inherit" });
execSync("docker compose -p cypress up -d", { stdio: "inherit" });
execSync("docker compose -p cypress run server create_db", { stdio: "inherit" });
}
function stopServer() {
console.log("Stopping the server...");
execSync("docker compose down", { stdio: "inherit" });
execSync("docker compose -p cypress down", { stdio: "inherit" });
}
function runCypressCI() {
@@ -63,12 +63,12 @@ function runCypressCI() {
CYPRESS_OPTIONS, // eslint-disable-line no-unused-vars
} = process.env;
if (GITHUB_REPOSITORY === "getredash/redash") {
if (GITHUB_REPOSITORY === "getredash/redash" && process.env.CYPRESS_RECORD_KEY) {
process.env.CYPRESS_OPTIONS = "--record";
}
execSync(
"docker compose run --name cypress cypress ./node_modules/.bin/percy exec -t 300 -- ./node_modules/.bin/cypress run $CYPRESS_OPTIONS",
"COMMIT_INFO_MESSAGE=$(git show -s --format=%s) docker compose run --name cypress cypress ./node_modules/.bin/percy exec -t 300 -- ./node_modules/.bin/cypress run $CYPRESS_OPTIONS",
{ stdio: "inherit" }
);
}
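A minimal sketch (illustrative; the command string here is hypothetical) of why the commit message is passed as an inline assignment — `execSync` runs the string through a shell, so the `VAR=value` prefix applies only to that one command:

```
// Illustrative only: inline env assignment is shell syntax, evaluated by the
// shell that execSync spawns for the command string.
const { execSync } = require("child_process");

execSync('COMMIT_INFO_MESSAGE="$(git show -s --format=%s)" printenv COMMIT_INFO_MESSAGE', {
  stdio: "inherit",
});
```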

View File

@@ -53,12 +53,11 @@ describe("Dashboard Sharing", () => {
};
const dashboardUrl = this.dashboardUrl;
cy.createQuery({ options }).then(({ id: queryId, name: queryName }) => {
cy.createQuery({ options }).then(({ id: queryId }) => {
cy.visit(dashboardUrl);
editDashboard();
cy.getByTestId("AddWidgetButton").click();
cy.getByTestId("AddWidgetDialog").within(() => {
cy.get("input").type(queryName);
cy.get(`.query-selector-result[data-test="QueryId${queryId}"]`).click();
});
cy.contains("button", "Add to Dashboard").click();
@@ -179,12 +178,11 @@ describe("Dashboard Sharing", () => {
};
const dashboardUrl = this.dashboardUrl;
cy.createQuery({ options }).then(({ id: queryId, name: queryName }) => {
cy.createQuery({ options }).then(({ id: queryId }) => {
cy.visit(dashboardUrl);
editDashboard();
cy.getByTestId("AddWidgetButton").click();
cy.getByTestId("AddWidgetDialog").within(() => {
cy.get("input").type(queryName);
cy.get(`.query-selector-result[data-test="QueryId${queryId}"]`).click();
});
cy.contains("button", "Add to Dashboard").click();

View File

@@ -18,12 +18,11 @@ describe("Widget", () => {
};
it("adds widget", function() {
cy.createQuery().then(({ id: queryId, name: queryName }) => {
cy.createQuery().then(({ id: queryId }) => {
cy.visit(this.dashboardUrl);
editDashboard();
cy.getByTestId("AddWidgetButton").click();
cy.getByTestId("AddWidgetDialog").within(() => {
cy.get("input").type(queryName);
cy.get(`.query-selector-result[data-test="QueryId${queryId}"]`).click();
});
cy.contains("button", "Add to Dashboard").click();

View File

@@ -2,16 +2,14 @@ import { dragParam } from "../../support/parameters";
import dayjs from "dayjs";
function openAndSearchAntdDropdown(testId, paramOption) {
cy.getByTestId(testId)
.find(".ant-select-selection-search-input")
.type(paramOption, { force: true });
cy.getByTestId(testId).find(".ant-select-selection-search-input").type(paramOption, { force: true });
}
describe("Parameter", () => {
const expectDirtyStateChange = edit => {
const expectDirtyStateChange = (edit) => {
cy.getByTestId("ParameterName-test-parameter")
.find(".parameter-input")
.should($el => {
.should(($el) => {
assert.isUndefined($el.data("dirty"));
});
@@ -19,7 +17,7 @@ describe("Parameter", () => {
cy.getByTestId("ParameterName-test-parameter")
.find(".parameter-input")
.should($el => {
.should(($el) => {
assert.isTrue($el.data("dirty"));
});
};
@@ -42,9 +40,7 @@ describe("Parameter", () => {
});
it("updates the results after clicking Apply", () => {
cy.getByTestId("ParameterName-test-parameter")
.find("input")
.type("Redash");
cy.getByTestId("ParameterName-test-parameter").find("input").type("Redash");
cy.getByTestId("ParameterApplyButton").click();
@@ -53,13 +49,66 @@ describe("Parameter", () => {
it("sets dirty state when edited", () => {
expectDirtyStateChange(() => {
cy.getByTestId("ParameterName-test-parameter")
.find("input")
.type("Redash");
cy.getByTestId("ParameterName-test-parameter").find("input").type("Redash");
});
});
});
describe("Text Pattern Parameter", () => {
beforeEach(() => {
const queryData = {
name: "Text Pattern Parameter",
query: "SELECT '{{test-parameter}}' AS parameter",
options: {
parameters: [{ name: "test-parameter", title: "Test Parameter", type: "text-pattern", regex: "a.*a" }],
},
};
cy.createQuery(queryData, false).then(({ id }) => cy.visit(`/queries/${id}/source`));
});
it("updates the results after clicking Apply", () => {
cy.getByTestId("ParameterName-test-parameter").find("input").type("{selectall}arta");
cy.getByTestId("ParameterApplyButton").click();
cy.getByTestId("TableVisualization").should("contain", "arta");
cy.getByTestId("ParameterName-test-parameter").find("input").type("{selectall}arounda");
cy.getByTestId("ParameterApplyButton").click();
cy.getByTestId("TableVisualization").should("contain", "arounda");
});
it("throws error message with invalid query request", () => {
cy.getByTestId("ParameterName-test-parameter").find("input").type("{selectall}arta");
cy.getByTestId("ParameterApplyButton").click();
cy.getByTestId("ParameterName-test-parameter").find("input").type("{selectall}abcab");
cy.getByTestId("ParameterApplyButton").click();
cy.getByTestId("QueryExecutionStatus").should("exist");
});
it("sets dirty state when edited", () => {
expectDirtyStateChange(() => {
cy.getByTestId("ParameterName-test-parameter").find("input").type("{selectall}arta");
});
});
it("doesn't let user save invalid regex", () => {
cy.get(".fa-cog").click();
cy.getByTestId("RegexPatternInput").type("{selectall}[");
cy.contains("Invalid Regex Pattern").should("exist");
cy.getByTestId("SaveParameterSettings").click();
cy.get(".fa-cog").click();
cy.getByTestId("RegexPatternInput").should("not.equal", "[");
});
});
describe("Number Parameter", () => {
beforeEach(() => {
const queryData = {
@@ -74,17 +123,13 @@ describe("Parameter", () => {
});
it("updates the results after clicking Apply", () => {
cy.getByTestId("ParameterName-test-parameter")
.find("input")
.type("{selectall}42");
cy.getByTestId("ParameterName-test-parameter").find("input").type("{selectall}42");
cy.getByTestId("ParameterApplyButton").click();
cy.getByTestId("TableVisualization").should("contain", 42);
cy.getByTestId("ParameterName-test-parameter")
.find("input")
.type("{selectall}31415");
cy.getByTestId("ParameterName-test-parameter").find("input").type("{selectall}31415");
cy.getByTestId("ParameterApplyButton").click();
@@ -93,9 +138,7 @@ describe("Parameter", () => {
it("sets dirty state when edited", () => {
expectDirtyStateChange(() => {
cy.getByTestId("ParameterName-test-parameter")
.find("input")
.type("{selectall}42");
cy.getByTestId("ParameterName-test-parameter").find("input").type("{selectall}42");
});
});
});
@@ -119,10 +162,7 @@ describe("Parameter", () => {
openAndSearchAntdDropdown("ParameterName-test-parameter", "value2"); // asserts option filter prop
// only the filtered option should be on the DOM
cy.get(".ant-select-item-option")
.should("have.length", 1)
.and("contain", "value2")
.click();
cy.get(".ant-select-item-option").should("have.length", 1).and("contain", "value2").click();
cy.getByTestId("ParameterApplyButton").click();
// ensure that query is being executed
@@ -140,12 +180,10 @@ describe("Parameter", () => {
SaveParameterSettings
`);
cy.getByTestId("ParameterName-test-parameter")
.find(".ant-select-selection-search")
.click();
cy.getByTestId("ParameterName-test-parameter").find(".ant-select-selection-search").click();
// select all unselected options
cy.get(".ant-select-item-option").each($option => {
cy.get(".ant-select-item-option").each(($option) => {
if (!$option.hasClass("ant-select-item-option-selected")) {
cy.wrap($option).click();
}
@@ -160,9 +198,7 @@ describe("Parameter", () => {
it("sets dirty state when edited", () => {
expectDirtyStateChange(() => {
cy.getByTestId("ParameterName-test-parameter")
.find(".ant-select")
.click();
cy.getByTestId("ParameterName-test-parameter").find(".ant-select").click();
cy.contains(".ant-select-item-option", "value2").click();
});
@@ -176,7 +212,7 @@ describe("Parameter", () => {
name: "Dropdown Query",
query: "",
};
cy.createQuery(dropdownQueryData, true).then(dropdownQuery => {
cy.createQuery(dropdownQueryData, true).then((dropdownQuery) => {
const queryData = {
name: "Query Based Dropdown Parameter",
query: "SELECT '{{test-parameter}}' AS parameter",
@@ -208,7 +244,7 @@ describe("Parameter", () => {
SELECT 'value2' AS name, 2 AS value UNION ALL
SELECT 'value3' AS name, 3 AS value`,
};
cy.createQuery(dropdownQueryData, true).then(dropdownQuery => {
cy.createQuery(dropdownQueryData, true).then((dropdownQuery) => {
const queryData = {
name: "Query Based Dropdown Parameter",
query: "SELECT '{{test-parameter}}' AS parameter",
@@ -234,10 +270,7 @@ describe("Parameter", () => {
openAndSearchAntdDropdown("ParameterName-test-parameter", "value2"); // asserts option filter prop
// only the filtered option should be on the DOM
cy.get(".ant-select-item-option")
.should("have.length", 1)
.and("contain", "value2")
.click();
cy.get(".ant-select-item-option").should("have.length", 1).and("contain", "value2").click();
cy.getByTestId("ParameterApplyButton").click();
// ensure that query is being executed
@@ -255,12 +288,10 @@ describe("Parameter", () => {
SaveParameterSettings
`);
cy.getByTestId("ParameterName-test-parameter")
.find(".ant-select")
.click();
cy.getByTestId("ParameterName-test-parameter").find(".ant-select").click();
// make sure all options are unselected and select all
cy.get(".ant-select-item-option").each($option => {
cy.get(".ant-select-item-option").each(($option) => {
expect($option).not.to.have.class("ant-select-dropdown-menu-item-selected");
cy.wrap($option).click();
});
@@ -274,14 +305,10 @@ describe("Parameter", () => {
});
});
const selectCalendarDate = date => {
cy.getByTestId("ParameterName-test-parameter")
.find("input")
.click();
const selectCalendarDate = (date) => {
cy.getByTestId("ParameterName-test-parameter").find("input").click();
cy.get(".ant-picker-panel")
.contains(".ant-picker-cell-inner", date)
.click();
cy.get(".ant-picker-panel").contains(".ant-picker-cell-inner", date).click();
};
describe("Date Parameter", () => {
@@ -303,7 +330,7 @@ describe("Parameter", () => {
});
afterEach(() => {
cy.clock().then(clock => clock.restore());
cy.clock().then((clock) => clock.restore());
});
it("updates the results after selecting a date", function () {
@@ -317,9 +344,7 @@ describe("Parameter", () => {
it("allows picking a dynamic date", function () {
cy.getByTestId("DynamicButton").click();
cy.getByTestId("DynamicButtonMenu")
.contains("Today/Now")
.click();
cy.getByTestId("DynamicButtonMenu").contains("Today/Now").click();
cy.getByTestId("ParameterApplyButton").click();
@@ -350,14 +375,11 @@ describe("Parameter", () => {
});
afterEach(() => {
cy.clock().then(clock => clock.restore());
cy.clock().then((clock) => clock.restore());
});
it("updates the results after selecting a date and clicking in ok", function () {
cy.getByTestId("ParameterName-test-parameter")
.find("input")
.as("Input")
.click();
cy.getByTestId("ParameterName-test-parameter").find("input").as("Input").click();
selectCalendarDate("15");
@@ -369,14 +391,9 @@ describe("Parameter", () => {
});
it("shows the current datetime after clicking in Now", function () {
cy.getByTestId("ParameterName-test-parameter")
.find("input")
.as("Input")
.click();
cy.getByTestId("ParameterName-test-parameter").find("input").as("Input").click();
cy.get(".ant-picker-panel")
.contains("Now")
.click();
cy.get(".ant-picker-panel").contains("Now").click();
cy.getByTestId("ParameterApplyButton").click();
@@ -386,9 +403,7 @@ describe("Parameter", () => {
it("allows picking a dynamic date", function () {
cy.getByTestId("DynamicButton").click();
cy.getByTestId("DynamicButtonMenu")
.contains("Today/Now")
.click();
cy.getByTestId("DynamicButtonMenu").contains("Today/Now").click();
cy.getByTestId("ParameterApplyButton").click();
@@ -397,31 +412,20 @@ describe("Parameter", () => {
it("sets dirty state when edited", () => {
expectDirtyStateChange(() => {
cy.getByTestId("ParameterName-test-parameter")
.find("input")
.click();
cy.getByTestId("ParameterName-test-parameter").find("input").click();
cy.get(".ant-picker-panel")
.contains("Now")
.click();
cy.get(".ant-picker-panel").contains("Now").click();
});
});
});
describe("Date Range Parameter", () => {
const selectCalendarDateRange = (startDate, endDate) => {
cy.getByTestId("ParameterName-test-parameter")
.find("input")
.first()
.click();
cy.getByTestId("ParameterName-test-parameter").find("input").first().click();
cy.get(".ant-picker-panel")
.contains(".ant-picker-cell-inner", startDate)
.click();
cy.get(".ant-picker-panel").contains(".ant-picker-cell-inner", startDate).click();
cy.get(".ant-picker-panel")
.contains(".ant-picker-cell-inner", endDate)
.click();
cy.get(".ant-picker-panel").contains(".ant-picker-cell-inner", endDate).click();
};
beforeEach(() => {
@@ -442,7 +446,7 @@ describe("Parameter", () => {
});
afterEach(() => {
cy.clock().then(clock => clock.restore());
cy.clock().then((clock) => clock.restore());
});
it("updates the results after selecting a date range", function () {
@@ -460,9 +464,7 @@ describe("Parameter", () => {
it("allows picking a dynamic date range", function () {
cy.getByTestId("DynamicButton").click();
cy.getByTestId("DynamicButtonMenu")
.contains("Last month")
.click();
cy.getByTestId("DynamicButtonMenu").contains("Last month").click();
cy.getByTestId("ParameterApplyButton").click();
@@ -479,15 +481,10 @@ describe("Parameter", () => {
});
describe("Apply Changes", () => {
const expectAppliedChanges = apply => {
cy.getByTestId("ParameterName-test-parameter-1")
.find("input")
.as("Input")
.type("Redash");
const expectAppliedChanges = (apply) => {
cy.getByTestId("ParameterName-test-parameter-1").find("input").as("Input").type("Redash");
cy.getByTestId("ParameterName-test-parameter-2")
.find("input")
.type("Redash");
cy.getByTestId("ParameterName-test-parameter-2").find("input").type("Redash");
cy.location("search").should("not.contain", "Redash");
@@ -523,10 +520,7 @@ describe("Parameter", () => {
it("shows and hides according to parameter dirty state", () => {
cy.getByTestId("ParameterApplyButton").should("not.be", "visible");
cy.getByTestId("ParameterName-test-parameter-1")
.find("input")
.as("Param")
.type("Redash");
cy.getByTestId("ParameterName-test-parameter-1").find("input").as("Param").type("Redash");
cy.getByTestId("ParameterApplyButton").should("be.visible");
@@ -536,21 +530,13 @@ describe("Parameter", () => {
});
it("updates dirty counter", () => {
cy.getByTestId("ParameterName-test-parameter-1")
.find("input")
.type("Redash");
cy.getByTestId("ParameterName-test-parameter-1").find("input").type("Redash");
cy.getByTestId("ParameterApplyButton")
.find(".ant-badge-count p.current")
.should("contain", "1");
cy.getByTestId("ParameterApplyButton").find(".ant-badge-count p.current").should("contain", "1");
cy.getByTestId("ParameterName-test-parameter-2")
.find("input")
.type("Redash");
cy.getByTestId("ParameterName-test-parameter-2").find("input").type("Redash");
cy.getByTestId("ParameterApplyButton")
.find(".ant-badge-count p.current")
.should("contain", "2");
cy.getByTestId("ParameterApplyButton").find(".ant-badge-count p.current").should("contain", "2");
});
it('applies changes from "Apply Changes" button', () => {
@@ -560,16 +546,13 @@ describe("Parameter", () => {
});
it('applies changes from "alt+enter" keyboard shortcut', () => {
expectAppliedChanges(input => {
expectAppliedChanges((input) => {
input.type("{alt}{enter}");
});
});
it('disables "Execute" button', () => {
cy.getByTestId("ParameterName-test-parameter-1")
.find("input")
.as("Input")
.type("Redash");
cy.getByTestId("ParameterName-test-parameter-1").find("input").as("Input").type("Redash");
cy.getByTestId("ExecuteButton").should("be.disabled");
cy.get("@Input").clear();
@@ -594,10 +577,7 @@ describe("Parameter", () => {
cy.createQuery(queryData, false).then(({ id }) => cy.visit(`/queries/${id}/source`));
cy.get(".parameter-block")
.first()
.invoke("width")
.as("paramWidth");
cy.get(".parameter-block").first().invoke("width").as("paramWidth");
cy.get("body").type("{alt}D"); // hide schema browser
});

View File

@@ -26,16 +26,16 @@ const SQL = `
describe("Chart", () => {
beforeEach(() => {
cy.login();
cy.createQuery({ name: "Chart Visualization", query: SQL })
.its("id")
.as("queryId");
cy.createQuery({ name: "Chart Visualization", query: SQL }).its("id").as("queryId");
});
it("creates Bar charts", function () {
cy.visit(`queries/${this.queryId}/source`);
cy.getByTestId("ExecuteButton").click();
const getBarChartAssertionFunction = (specificBarChartAssertionFn = () => {}) => () => {
const getBarChartAssertionFunction =
(specificBarChartAssertionFn = () => {}) =>
() => {
// checks for TabbedEditor standard tabs
assertTabbedEditor();
@@ -95,8 +95,8 @@ describe("Chart", () => {
const withDashboardWidgetsAssertionFn = (widgetGetters, dashboardUrl) => {
cy.visit(dashboardUrl);
widgetGetters.forEach(widgetGetter => {
cy.get(`@${widgetGetter}`).then(widget => {
widgetGetters.forEach((widgetGetter) => {
cy.get(`@${widgetGetter}`).then((widget) => {
cy.getByTestId(getWidgetTestId(widget)).within(() => {
cy.get("g.points").should("exist");
});
@@ -107,4 +107,34 @@ describe("Chart", () => {
createDashboardWithCharts("Bar chart visualizations", chartGetters, withDashboardWidgetsAssertionFn);
cy.percySnapshot("Visualizations - Charts - Bar");
});
it("colors Bar charts", function () {
cy.visit(`queries/${this.queryId}/source`);
cy.getByTestId("ExecuteButton").click();
cy.getByTestId("NewVisualization").click();
cy.getByTestId("Chart.ColumnMapping.x").selectAntdOption("Chart.ColumnMapping.x.stage");
cy.getByTestId("Chart.ColumnMapping.y").selectAntdOption("Chart.ColumnMapping.y.value1");
cy.getByTestId("VisualizationEditor.Tabs.Colors").click();
cy.getByTestId("ColorScheme").click();
cy.getByTestId("ColorOptionViridis").click();
cy.getByTestId("ColorScheme").click();
cy.getByTestId("ColorOptionTableau 10").click();
cy.getByTestId("ColorScheme").click();
cy.getByTestId("ColorOptionD3 Category 10").click();
});
it("colors Pie charts", function () {
cy.visit(`queries/${this.queryId}/source`);
cy.getByTestId("ExecuteButton").click();
cy.getByTestId("NewVisualization").click();
cy.getByTestId("Chart.GlobalSeriesType").click();
cy.getByTestId("Chart.ChartType.pie").click();
cy.getByTestId("Chart.ColumnMapping.x").selectAntdOption("Chart.ColumnMapping.x.stage");
cy.getByTestId("Chart.ColumnMapping.y").selectAntdOption("Chart.ColumnMapping.y.value1");
cy.getByTestId("VisualizationEditor.Tabs.Colors").click();
cy.getByTestId("ColorScheme").click();
cy.getByTestId("ColorOptionViridis").click();
cy.getByTestId("ColorScheme").click();
cy.getByTestId("ColorOptionTableau 10").click();
cy.getByTestId("ColorScheme").click();
cy.getByTestId("ColorOptionD3 Category 10").click();
});
});

View File

@@ -22,10 +22,7 @@ function prepareVisualization(query, type, name, options) {
cy.get("body").type("{alt}D");
// do some pre-checks here to ensure that visualization was created and is visible
cy.getByTestId("TableVisualization")
.should("exist")
.find("table")
.should("exist");
cy.getByTestId("TableVisualization").should("exist").find("table").should("exist");
return cy.then(() => ({ queryId, visualizationId }));
});
@@ -62,38 +59,21 @@ describe("Table", () => {
});
it("sorts data by a single column", function () {
cy.getByTestId("TableVisualization")
.find("table th")
.contains("c")
.should("exist")
.click();
cy.getByTestId("TableVisualization").find("table th").contains("c").should("exist").click();
cy.percySnapshot("Visualizations - Table (Single-column sort)", { widths: [viewportWidth] });
});
it("sorts data by a multiple columns", function () {
cy.getByTestId("TableVisualization")
.find("table th")
.contains("a")
.should("exist")
.click();
cy.getByTestId("TableVisualization").find("table th").contains("a").should("exist").click();
cy.get("body").type("{shift}", { release: false });
cy.getByTestId("TableVisualization")
.find("table th")
.contains("b")
.should("exist")
.click();
cy.getByTestId("TableVisualization").find("table th").contains("b").should("exist").click();
cy.percySnapshot("Visualizations - Table (Multi-column sort)", { widths: [viewportWidth] });
});
it("sorts data in reverse order", function () {
cy.getByTestId("TableVisualization")
.find("table th")
.contains("c")
.should("exist")
.click()
.click();
cy.getByTestId("TableVisualization").find("table th").contains("c").should("exist").click().click();
cy.percySnapshot("Visualizations - Table (Single-column reverse sort)", { widths: [viewportWidth] });
});
});
@@ -101,10 +81,7 @@ describe("Table", () => {
it("searches in multiple columns", () => {
const { query, config } = SearchInData;
prepareVisualization(query, "TABLE", "Search", config).then(({ visualizationId }) => {
cy.getByTestId("TableVisualization")
.find("table input")
.should("exist")
.type("test");
cy.getByTestId("TableVisualization").find("table input").should("exist").type("test");
cy.percySnapshot("Visualizations - Table (Search in data)", { widths: [viewportWidth] });
});
});

View File

@@ -2,12 +2,12 @@
const { extend, get, merge, find } = Cypress._;
const post = options =>
const post = (options) =>
cy
.getCookie("csrf_token")
.then(csrf => cy.request({ ...options, method: "POST", headers: { "X-CSRF-TOKEN": csrf.value } }));
.then((csrf) => cy.request({ ...options, method: "POST", headers: { "X-CSRF-TOKEN": csrf.value } }));
Cypress.Commands.add("createDashboard", name => {
Cypress.Commands.add("createDashboard", (name) => {
return post({ url: "api/dashboards", body: { name } }).then(({ body }) => body);
});
@@ -28,7 +28,7 @@ Cypress.Commands.add("createQuery", (data, shouldPublish = true) => {
// eslint-disable-next-line cypress/no-assigning-return-values
let request = post({ url: "/api/queries", body: merged }).then(({ body }) => body);
if (shouldPublish) {
request = request.then(query =>
request = request.then((query) =>
post({ url: `/api/queries/${query.id}`, body: { is_draft: false } }).then(() => query)
);
}
@@ -86,6 +86,7 @@ Cypress.Commands.add("addWidget", (dashboardId, visualizationId, options = {}) =
Cypress.Commands.add("createAlert", (queryId, options = {}, name) => {
const defaultOptions = {
column: "?column?",
selector: "first",
op: "greater than",
rearm: 0,
value: 1,
@@ -109,7 +110,7 @@ Cypress.Commands.add("createUser", ({ name, email, password }) => {
url: "api/users?no_invite=yes",
body: { name, email },
failOnStatusCode: false,
}).then(xhr => {
}).then((xhr) => {
const { status, body } = xhr;
if (status < 200 || status > 400) {
throw new Error(xhr);
@@ -146,7 +147,7 @@ Cypress.Commands.add("getDestinations", () => {
Cypress.Commands.add("addDestinationSubscription", (alertId, destinationName) => {
return cy
.getDestinations()
.then(destinations => {
.then((destinations) => {
const destination = find(destinations, { name: destinationName });
if (!destination) {
throw new Error("Destination not found");
@@ -166,6 +167,6 @@ Cypress.Commands.add("addDestinationSubscription", (alertId, destinationName) =>
});
});
Cypress.Commands.add("updateOrgSettings", settings => {
Cypress.Commands.add("updateOrgSettings", (settings) => {
return post({ url: "api/settings/organization", body: settings }).then(({ body }) => body);
});
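A short sketch (the command name and endpoint here are hypothetical) of how further write commands can be layered on the same CSRF-aware `post` helper:

```
// Illustrative only: any write command can reuse `post`, which replays the
// csrf_token cookie as an X-CSRF-TOKEN header.
Cypress.Commands.add("createDestination", (options) =>
  post({ url: "api/destinations", body: options }).then(({ body }) => body)
);

// usage in a spec:
// cy.createDestination({ name: "Email me", type: "email", options: {} });
```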

View File

@@ -3,36 +3,26 @@
* @param should Passed to should expression after plot points are captured
*/
export function assertPlotPreview(should = "exist") {
cy.getByTestId("VisualizationPreview")
.find("g.plot")
.should("exist")
.find("g.points")
.should(should);
cy.getByTestId("VisualizationPreview").find("g.overplot").should("exist").find("g.points").should(should);
}
export function createChartThroughUI(chartName, chartSpecificAssertionFn = () => {}) {
cy.getByTestId("NewVisualization").click();
cy.getByTestId("VisualizationType").selectAntdOption("VisualizationType.CHART");
cy.getByTestId("VisualizationName")
.clear()
.type(chartName);
cy.getByTestId("VisualizationName").clear().type(chartName);
chartSpecificAssertionFn();
cy.server();
cy.route("POST", "**/api/visualizations").as("SaveVisualization");
cy.getByTestId("EditVisualizationDialog")
.contains("button", "Save")
.click();
cy.getByTestId("EditVisualizationDialog").contains("button", "Save").click();
cy.getByTestId("QueryPageVisualizationTabs")
.contains("span", chartName)
.should("exist");
cy.getByTestId("QueryPageVisualizationTabs").contains("span", chartName).should("exist");
cy.wait("@SaveVisualization").should("have.property", "status", 200);
return cy.get("@SaveVisualization").then(xhr => {
return cy.get("@SaveVisualization").then((xhr) => {
const { id, name, options } = xhr.response.body;
return cy.wrap({ id, name, options });
});
@@ -42,19 +32,13 @@ export function assertTabbedEditor(chartSpecificTabbedEditorAssertionFn = () =>
cy.getByTestId("Chart.GlobalSeriesType").should("exist");
cy.getByTestId("VisualizationEditor.Tabs.Series").click();
cy.getByTestId("VisualizationEditor")
.find("table")
.should("exist");
cy.getByTestId("VisualizationEditor").find("table").should("exist");
cy.getByTestId("VisualizationEditor.Tabs.Colors").click();
cy.getByTestId("VisualizationEditor")
.find("table")
.should("exist");
cy.getByTestId("VisualizationEditor").find("table").should("exist");
cy.getByTestId("VisualizationEditor.Tabs.DataLabels").click();
cy.getByTestId("VisualizationEditor")
.getByTestId("Chart.DataLabels.ShowDataLabels")
.should("exist");
cy.getByTestId("VisualizationEditor").getByTestId("Chart.DataLabels.ShowDataLabels").should("exist");
chartSpecificTabbedEditorAssertionFn();
@@ -63,39 +47,29 @@ export function assertTabbedEditor(chartSpecificTabbedEditorAssertionFn = () =>
export function assertAxesAndAddLabels(xaxisLabel, yaxisLabel) {
cy.getByTestId("VisualizationEditor.Tabs.XAxis").click();
cy.getByTestId("Chart.XAxis.Type")
.contains(".ant-select-selection-item", "Auto Detect")
.should("exist");
cy.getByTestId("Chart.XAxis.Type").contains(".ant-select-selection-item", "Auto Detect").should("exist");
cy.getByTestId("Chart.XAxis.Name")
.clear()
.type(xaxisLabel);
cy.getByTestId("Chart.XAxis.Name").clear().type(xaxisLabel);
cy.getByTestId("VisualizationEditor.Tabs.YAxis").click();
cy.getByTestId("Chart.LeftYAxis.Type")
.contains(".ant-select-selection-item", "Linear")
.should("exist");
cy.getByTestId("Chart.LeftYAxis.Type").contains(".ant-select-selection-item", "Linear").should("exist");
cy.getByTestId("Chart.LeftYAxis.Name")
.clear()
.type(yaxisLabel);
cy.getByTestId("Chart.LeftYAxis.Name").clear().type(yaxisLabel);
cy.getByTestId("Chart.LeftYAxis.TickFormat")
.clear()
.type("+");
cy.getByTestId("Chart.LeftYAxis.TickFormat").clear().type("+");
cy.getByTestId("VisualizationEditor.Tabs.General").click();
}
export function createDashboardWithCharts(title, chartGetters, widgetsAssertionFn = () => {}) {
cy.createDashboard(title).then(dashboard => {
cy.createDashboard(title).then((dashboard) => {
const dashboardUrl = `/dashboards/${dashboard.id}`;
const widgetGetters = chartGetters.map(chartGetter => `${chartGetter}Widget`);
const widgetGetters = chartGetters.map((chartGetter) => `${chartGetter}Widget`);
chartGetters.forEach((chartGetter, i) => {
const position = { autoHeight: false, sizeY: 8, sizeX: 3, col: (i % 2) * 3 };
cy.get(`@${chartGetter}`)
.then(chart => cy.addWidget(dashboard.id, chart.id, { position }))
.then((chart) => cy.addWidget(dashboard.id, chart.id, { position }))
.as(widgetGetters[i]);
});

View File

@@ -1,12 +1,10 @@
export function expectTableToHaveLength(length) {
cy.getByTestId("TableVisualization")
.find("tbody tr")
.should("have.length", length);
cy.getByTestId("TableVisualization").find("tbody tr").should("have.length", length);
}
export function expectFirstColumnToHaveMembers(values) {
cy.getByTestId("TableVisualization")
.find("tbody tr td:first-child")
.then($cell => Cypress.$.map($cell, item => Cypress.$(item).text()))
.then(firstColumnCells => expect(firstColumnCells).to.have.members(values));
.then(($cell) => Cypress.$.map($cell, (item) => Cypress.$(item).text()))
.then((firstColumnCells) => expect(firstColumnCells).to.have.members(values));
}
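Typical use of the two helpers above in a spec (illustrative values):

```
// Illustrative only: assert the row count, then the first-column contents.
expectTableToHaveLength(3);
expectFirstColumnToHaveMembers(["a", "b", "c"]);
```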

View File

@@ -1,24 +0,0 @@
services:
.redash:
build:
context: .
args:
FRONTEND_BUILD_MODE: ${FRONTEND_BUILD_MODE:-2}
INSTALL_GROUPS: ${INSTALL_GROUPS:-main,all_ds,dev}
volumes:
- $PWD:${SERVER_MOUNT:-/ignore}
command: manage version
environment:
REDASH_LOG_LEVEL: INFO
REDASH_REDIS_URL: redis://redis:6379/0
REDASH_DATABASE_URL: postgresql://postgres@postgres/postgres
REDASH_RATELIMIT_ENABLED: false
REDASH_MAIL_DEFAULT_SENDER: redash@example.com
REDASH_MAIL_SERVER: email
REDASH_MAIL_PORT: 1025
REDASH_ENFORCE_CSRF: true
REDASH_COOKIE_SECRET: ${REDASH_COOKIE_SECRET}
REDASH_SECRET_KEY: ${REDASH_SECRET_KEY}
REDASH_PRODUCTION: ${REDASH_PRODUCTION:-true}
env_file:
- .env

View File

@@ -1,81 +1,71 @@
# This configuration file is for the **development** setup.
# For a production example please refer to getredash/setup repository on GitHub.
x-redash-service: &redash-service
build:
context: .
args:
skip_frontend_build: "true" # set to empty string to build
volumes:
- .:/app
env_file:
- .env
x-redash-environment: &redash-environment
REDASH_HOST: http://localhost:5001
REDASH_LOG_LEVEL: "INFO"
REDASH_REDIS_URL: "redis://redis:6379/0"
REDASH_DATABASE_URL: "postgresql://postgres@postgres/postgres"
REDASH_RATELIMIT_ENABLED: "false"
REDASH_MAIL_DEFAULT_SENDER: "redash@example.com"
REDASH_MAIL_SERVER: "email"
REDASH_MAIL_PORT: 1025
REDASH_ENFORCE_CSRF: "true"
REDASH_GUNICORN_TIMEOUT: 60
# Set secret keys in the .env file
services:
server:
extends:
file: compose.base.yaml
service: .redash
command: server
<<: *redash-service
command: dev_server
depends_on:
- postgres
- redis
ports:
- "${REDASH_PORT:-5001}:5000"
- "5001:5000"
- "5678:5678"
environment:
<<: *redash-environment
PYTHONUNBUFFERED: 0
scheduler:
extends:
file: compose.base.yaml
service: .redash
profiles:
- e2e
- local
command: scheduler
depends_on:
- server
worker:
extends:
file: compose.base.yaml
service: .redash
profiles:
- e2e
- local
command: worker
<<: *redash-service
command: dev_scheduler
depends_on:
- server
environment:
<<: *redash-environment
worker:
<<: *redash-service
command: dev_worker
depends_on:
- server
environment:
<<: *redash-environment
PYTHONUNBUFFERED: 0
redis:
image: redis:7-alpine
restart: unless-stopped
postgres:
image: postgres:16-alpine
image: pgautoupgrade/pgautoupgrade:latest
ports:
- "${POSTGRES_PORT:-15432}:5432"
- "15432:5432"
# The following makes the DB less durable, but gains significant performance improvements for the test runs (x3
# improvement on my personal machine). We should consider moving this into a dedicated Docker Compose configuration for
# tests.
command: postgres -c fsync=off -c full_page_writes=off -c synchronous_commit=OFF
command: "postgres -c fsync=off -c full_page_writes=off -c synchronous_commit=OFF"
restart: unless-stopped
environment:
POSTGRES_HOST_AUTH_METHOD: trust
POSTGRES_HOST_AUTH_METHOD: "trust"
email:
image: maildev/maildev
ports:
- "1080:1080"
- "1025:1025"
restart: unless-stopped
cypress:
ipc: host
build:
context: .
dockerfile: Dockerfile.cypress
profiles:
- e2e
depends_on:
- server
- worker
- scheduler
environment:
CYPRESS_baseUrl: http://server:5000
PERCY_TOKEN: ${PERCY_TOKEN:-""}
PERCY_BRANCH: ${PERCY_BRANCH:-""}
PERCY_COMMIT: ${PERCY_COMMIT:-""}
PERCY_PULL_REQUEST: ${PERCY_PULL_REQUEST:-}
COMMIT_INFO_BRANCH: ${COMMIT_INFO_BRANCH:-""}
COMMIT_INFO_MESSAGE: ${COMMIT_INFO_MESSAGE:-""}
COMMIT_INFO_AUTHOR: ${COMMIT_INFO_AUTHOR:-""}
COMMIT_INFO_SHA: ${COMMIT_INFO_SHA:-""}
COMMIT_INFO_REMOTE: ${COMMIT_INFO_REMOTE:-""}
CYPRESS_PROJECT_ID: ${CYPRESS_PROJECT_ID:-""}
CYPRESS_RECORD_KEY: ${CYPRESS_RECORD_KEY:-""}
CYPRESS_COVERAGE: ${CYPRESS_COVERAGE:-true}

View File

@@ -24,56 +24,56 @@ def upgrade():
type_=JSONB(astext_type=sa.Text()),
nullable=True,
postgresql_using='options::jsonb',
server_default=sa.text("'{}'::jsonb"))
)
op.alter_column('queries', 'schedule',
existing_type=sa.Text(),
type_=JSONB(astext_type=sa.Text()),
nullable=True,
postgresql_using='schedule::jsonb',
server_default=sa.text("'{}'::jsonb"))
)
op.alter_column('events', 'additional_properties',
existing_type=sa.Text(),
type_=JSONB(astext_type=sa.Text()),
nullable=True,
postgresql_using='additional_properties::jsonb',
server_default=sa.text("'{}'::jsonb"))
)
op.alter_column('organizations', 'settings',
existing_type=sa.Text(),
type_=JSONB(astext_type=sa.Text()),
nullable=True,
postgresql_using='settings::jsonb',
server_default=sa.text("'{}'::jsonb"))
)
op.alter_column('alerts', 'options',
existing_type=JSON(astext_type=sa.Text()),
type_=JSONB(astext_type=sa.Text()),
nullable=True,
postgresql_using='options::jsonb',
server_default=sa.text("'{}'::jsonb"))
)
op.alter_column('dashboards', 'options',
existing_type=JSON(astext_type=sa.Text()),
type_=JSONB(astext_type=sa.Text()),
postgresql_using='options::jsonb',
server_default=sa.text("'{}'::jsonb"))
)
op.alter_column('dashboards', 'layout',
existing_type=sa.Text(),
type_=JSONB(astext_type=sa.Text()),
postgresql_using='layout::jsonb',
server_default=sa.text("'{}'::jsonb"))
)
op.alter_column('changes', 'change',
existing_type=JSON(astext_type=sa.Text()),
type_=JSONB(astext_type=sa.Text()),
postgresql_using='change::jsonb',
server_default=sa.text("'{}'::jsonb"))
)
op.alter_column('visualizations', 'options',
existing_type=sa.Text(),
type_=JSONB(astext_type=sa.Text()),
postgresql_using='options::jsonb',
server_default=sa.text("'{}'::jsonb"))
)
op.alter_column('widgets', 'options',
existing_type=sa.Text(),
type_=JSONB(astext_type=sa.Text()),
postgresql_using='options::jsonb',
server_default=sa.text("'{}'::jsonb"))
)
def downgrade():
@@ -83,53 +83,53 @@ def downgrade():
type_=sa.Text(),
postgresql_using='options::text',
existing_nullable=True,
server_default=sa.text("'{}'::text"))
)
op.alter_column('queries', 'schedule',
existing_type=JSONB(astext_type=sa.Text()),
type_=sa.Text(),
postgresql_using='schedule::text',
existing_nullable=True,
server_default=sa.text("'{}'::text"))
)
op.alter_column('events', 'additional_properties',
existing_type=JSONB(astext_type=sa.Text()),
type_=sa.Text(),
postgresql_using='additional_properties::text',
existing_nullable=True,
server_default=sa.text("'{}'::text"))
)
op.alter_column('organizations', 'settings',
existing_type=JSONB(astext_type=sa.Text()),
type_=sa.Text(),
postgresql_using='settings::text',
existing_nullable=True,
server_default=sa.text("'{}'::text"))
)
op.alter_column('alerts', 'options',
existing_type=JSONB(astext_type=sa.Text()),
type_=JSON(astext_type=sa.Text()),
postgresql_using='options::json',
existing_nullable=True,
server_default=sa.text("'{}'::json"))
)
op.alter_column('dashboards', 'options',
existing_type=JSONB(astext_type=sa.Text()),
type_=JSON(astext_type=sa.Text()),
postgresql_using='options::json',
server_default=sa.text("'{}'::json"))
)
op.alter_column('dashboards', 'layout',
existing_type=JSONB(astext_type=sa.Text()),
type_=sa.Text(),
postgresql_using='layout::text',
server_default=sa.text("'{}'::text"))
)
op.alter_column('changes', 'change',
existing_type=JSONB(astext_type=sa.Text()),
type_=JSON(astext_type=sa.Text()),
postgresql_using='change::json',
server_default=sa.text("'{}'::json"))
)
op.alter_column('visualizations', 'options',
type_=sa.Text(),
existing_type=JSONB(astext_type=sa.Text()),
postgresql_using='options::text',
server_default=sa.text("'{}'::text"))
)
op.alter_column('widgets', 'options',
type_=sa.Text(),
existing_type=JSONB(astext_type=sa.Text()),
postgresql_using='options::text',
server_default=sa.text("'{}'::text"))
)

View File

@@ -15,6 +15,7 @@ from redash import settings
from redash.utils.configuration import ConfigurationContainer
from redash.models.types import (
EncryptedConfiguration,
Configuration,
MutableDict,
MutableList,
)
@@ -44,14 +45,7 @@ def upgrade():
)
),
),
sa.Column(
"options",
ConfigurationContainer.as_mutable(
EncryptedConfiguration(
sa.Text, settings.DATASOURCE_SECRET_KEY, FernetEngine
)
),
),
sa.Column("options", ConfigurationContainer.as_mutable(Configuration)),
)
conn = op.get_bind()

View File

@@ -0,0 +1,64 @@
"""fix_hash
Revision ID: 9e8c841d1a30
Revises: 7205816877ec
Create Date: 2024-10-05 18:55:35.730573
"""
import logging
from alembic import op
import sqlalchemy as sa
from sqlalchemy.sql import table
from sqlalchemy import select
from redash.query_runner import BaseQueryRunner, get_query_runner
# revision identifiers, used by Alembic.
revision = '9e8c841d1a30'
down_revision = '7205816877ec'
branch_labels = None
depends_on = None
def update_query_hash(record):
should_apply_auto_limit = record['options'].get("apply_auto_limit", False) if record['options'] else False
query_runner = get_query_runner(record['type'], {}) if record['type'] else BaseQueryRunner({})
query_text = record['query']
parameters_dict = {p["name"]: p.get("value") for p in record['options'].get('parameters', [])} if record.options else {}
if any(parameters_dict):
print(f"Query {record['query_id']} has parameters. Hash might be incorrect.")
return query_runner.gen_query_hash(query_text, should_apply_auto_limit)
def upgrade():
conn = op.get_bind()
metadata = sa.MetaData(bind=conn)
queries = sa.Table("queries", metadata, autoload=True)
data_sources = sa.Table("data_sources", metadata, autoload=True)
joined_table = queries.outerjoin(data_sources, queries.c.data_source_id == data_sources.c.id)
query = select([
queries.c.id.label("query_id"),
queries.c.query,
queries.c.query_hash,
queries.c.options,
data_sources.c.id.label("data_source_id"),
data_sources.c.type
]).select_from(joined_table)
for record in conn.execute(query):
new_hash = update_query_hash(record)
print(f"Updating hash for query {record['query_id']} from {record['query_hash']} to {new_hash}")
conn.execute(
queries.update()
.where(queries.c.id == record['query_id'])
.values(query_hash=new_hash))
def downgrade():
pass

View File

@@ -14,7 +14,10 @@ from sqlalchemy_utils.types.encrypted.encrypted_type import FernetEngine
from redash import settings
from redash.utils.configuration import ConfigurationContainer
from redash.models.base import key_type
from redash.models.types import EncryptedConfiguration
from redash.models.types import (
EncryptedConfiguration,
Configuration,
)
# revision identifiers, used by Alembic.
@@ -42,14 +45,7 @@ def upgrade():
)
),
),
sa.Column(
"options",
ConfigurationContainer.as_mutable(
EncryptedConfiguration(
sa.Text, settings.DATASOURCE_SECRET_KEY, FernetEngine
)
),
),
sa.Column("options", ConfigurationContainer.as_mutable(Configuration)),
)
conn = op.get_bind()

View File

@@ -1,6 +1,6 @@
{
"name": "redash-client",
"version": "24.05.0-dev",
"version": "25.03.0-dev",
"description": "The frontend part of Redash.",
"main": "index.js",
"scripts": {
@@ -24,7 +24,7 @@
"jest": "TZ=Africa/Khartoum jest",
"test": "run-s type-check jest",
"test:watch": "jest --watch",
"cypress": "COMPOSE_PROFILES=local node client/cypress/cypress.js",
"cypress": "node client/cypress/cypress.js",
"preinstall": "cd viz-lib && yarn link --link-folder ../.yarn",
"postinstall": "(cd viz-lib && yarn --frozen-lockfile && yarn build:babel) && yarn link --link-folder ./.yarn @redash/viz"
},
@@ -50,11 +50,12 @@
"antd": "^4.4.3",
"axios": "0.27.2",
"axios-auth-refresh": "3.3.6",
"bootstrap": "^3.3.7",
"bootstrap": "^3.4.1",
"classnames": "^2.2.6",
"d3": "^3.5.17",
"debug": "^3.2.7",
"dompurify": "^2.0.17",
"elliptic": "^6.6.0",
"font-awesome": "^4.7.0",
"history": "^4.10.1",
"hoist-non-react-statics": "^3.3.0",
@@ -63,7 +64,7 @@
"mousetrap": "^1.6.1",
"mustache": "^2.3.0",
"numeral": "^2.0.6",
"path-to-regexp": "^3.1.0",
"path-to-regexp": "^3.3.0",
"prop-types": "^15.6.1",
"query-string": "^6.9.0",
"react": "16.14.0",

poetry.lock generated
View File

@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
# This file is automatically @generated by Poetry 1.8.5 and should not be changed by hand.
[[package]]
name = "adal"
@@ -515,13 +515,13 @@ graph = ["gremlinpython (==3.3.4)"]
[[package]]
name = "certifi"
version = "2023.11.17"
version = "2024.7.4"
description = "Python package for providing Mozilla's CA Bundle."
optional = false
python-versions = ">=3.6"
files = [
{file = "certifi-2023.11.17-py3-none-any.whl", hash = "sha256:e036ab49d5b79556f99cfc2d9320b34cfbe5be05c5871b51de9329f0603b0474"},
{file = "certifi-2023.11.17.tar.gz", hash = "sha256:9b469f3a900bf28dc19b8cfbf8019bf47f7fdd1a65a1d4ffb98fc14166beb4d1"},
{file = "certifi-2024.7.4-py3-none-any.whl", hash = "sha256:c198e21b1289c2ab85ee4e67bb4b4ef3ead0892059901a8d5b622f24a1101e90"},
{file = "certifi-2024.7.4.tar.gz", hash = "sha256:5a1e7645bc0ec61a09e26c36f6106dd4cf40c6db3a1fb6352b0244e7fb057c7b"},
]
[[package]]
@@ -891,47 +891,51 @@ files = [
[[package]]
name = "cryptography"
version = "41.0.6"
version = "43.0.1"
description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers."
optional = false
python-versions = ">=3.7"
files = [
{file = "cryptography-41.0.6-cp37-abi3-macosx_10_12_universal2.whl", hash = "sha256:0f27acb55a4e77b9be8d550d762b0513ef3fc658cd3eb15110ebbcbd626db12c"},
{file = "cryptography-41.0.6-cp37-abi3-macosx_10_12_x86_64.whl", hash = "sha256:ae236bb8760c1e55b7a39b6d4d32d2279bc6c7c8500b7d5a13b6fb9fc97be35b"},
{file = "cryptography-41.0.6-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:afda76d84b053923c27ede5edc1ed7d53e3c9f475ebaf63c68e69f1403c405a8"},
{file = "cryptography-41.0.6-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:da46e2b5df770070412c46f87bac0849b8d685c5f2679771de277a422c7d0b86"},
{file = "cryptography-41.0.6-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:ff369dd19e8fe0528b02e8df9f2aeb2479f89b1270d90f96a63500afe9af5cae"},
{file = "cryptography-41.0.6-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:b648fe2a45e426aaee684ddca2632f62ec4613ef362f4d681a9a6283d10e079d"},
{file = "cryptography-41.0.6-cp37-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:5daeb18e7886a358064a68dbcaf441c036cbdb7da52ae744e7b9207b04d3908c"},
{file = "cryptography-41.0.6-cp37-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:068bc551698c234742c40049e46840843f3d98ad7ce265fd2bd4ec0d11306596"},
{file = "cryptography-41.0.6-cp37-abi3-win32.whl", hash = "sha256:2132d5865eea673fe6712c2ed5fb4fa49dba10768bb4cc798345748380ee3660"},
{file = "cryptography-41.0.6-cp37-abi3-win_amd64.whl", hash = "sha256:48783b7e2bef51224020efb61b42704207dde583d7e371ef8fc2a5fb6c0aabc7"},
{file = "cryptography-41.0.6-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:8efb2af8d4ba9dbc9c9dd8f04d19a7abb5b49eab1f3694e7b5a16a5fc2856f5c"},
{file = "cryptography-41.0.6-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:c5a550dc7a3b50b116323e3d376241829fd326ac47bc195e04eb33a8170902a9"},
{file = "cryptography-41.0.6-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:85abd057699b98fce40b41737afb234fef05c67e116f6f3650782c10862c43da"},
{file = "cryptography-41.0.6-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:f39812f70fc5c71a15aa3c97b2bbe213c3f2a460b79bd21c40d033bb34a9bf36"},
{file = "cryptography-41.0.6-pp38-pypy38_pp73-macosx_10_12_x86_64.whl", hash = "sha256:742ae5e9a2310e9dade7932f9576606836ed174da3c7d26bc3d3ab4bd49b9f65"},
{file = "cryptography-41.0.6-pp38-pypy38_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:35f3f288e83c3f6f10752467c48919a7a94b7d88cc00b0668372a0d2ad4f8ead"},
{file = "cryptography-41.0.6-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:4d03186af98b1c01a4eda396b137f29e4e3fb0173e30f885e27acec8823c1b09"},
{file = "cryptography-41.0.6-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:b27a7fd4229abef715e064269d98a7e2909ebf92eb6912a9603c7e14c181928c"},
{file = "cryptography-41.0.6-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:398ae1fc711b5eb78e977daa3cbf47cec20f2c08c5da129b7a296055fbb22aed"},
{file = "cryptography-41.0.6-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:7e00fb556bda398b99b0da289ce7053639d33b572847181d6483ad89835115f6"},
{file = "cryptography-41.0.6-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:60e746b11b937911dc70d164060d28d273e31853bb359e2b2033c9e93e6f3c43"},
{file = "cryptography-41.0.6-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:3288acccef021e3c3c10d58933f44e8602cf04dba96d9796d70d537bb2f4bbc4"},
{file = "cryptography-41.0.6.tar.gz", hash = "sha256:422e3e31d63743855e43e5a6fcc8b4acab860f560f9321b0ee6269cc7ed70cc3"},
{file = "cryptography-43.0.1-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:8385d98f6a3bf8bb2d65a73e17ed87a3ba84f6991c155691c51112075f9ffc5d"},
{file = "cryptography-43.0.1-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:27e613d7077ac613e399270253259d9d53872aaf657471473ebfc9a52935c062"},
{file = "cryptography-43.0.1-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:68aaecc4178e90719e95298515979814bda0cbada1256a4485414860bd7ab962"},
{file = "cryptography-43.0.1-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:de41fd81a41e53267cb020bb3a7212861da53a7d39f863585d13ea11049cf277"},
{file = "cryptography-43.0.1-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f98bf604c82c416bc829e490c700ca1553eafdf2912a91e23a79d97d9801372a"},
{file = "cryptography-43.0.1-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:61ec41068b7b74268fa86e3e9e12b9f0c21fcf65434571dbb13d954bceb08042"},
{file = "cryptography-43.0.1-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:014f58110f53237ace6a408b5beb6c427b64e084eb451ef25a28308270086494"},
{file = "cryptography-43.0.1-cp37-abi3-win32.whl", hash = "sha256:2bd51274dcd59f09dd952afb696bf9c61a7a49dfc764c04dd33ef7a6b502a1e2"},
{file = "cryptography-43.0.1-cp37-abi3-win_amd64.whl", hash = "sha256:666ae11966643886c2987b3b721899d250855718d6d9ce41b521252a17985f4d"},
{file = "cryptography-43.0.1-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:ac119bb76b9faa00f48128b7f5679e1d8d437365c5d26f1c2c3f0da4ce1b553d"},
{file = "cryptography-43.0.1-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1bbcce1a551e262dfbafb6e6252f1ae36a248e615ca44ba302df077a846a8806"},
{file = "cryptography-43.0.1-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:58d4e9129985185a06d849aa6df265bdd5a74ca6e1b736a77959b498e0505b85"},
{file = "cryptography-43.0.1-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:d03a475165f3134f773d1388aeb19c2d25ba88b6a9733c5c590b9ff7bbfa2e0c"},
{file = "cryptography-43.0.1-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:511f4273808ab590912a93ddb4e3914dfd8a388fed883361b02dea3791f292e1"},
{file = "cryptography-43.0.1-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:80eda8b3e173f0f247f711eef62be51b599b5d425c429b5d4ca6a05e9e856baa"},
{file = "cryptography-43.0.1-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:38926c50cff6f533f8a2dae3d7f19541432610d114a70808f0926d5aaa7121e4"},
{file = "cryptography-43.0.1-cp39-abi3-win32.whl", hash = "sha256:a575913fb06e05e6b4b814d7f7468c2c660e8bb16d8d5a1faf9b33ccc569dd47"},
{file = "cryptography-43.0.1-cp39-abi3-win_amd64.whl", hash = "sha256:d75601ad10b059ec832e78823b348bfa1a59f6b8d545db3a24fd44362a1564cb"},
{file = "cryptography-43.0.1-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:ea25acb556320250756e53f9e20a4177515f012c9eaea17eb7587a8c4d8ae034"},
{file = "cryptography-43.0.1-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:c1332724be35d23a854994ff0b66530119500b6053d0bd3363265f7e5e77288d"},
{file = "cryptography-43.0.1-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:fba1007b3ef89946dbbb515aeeb41e30203b004f0b4b00e5e16078b518563289"},
{file = "cryptography-43.0.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:5b43d1ea6b378b54a1dc99dd8a2b5be47658fe9a7ce0a58ff0b55f4b43ef2b84"},
{file = "cryptography-43.0.1-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:88cce104c36870d70c49c7c8fd22885875d950d9ee6ab54df2745f83ba0dc365"},
{file = "cryptography-43.0.1-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:9d3cdb25fa98afdd3d0892d132b8d7139e2c087da1712041f6b762e4f807cc96"},
{file = "cryptography-43.0.1-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:e710bf40870f4db63c3d7d929aa9e09e4e7ee219e703f949ec4073b4294f6172"},
{file = "cryptography-43.0.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:7c05650fe8023c5ed0d46793d4b7d7e6cd9c04e68eabe5b0aeea836e37bdcec2"},
{file = "cryptography-43.0.1.tar.gz", hash = "sha256:203e92a75716d8cfb491dc47c79e17d0d9207ccffcbcb35f598fbe463ae3444d"},
]
[package.dependencies]
cffi = ">=1.12"
cffi = {version = ">=1.12", markers = "platform_python_implementation != \"PyPy\""}
[package.extras]
docs = ["sphinx (>=5.3.0)", "sphinx-rtd-theme (>=1.1.1)"]
docstest = ["pyenchant (>=1.6.11)", "sphinxcontrib-spelling (>=4.0.1)", "twine (>=1.12.0)"]
docstest = ["pyenchant (>=1.6.11)", "readme-renderer", "sphinxcontrib-spelling (>=4.0.1)"]
nox = ["nox"]
pep8test = ["black", "check-sdist", "mypy", "ruff"]
pep8test = ["check-sdist", "click", "mypy", "ruff"]
sdist = ["build"]
ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test = ["certifi", "cryptography-vectors (==43.0.1)", "pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
[[package]]
@@ -970,6 +974,41 @@ sqlalchemy = "*"
sqlalchemy = ["sqlalchemy (>1.3.21,<2.0)"]
superset = ["apache-superset (>=1.4.1)"]
[[package]]
name = "debugpy"
version = "1.8.9"
description = "An implementation of the Debug Adapter Protocol for Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "debugpy-1.8.9-cp310-cp310-macosx_14_0_x86_64.whl", hash = "sha256:cfe1e6c6ad7178265f74981edf1154ffce97b69005212fbc90ca22ddfe3d017e"},
{file = "debugpy-1.8.9-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ada7fb65102a4d2c9ab62e8908e9e9f12aed9d76ef44880367bc9308ebe49a0f"},
{file = "debugpy-1.8.9-cp310-cp310-win32.whl", hash = "sha256:c36856343cbaa448171cba62a721531e10e7ffb0abff838004701454149bc037"},
{file = "debugpy-1.8.9-cp310-cp310-win_amd64.whl", hash = "sha256:17c5e0297678442511cf00a745c9709e928ea4ca263d764e90d233208889a19e"},
{file = "debugpy-1.8.9-cp311-cp311-macosx_14_0_universal2.whl", hash = "sha256:b74a49753e21e33e7cf030883a92fa607bddc4ede1aa4145172debc637780040"},
{file = "debugpy-1.8.9-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:62d22dacdb0e296966d7d74a7141aaab4bec123fa43d1a35ddcb39bf9fd29d70"},
{file = "debugpy-1.8.9-cp311-cp311-win32.whl", hash = "sha256:8138efff315cd09b8dcd14226a21afda4ca582284bf4215126d87342bba1cc66"},
{file = "debugpy-1.8.9-cp311-cp311-win_amd64.whl", hash = "sha256:ff54ef77ad9f5c425398efb150239f6fe8e20c53ae2f68367eba7ece1e96226d"},
{file = "debugpy-1.8.9-cp312-cp312-macosx_14_0_universal2.whl", hash = "sha256:957363d9a7a6612a37458d9a15e72d03a635047f946e5fceee74b50d52a9c8e2"},
{file = "debugpy-1.8.9-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5e565fc54b680292b418bb809f1386f17081d1346dca9a871bf69a8ac4071afe"},
{file = "debugpy-1.8.9-cp312-cp312-win32.whl", hash = "sha256:3e59842d6c4569c65ceb3751075ff8d7e6a6ada209ceca6308c9bde932bcef11"},
{file = "debugpy-1.8.9-cp312-cp312-win_amd64.whl", hash = "sha256:66eeae42f3137eb428ea3a86d4a55f28da9bd5a4a3d369ba95ecc3a92c1bba53"},
{file = "debugpy-1.8.9-cp313-cp313-macosx_14_0_universal2.whl", hash = "sha256:957ecffff80d47cafa9b6545de9e016ae8c9547c98a538ee96ab5947115fb3dd"},
{file = "debugpy-1.8.9-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1efbb3ff61487e2c16b3e033bc8595aea578222c08aaf3c4bf0f93fadbd662ee"},
{file = "debugpy-1.8.9-cp313-cp313-win32.whl", hash = "sha256:7c4d65d03bee875bcb211c76c1d8f10f600c305dbd734beaed4077e902606fee"},
{file = "debugpy-1.8.9-cp313-cp313-win_amd64.whl", hash = "sha256:e46b420dc1bea64e5bbedd678148be512442bc589b0111bd799367cde051e71a"},
{file = "debugpy-1.8.9-cp38-cp38-macosx_14_0_x86_64.whl", hash = "sha256:472a3994999fe6c0756945ffa359e9e7e2d690fb55d251639d07208dbc37caea"},
{file = "debugpy-1.8.9-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:365e556a4772d7d0d151d7eb0e77ec4db03bcd95f26b67b15742b88cacff88e9"},
{file = "debugpy-1.8.9-cp38-cp38-win32.whl", hash = "sha256:54a7e6d3014c408eb37b0b06021366ee985f1539e12fe49ca2ee0d392d9ceca5"},
{file = "debugpy-1.8.9-cp38-cp38-win_amd64.whl", hash = "sha256:8e99c0b1cc7bf86d83fb95d5ccdc4ad0586d4432d489d1f54e4055bcc795f693"},
{file = "debugpy-1.8.9-cp39-cp39-macosx_14_0_x86_64.whl", hash = "sha256:7e8b079323a56f719977fde9d8115590cb5e7a1cba2fcee0986ef8817116e7c1"},
{file = "debugpy-1.8.9-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6953b335b804a41f16a192fa2e7851bdcfd92173cbb2f9f777bb934f49baab65"},
{file = "debugpy-1.8.9-cp39-cp39-win32.whl", hash = "sha256:7e646e62d4602bb8956db88b1e72fe63172148c1e25c041e03b103a25f36673c"},
{file = "debugpy-1.8.9-cp39-cp39-win_amd64.whl", hash = "sha256:3d9755e77a2d680ce3d2c5394a444cf42be4a592caaf246dbfbdd100ffcf7ae5"},
{file = "debugpy-1.8.9-py2.py3-none-any.whl", hash = "sha256:cc37a6c9987ad743d9c3a14fa1b1a14b7e4e6041f9dd0c8abf8895fe7a97b899"},
{file = "debugpy-1.8.9.zip", hash = "sha256:1339e14c7d980407248f09824d1b25ff5c5616651689f1e0f0e51bdead3ea13e"},
]
[[package]]
name = "defusedxml"
version = "0.7.1"
@@ -1312,6 +1351,45 @@ files = [
[package.dependencies]
python-dateutil = ">=2.7"
[[package]]
name = "fsspec"
version = "2024.10.0"
description = "File-system specification"
optional = false
python-versions = ">=3.8"
files = [
{file = "fsspec-2024.10.0-py3-none-any.whl", hash = "sha256:03b9a6785766a4de40368b88906366755e2819e758b83705c88cd7cb5fe81871"},
{file = "fsspec-2024.10.0.tar.gz", hash = "sha256:eda2d8a4116d4f2429db8550f2457da57279247dd930bb12f821b58391359493"},
]
[package.extras]
abfs = ["adlfs"]
adl = ["adlfs"]
arrow = ["pyarrow (>=1)"]
dask = ["dask", "distributed"]
dev = ["pre-commit", "ruff"]
doc = ["numpydoc", "sphinx", "sphinx-design", "sphinx-rtd-theme", "yarl"]
dropbox = ["dropbox", "dropboxdrivefs", "requests"]
full = ["adlfs", "aiohttp (!=4.0.0a0,!=4.0.0a1)", "dask", "distributed", "dropbox", "dropboxdrivefs", "fusepy", "gcsfs", "libarchive-c", "ocifs", "panel", "paramiko", "pyarrow (>=1)", "pygit2", "requests", "s3fs", "smbprotocol", "tqdm"]
fuse = ["fusepy"]
gcs = ["gcsfs"]
git = ["pygit2"]
github = ["requests"]
gs = ["gcsfs"]
gui = ["panel"]
hdfs = ["pyarrow (>=1)"]
http = ["aiohttp (!=4.0.0a0,!=4.0.0a1)"]
libarchive = ["libarchive-c"]
oci = ["ocifs"]
s3 = ["s3fs"]
sftp = ["paramiko"]
smb = ["smbprotocol"]
ssh = ["paramiko"]
test = ["aiohttp (!=4.0.0a0,!=4.0.0a1)", "numpy", "pytest", "pytest-asyncio (!=0.22.0)", "pytest-benchmark", "pytest-cov", "pytest-mock", "pytest-recording", "pytest-rerunfailures", "requests"]
test-downstream = ["aiobotocore (>=2.5.4,<3.0.0)", "dask-expr", "dask[dataframe,test]", "moto[server] (>4,<5)", "pytest-timeout", "xarray"]
test-full = ["adlfs", "aiohttp (!=4.0.0a0,!=4.0.0a1)", "cloudpickle", "dask", "distributed", "dropbox", "dropboxdrivefs", "fastparquet", "fusepy", "gcsfs", "jinja2", "kerchunk", "libarchive-c", "lz4", "notebook", "numpy", "ocifs", "pandas", "panel", "paramiko", "pyarrow", "pyarrow (>=1)", "pyftpdlib", "pygit2", "pytest", "pytest-asyncio (!=0.22.0)", "pytest-benchmark", "pytest-cov", "pytest-mock", "pytest-recording", "pytest-rerunfailures", "python-snappy", "requests", "smbprotocol", "tqdm", "urllib3", "zarr", "zstandard"]
tqdm = ["tqdm"]
[[package]]
name = "funcy"
version = "1.13"
@@ -1984,13 +2062,13 @@ testing = ["Django", "attrs", "colorama", "docopt", "pytest (<7.0.0)"]
[[package]]
name = "jinja2"
version = "3.1.3"
version = "3.1.5"
description = "A very fast and expressive template engine."
optional = false
python-versions = ">=3.7"
files = [
{file = "Jinja2-3.1.3-py3-none-any.whl", hash = "sha256:7d6d50dd97d52cbc355597bd845fabfbac3f551e1f99619e39a35ce8c370b5fa"},
{file = "Jinja2-3.1.3.tar.gz", hash = "sha256:ac8bd6544d4bb2c9792bf3a159e80bba8fda7f07e81bc3aed565432d5925ba90"},
{file = "jinja2-3.1.5-py3-none-any.whl", hash = "sha256:aba0f4dc9ed8013c424088f68a5c226f7d6097ed89b246d7749c2ec4175c6adb"},
{file = "jinja2-3.1.5.tar.gz", hash = "sha256:8fefff8dc3034e27bb80d67c671eb8a9bc424c0ef4c0826edbff304cceff43bb"},
]
[package.dependencies]
@@ -2638,42 +2716,42 @@ et-xmlfile = "*"
[[package]]
name = "oracledb"
version = "2.1.2"
version = "2.5.1"
description = "Python interface to Oracle Database"
optional = false
python-versions = ">=3.7"
python-versions = ">=3.8"
files = [
{file = "oracledb-2.1.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:4ffaba9504c638c29129b484cf547accf750bd0f86df1ca6194646a4d2540691"},
{file = "oracledb-2.1.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:71d98deb1e3a500920f5460d457925f0c8cef8d037881fdbd16df1c4734453dd"},
{file = "oracledb-2.1.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bde2df672704fbe12ab0653f6e808b1ed62de28c6864b17fc3a1fcac9c1fd472"},
{file = "oracledb-2.1.2-cp310-cp310-win32.whl", hash = "sha256:3b3798a1220fc8736a37b9280d0ae4cdf263bb203fc6e2b3a82c33f9a2010702"},
{file = "oracledb-2.1.2-cp310-cp310-win_amd64.whl", hash = "sha256:92620efd5eb0d23b252d75f2f2ff1deadf25f44546903e3283760cb276d524ed"},
{file = "oracledb-2.1.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:b913a164e1830d0e955b88d97c5e4da4d2402f8a8b0d38febb6ad5a8ef9e4743"},
{file = "oracledb-2.1.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c53827344c6d001f492aee0a3acb6c1b6c0f3030c2f5dc8cb86dc4f0bb4dd1ab"},
{file = "oracledb-2.1.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:50225074841d5f9b281d620c012ced4b0946ff5a941c8b639be7babda5190709"},
{file = "oracledb-2.1.2-cp311-cp311-win32.whl", hash = "sha256:a043b4df2919411b787bcd24ffa4286249a11d05d29bb20bb076d108c3c6f777"},
{file = "oracledb-2.1.2-cp311-cp311-win_amd64.whl", hash = "sha256:9edce208c26ee018e43b75323888743031be3e9f0c0e4221abf037129c12d949"},
{file = "oracledb-2.1.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:08aa313b801dda950918168d3962ba59a617adce143e0c2bf1ee9b847695faaa"},
{file = "oracledb-2.1.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:de5c932b04d3bcdd22c71c0e5c5e1d16b6a3a2fc68dc472ee3a12e677461354c"},
{file = "oracledb-2.1.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1d590caf39b1901bcba394fcda9815438faff0afaf374025f89ef5d65993d0a4"},
{file = "oracledb-2.1.2-cp312-cp312-win32.whl", hash = "sha256:1e3ffdfe76c97d1ca13a3fecf239c96d3889015bb5b775dc22b947108044b01e"},
{file = "oracledb-2.1.2-cp312-cp312-win_amd64.whl", hash = "sha256:8c1eaf8c74bb6de5772de768f2f3f5eb935ab935c633d3a012ddff7e691a2073"},
{file = "oracledb-2.1.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:e2ee06e154e08cc5e4037855d74dc6e37dc054c91a7a1a372bb60d4442e2ed3d"},
{file = "oracledb-2.1.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1a21d84aaf5dddab0cfa8ab7c23272c0295a5c796f212a4ce8a6b499643663dd"},
{file = "oracledb-2.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b337f7cf30753c3a32302fbc25ca80d7ff5049dd9333e681236a674a90c21caf"},
{file = "oracledb-2.1.2-cp37-cp37m-win32.whl", hash = "sha256:b5d936763a9b26d32c4e460dbb346c2a962fcc98e6df33dd2d81fdc2eb26f1e4"},
{file = "oracledb-2.1.2-cp37-cp37m-win_amd64.whl", hash = "sha256:0ea32b87b7202811d85082f10bf7789747ce45f195be4199c5611e7d76a79e78"},
{file = "oracledb-2.1.2-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:f94b22da87e051e3a8620d2b04d99e1cc9d9abb4da6736d6ae0ca436ba03fb86"},
{file = "oracledb-2.1.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:391034ee66717dba514e765263d08d18a2aa7badde373f82599b89e46fa3720a"},
{file = "oracledb-2.1.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a2d9891244b9b94465e30af8cc79380bbb41081c5dc0511cbc94cc250e9e26d"},
{file = "oracledb-2.1.2-cp38-cp38-win32.whl", hash = "sha256:9a9a6e0bf61952c2c82614b98fe896d2cda17d81ffca4527556e6607b10e3365"},
{file = "oracledb-2.1.2-cp38-cp38-win_amd64.whl", hash = "sha256:d9a6447589f203ca846526c99a667537b099d54ddeff09d24f9da59bdcc8f98b"},
{file = "oracledb-2.1.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8eb688dd1f8ea2038d17bc84fb651aa1e994b155d3cb8b8387df70ab2a7b4c4c"},
{file = "oracledb-2.1.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0f22c31b894bb085a33d70e174c9bcd0abafc630c2c941ff0d630ee3852f1aa6"},
{file = "oracledb-2.1.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e5bc03520b8bd4dbf2ac4d937d298a85a7208ffbeec738eea92ad7bb00e7134a"},
{file = "oracledb-2.1.2-cp39-cp39-win32.whl", hash = "sha256:5d4f6bd1036d7edbb96d8d31f0ca53696a013c00ac82fc19ac0ca374d2265b2c"},
{file = "oracledb-2.1.2-cp39-cp39-win_amd64.whl", hash = "sha256:69bde9770392c1c859b1e1d767dbb9ca4c57e3f2946ca90c779d9402a7e96111"},
{file = "oracledb-2.1.2.tar.gz", hash = "sha256:3054bcc295d7378834ba7a5aceb865985e954915f9b07a843ea84c3824c6a0b2"},
{file = "oracledb-2.5.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:54ea7b4da179eb3fefad338685b44fed657a9cd733fb0bfc09d344cfb266355e"},
{file = "oracledb-2.5.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:05df7a5a61f4d26c986e235fae6f64a81afaac8f1dbef60e2e9ecf9236218e58"},
{file = "oracledb-2.5.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d17c80063375a5d87a7ab57c8343e5434a16ea74f7be3b56f9100300ef0b69d6"},
{file = "oracledb-2.5.1-cp310-cp310-win32.whl", hash = "sha256:51b3911ee822319e20f2e19d816351aac747591a59a0a96cf891c62c2a5c0c0d"},
{file = "oracledb-2.5.1-cp310-cp310-win_amd64.whl", hash = "sha256:e4e884625117e50b619c93828affbcffa594029ef8c8b40205394990e6af65a8"},
{file = "oracledb-2.5.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:85318350fa4837b7b637e436fa5f99c17919d6329065e64d1e18e5a7cae52457"},
{file = "oracledb-2.5.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:676c221227159d9cee25030c56ff9782f330115cb86164d92d3360f55b07654b"},
{file = "oracledb-2.5.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e78c6de57b4b5df7f932337c57e59b62e34fc4527d2460c0cab10c2ab01825f8"},
{file = "oracledb-2.5.1-cp311-cp311-win32.whl", hash = "sha256:0d5974327a1957538a144b073367104cdf8bb39cf056940995b75cb099535589"},
{file = "oracledb-2.5.1-cp311-cp311-win_amd64.whl", hash = "sha256:541bb5a107917b9d9eba1346318b42f8b6024e7dd3bef1451f0745364f03399c"},
{file = "oracledb-2.5.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:970a9420cc351d650cc6716122e9aa50cfb8c27f425ffc9d83651fd3edff6090"},
{file = "oracledb-2.5.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a6788c128af5a3a45689453fc4832f32b4a0dae2696d9917c7631a2e02865148"},
{file = "oracledb-2.5.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8778daa3f08639232341d802b95ca6da4c0c798c8530e4df331b3286d32e49d5"},
{file = "oracledb-2.5.1-cp312-cp312-win32.whl", hash = "sha256:a44613f3dfacb2b9462c3871ee333fa535fbd0ec21942e14019fcfd572487db0"},
{file = "oracledb-2.5.1-cp312-cp312-win_amd64.whl", hash = "sha256:934d02da80bfc030c644c5c43fbe58119dc170f15b4dfdb6fe04c220a1f8730d"},
{file = "oracledb-2.5.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:0374481329fa873a2af24eb12de4fd597c6c111e148065200562eb75ea0c6be7"},
{file = "oracledb-2.5.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:66e885de106701d1f2a630d19e183e491e4f1ccb8d78855f60396ba15856fb66"},
{file = "oracledb-2.5.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fcf446f6250d8edad5367ff03ad73dbbe672a2e4b060c51a774821dd723b0283"},
{file = "oracledb-2.5.1-cp313-cp313-win32.whl", hash = "sha256:b02b93199a7073e9b5687fe2dfa83d25ea102ab261c577f9d55820d5ef193dda"},
{file = "oracledb-2.5.1-cp313-cp313-win_amd64.whl", hash = "sha256:173b6d132b230f0617380272181e14fc53aec65aaffe68b557a9b6040716a267"},
{file = "oracledb-2.5.1-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:7d5efc94ce5bb657a5f43e2683e23cc4b4c53c4783e817759869472a113dac26"},
{file = "oracledb-2.5.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6919cb69638a7dda45380d6530b6f2f7fd21ea7bdf8d38936653f9ebc4f7e3d6"},
{file = "oracledb-2.5.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:44f5eb220945a6e092975ebcb9afc3f1eb10420d04d6bfeace1207ba86d60431"},
{file = "oracledb-2.5.1-cp38-cp38-win32.whl", hash = "sha256:aa6ce0dfc64dc7b30bcf477f978538ba82fa7060ecd7a1b9227925b471ae3b50"},
{file = "oracledb-2.5.1-cp38-cp38-win_amd64.whl", hash = "sha256:7a3115e4d445e3430d6f34083b7eed607309411f41472b66d145508f7b0c3770"},
{file = "oracledb-2.5.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8a2627a0d29390aaef7211c5b3f7182dfd8e76c969b39d57ee3e43c1057c6fe7"},
{file = "oracledb-2.5.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:730cd03e7fbf05acd32a221ead2a43020b3b91391597eaf728d724548f418b1b"},
{file = "oracledb-2.5.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f42524b586733daa896f675acad8b9f2fc2f4380656d60a22a109a573861fc93"},
{file = "oracledb-2.5.1-cp39-cp39-win32.whl", hash = "sha256:7958c7796df9f8c97484768c88817dec5c6d49220fc4cccdfde12a1a883f3d46"},
{file = "oracledb-2.5.1-cp39-cp39-win_amd64.whl", hash = "sha256:92e0d176e3c76a1916f4e34fc3d84994ad74cce6b8664656c4dbecb8fa7e8c37"},
{file = "oracledb-2.5.1.tar.gz", hash = "sha256:63d17ebb95f9129d0ab9386cb632c9e667e3be2c767278cc11a8e4585468de33"},
]
[package.dependencies]
@@ -2752,13 +2830,13 @@ test = ["hypothesis (>=3.58)", "pytest (>=6.0)", "pytest-xdist"]
[[package]]
name = "paramiko"
version = "3.4.0"
version = "3.4.1"
description = "SSH2 protocol library"
optional = false
python-versions = ">=3.6"
files = [
{file = "paramiko-3.4.0-py3-none-any.whl", hash = "sha256:43f0b51115a896f9c00f59618023484cb3a14b98bbceab43394a39c6739b7ee7"},
{file = "paramiko-3.4.0.tar.gz", hash = "sha256:aac08f26a31dc4dffd92821527d1682d99d52f9ef6851968114a8728f3c274d3"},
{file = "paramiko-3.4.1-py3-none-any.whl", hash = "sha256:8e49fd2f82f84acf7ffd57c64311aa2b30e575370dc23bdb375b10262f7eac32"},
{file = "paramiko-3.4.1.tar.gz", hash = "sha256:8b15302870af7f6652f2e038975c1d2973f06046cb5d7d65355668b3ecbece0c"},
]
[package.dependencies]
@@ -3072,40 +3150,6 @@ pygments = "*"
all = ["black"]
ptipython = ["ipython"]
[[package]]
name = "ptvsd"
version = "4.3.2"
description = "Remote debugging server for Python support in Visual Studio and Visual Studio Code"
optional = false
python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*"
files = [
{file = "ptvsd-4.3.2-cp27-cp27m-macosx_10_13_x86_64.whl", hash = "sha256:22b699369a18ff28d4d1aa6a452739e50c7b7790cb16c6312d766e023c12fe27"},
{file = "ptvsd-4.3.2-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:3f839fe91d9ddca0d6a3a0afd6a1c824be1768498a737ab9333d084c5c3f3591"},
{file = "ptvsd-4.3.2-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:70260b4591c07bff95566d49b6a5dc3051d8558035c43c847bad9a954def46bb"},
{file = "ptvsd-4.3.2-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:d2662ec37ee049c0f8f2f9a378abeb7e570d9215c19eaf0a6d7189464195009f"},
{file = "ptvsd-4.3.2-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:d9337ebba4d099698982e090b203e85670086c4b29cf1185b2e45cd353a8053e"},
{file = "ptvsd-4.3.2-cp34-cp34m-macosx_10_13_x86_64.whl", hash = "sha256:cf09fd4d90c4c42ddd9bf853290f1a80bc2128993a3923bd3b96b68cc1acd03f"},
{file = "ptvsd-4.3.2-cp34-cp34m-manylinux1_i686.whl", hash = "sha256:ccc5c533135305709461f545feed5061c608714db38fa0f58e3f848a127b7fde"},
{file = "ptvsd-4.3.2-cp34-cp34m-manylinux1_x86_64.whl", hash = "sha256:de5234bec74c47da668e1a1a21bcc9821af0cbb28b5153df78cd5abc744b29a2"},
{file = "ptvsd-4.3.2-cp35-cp35m-macosx_10_13_x86_64.whl", hash = "sha256:c893fb9d1c2ef8f980cc00ced3fd90356f86d9f59b58ee97e0e7e622b8860f76"},
{file = "ptvsd-4.3.2-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:2bbc121bce3608501998afbe742f02b80e7d26b8fecd38f78b903f22f52a81d9"},
{file = "ptvsd-4.3.2-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:fad06de012a78f277318d0c308dd3d7cc1f67167f3b2e1e2f7c6caf04c03440c"},
{file = "ptvsd-4.3.2-cp35-cp35m-win32.whl", hash = "sha256:92d26aa7c8f7ffe41cb4b50a00846027027fa17acdf2d9dd8c24de77b25166c6"},
{file = "ptvsd-4.3.2-cp35-cp35m-win_amd64.whl", hash = "sha256:eda10ecd43daacc180a6fbe524992be76a877c3559e2b78016b4ada8fec10273"},
{file = "ptvsd-4.3.2-cp36-cp36m-macosx_10_13_x86_64.whl", hash = "sha256:c01204e3f025c3f7252c79c1a8a028246d29e3ef339e1a01ddf652999f47bdea"},
{file = "ptvsd-4.3.2-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:c97c71835dde7e67fc7b06398bee1c012559a0784ebda9cf8acaf176c7ae766c"},
{file = "ptvsd-4.3.2-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:612948a045fcf9c8931cd306972902440278f34de7ca684b49d4caeec9f1ec62"},
{file = "ptvsd-4.3.2-cp36-cp36m-win32.whl", hash = "sha256:72d114baa5737baf29c8068d1ccdd93cbb332d2030601c888eed0e3761b588d7"},
{file = "ptvsd-4.3.2-cp36-cp36m-win_amd64.whl", hash = "sha256:58508485a1609a495dd45829bd6d219303cf9edef5ca1f01a9ed8ffaa87f390c"},
{file = "ptvsd-4.3.2-cp37-cp37m-macosx_10_13_x86_64.whl", hash = "sha256:20f48ffed42a6beb879c250d82662e175ad59cc46a29c95c6a4472ae413199c5"},
{file = "ptvsd-4.3.2-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:b9970e3dc987eb2a6001af6c9d2f726dd6455cfc6d47e0f51925cbdee7ea2157"},
{file = "ptvsd-4.3.2-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:1d3d82ecc82186d099992a748556e6e54037f5c5e4d3fc9bba3e2302354be0d4"},
{file = "ptvsd-4.3.2-cp37-cp37m-win32.whl", hash = "sha256:10745fbb788001959b4de405198d8bd5243611a88fb5a2e2c6800245bc0ddd74"},
{file = "ptvsd-4.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:90cbd082e7a9089664888d0d94aca760202f080133fca8f3fe65c48ed6b9e39d"},
{file = "ptvsd-4.3.2-py2.py3-none-any.whl", hash = "sha256:459137736068bb02515040b2ed2738169cb30d69a38e0fd5dffcba255f41e68d"},
{file = "ptvsd-4.3.2.zip", hash = "sha256:3b05c06018fdbce5943c50fb0baac695b5c11326f9e21a5266c854306bda28ab"},
]
[[package]]
name = "pure-sasl"
version = "0.6.2"
@@ -3147,23 +3191,25 @@ pyasn1 = ">=0.4.6,<0.6.0"
[[package]]
name = "pyathena"
version = "1.11.5"
version = "2.25.2"
description = "Python DB API 2.0 (PEP 249) client for Amazon Athena"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
python-versions = ">=3.7.1,<4.0.0"
files = [
{file = "PyAthena-1.11.5-py2.py3-none-any.whl", hash = "sha256:8cc5d40236993fe5241bb625e78d0a0a149e629b74569a9636b49168448a7ac8"},
{file = "PyAthena-1.11.5.tar.gz", hash = "sha256:86c0f4d10528de44fcd63222506949b010dff36ad57116e4c1274c1cfa9477d0"},
{file = "pyathena-2.25.2-py3-none-any.whl", hash = "sha256:df7855fec5cc675511431d7c72b814346ebd7e51ed32181ec95847154f79210b"},
{file = "pyathena-2.25.2.tar.gz", hash = "sha256:aebb8254dd7b2a450841ee3552bf443002a2deaed93fae0ae6f4258b5eb2d367"},
]
[package.dependencies]
boto3 = ">=1.4.4"
botocore = ">=1.5.52"
future = "*"
boto3 = ">=1.26.4"
botocore = ">=1.29.4"
fsspec = "*"
tenacity = ">=4.1.0"
[package.extras]
pandas = ["pandas (>=0.24.0)", "pyarrow (>=0.15.0)"]
arrow = ["pyarrow (>=7.0.0)"]
fastparquet = ["fastparquet (>=0.4.0)"]
pandas = ["pandas (>=1.3.0)"]
sqlalchemy = ["sqlalchemy (>=1.0.0,<2.0.0)"]
[[package]]
@@ -3469,43 +3515,92 @@ zstd = ["zstandard"]
[[package]]
name = "pymssql"
version = "2.2.8"
version = "2.3.1"
description = "DB-API interface to Microsoft SQL Server for Python. (new Cython-based version)"
optional = false
python-versions = "*"
files = [
{file = "pymssql-2.2.8-cp310-cp310-macosx_11_0_x86_64.whl", hash = "sha256:30bfd7b8edef78097ccd3f52ac3f3a5c3cf0019f8a280f306cacbbb165caaf63"},
{file = "pymssql-2.2.8-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:049f2e3de919e8e02504780a21ebbf235e21ca8ed5c7538c5b6e705aa6c43d8c"},
{file = "pymssql-2.2.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0dd86d8e3e346e34f3f03d12e333747b53a1daa74374a727f4714d5b82ee0dd5"},
{file = "pymssql-2.2.8-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:508226a0df7cb6faeda9f8e84e85743690ca427d7b27af9a73d75fcf0c1eef6e"},
{file = "pymssql-2.2.8-cp310-cp310-win_amd64.whl", hash = "sha256:47859887adeaf184766b5e0bc845dd23611f3808f9521552063bb36eabc10092"},
{file = "pymssql-2.2.8-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:d873e553374d5b1c57fe1c43bb75e3bcc2920678db1ef26f6bfed396c7d21b30"},
{file = "pymssql-2.2.8-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf31b8b76634c826a91f9999e15b7bfb0c051a0f53b319fd56481a67e5b903bb"},
{file = "pymssql-2.2.8-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:821945c2214fe666fd456c61e09a29a00e7719c9e136c801bffb3a254e9c579b"},
{file = "pymssql-2.2.8-cp311-cp311-win_amd64.whl", hash = "sha256:cc85b609b4e60eac25fa38bbac1ff854fd2c2a276e0ca4a3614c6f97efb644bb"},
{file = "pymssql-2.2.8-cp36-cp36m-macosx_10_14_x86_64.whl", hash = "sha256:ebe7f64d5278d807f14bea08951e02512bfbc6219fd4d4f15bb45ded885cf3d4"},
{file = "pymssql-2.2.8-cp36-cp36m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:253af3d39fc0235627966817262d5c4c94ad09dcbea59664748063470048c29c"},
{file = "pymssql-2.2.8-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2c9d109df536dc5f7dd851a88d285a4c9cb12a9314b621625f4f5ab1197eb312"},
{file = "pymssql-2.2.8-cp36-cp36m-manylinux_2_28_x86_64.whl", hash = "sha256:358d5acf0298d6618edf7fedc4ce3dc8fb5ce8a9db85e7332d5196d29d841821"},
{file = "pymssql-2.2.8-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:63e1be8936372c07aee2405203ee0161ce76b03893cafe3d46841be9886f5ffe"},
{file = "pymssql-2.2.8-cp37-cp37m-macosx_11_0_x86_64.whl", hash = "sha256:381d8a47c4665d99f114849bed23bcba1922c9d005accc3ac19cee8a1d3522dc"},
{file = "pymssql-2.2.8-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:4f365033c9b4263b74b8a332bbdf2d7d8d7230f05805439b4f3fbf0a0164acfe"},
{file = "pymssql-2.2.8-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:03903bdf23a2aac26e9b772b3998efeba079fcb6fcfa6df7abc614e9afa14af0"},
{file = "pymssql-2.2.8-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:5c83208138f87942c5f08aa50c5fb8d89b7f15340cde58a77b08f49df277e134"},
{file = "pymssql-2.2.8-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:7e4538e85d7b5fb3867636391f91e9e18ac2e0aef660d25e97268e04339f2c36"},
{file = "pymssql-2.2.8-cp38-cp38-macosx_11_0_x86_64.whl", hash = "sha256:e920d6f805a525f19e770e48326a5f96b83d7b8dfd093f5b7015b54ef84bcf4c"},
{file = "pymssql-2.2.8-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:2446645eb8684c0cb246a3294110455dd89a29608dfa7a58ea88aa42aa1cf005"},
{file = "pymssql-2.2.8-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3906993300650844ec140aa58772c0f5f3e9e9d5709c061334fd1551acdcf066"},
{file = "pymssql-2.2.8-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:7309c7352e4a87c9995c3183ebfe0ff4135e955bb759109637673c61c9f0ca8d"},
{file = "pymssql-2.2.8-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:9b8d603cc1ec7ae585c5a409a1d45e8da067970c79dd550d45c238ae0aa0f79f"},
{file = "pymssql-2.2.8-cp38-cp38-win_amd64.whl", hash = "sha256:293cb4d0339e221d877d6b19a1905082b658f0100a1e2ccc9dda10de58938901"},
{file = "pymssql-2.2.8-cp39-cp39-macosx_11_0_x86_64.whl", hash = "sha256:895041edd002a2e91d8a4faf0906b6fbfef29d9164bc6beb398421f5927fa40e"},
{file = "pymssql-2.2.8-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:6b2d9c6d38a416c6f2db36ff1cd8e69f9a5387a46f9f4f612623192e0c9404b1"},
{file = "pymssql-2.2.8-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d63d6f25cf40fe6a03c49be2d4d337858362b8ab944d6684c268e4990807cf0c"},
{file = "pymssql-2.2.8-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:c83ad3ad20951f3a94894b354fa5fa9666dcd5ebb4a635dad507c7d1dd545833"},
{file = "pymssql-2.2.8-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:3933f7f082be74698eea835df51798dab9bc727d94d3d280bffc75ab9265f890"},
{file = "pymssql-2.2.8-cp39-cp39-win_amd64.whl", hash = "sha256:de313375b90b0f554058992f35c4a4beb3f6ec2f5912d8cd6afb649f95b03a9f"},
{file = "pymssql-2.2.8.tar.gz", hash = "sha256:9baefbfbd07d0142756e2dfcaa804154361ac5806ab9381350aad4e780c3033e"},
{file = "pymssql-2.3.1-cp310-cp310-macosx_13_0_x86_64.whl", hash = "sha256:001b3321a5f620b80d1427933fcca11b05f29a808d7772a84d18d01e640ee60a"},
{file = "pymssql-2.3.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:15466dd41be5e32302f0c4791f612aadd608a0e6ec0b10d769e76cbb4c86aa97"},
{file = "pymssql-2.3.1-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:74349040d4ff6f05894aefb5109ecffcd416e1e366d9951085d3225a9d09c46b"},
{file = "pymssql-2.3.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc79dbe5eca8825b73830c8bb147b6f588300dc7510393822682162dc4ff003f"},
{file = "pymssql-2.3.1-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:0b93ebe2feb45e772ca708bc4cd70f3e4c72796ec1b157fd5d80cdc589c786aa"},
{file = "pymssql-2.3.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:44b1c8752c0fc6750902c1c521f258bdf4271bfbf7b2a5fee469b6ad00631aab"},
{file = "pymssql-2.3.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fdfadb055a9ecad58356decfecc41626999ad7b548cc7ea898cf159e2217f7bb"},
{file = "pymssql-2.3.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:46f1074c6763e9a899128f22a0f72e9fb0035535f48efabd6a294db1c149e6f1"},
{file = "pymssql-2.3.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ebb11b61d99ec5bbe0b8c411ff748a90263cdaf474881de231da8184e721c42c"},
{file = "pymssql-2.3.1-cp310-cp310-win32.whl", hash = "sha256:2ef07fdee3e9652d39b4c081c5c5e1a1031abd122b402ed66813bceb3874ccea"},
{file = "pymssql-2.3.1-cp310-cp310-win_amd64.whl", hash = "sha256:791522339215cb7f88db54c831a2347e0c4d69dd3092a343eea5b9339adf4412"},
{file = "pymssql-2.3.1-cp311-cp311-macosx_13_0_universal2.whl", hash = "sha256:0433ffa1c86290a93e81176f377621cb70405be66ade8f3070d3f5ec9cfebdba"},
{file = "pymssql-2.3.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6182d82ebfbe46f0e7748d068c6a1c16c0f4fe1f34f1c390f63375cee79b44b0"},
{file = "pymssql-2.3.1-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cfbe07dcf0aaee8ce630624669cb2fb77b76743d4dd925f99331422be8704de3"},
{file = "pymssql-2.3.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7d999c8e5d5d48e9305c4132392825de402f13feea15694e4e7103029b6eae06"},
{file = "pymssql-2.3.1-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:2dced0a76d8e99c283103a2e3c825ca22c67f1f8fc5cff657510f4d2ffb9d188"},
{file = "pymssql-2.3.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:880d3173025dea3babf5ab862875b3c76a5cf8df5b292418050c7793c651c0b2"},
{file = "pymssql-2.3.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:9f89c698e29ce5c576e4980ded89c00b45e482ec02759bfbfc1aa326648cf64a"},
{file = "pymssql-2.3.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:3f4f2a38ce6e39ed2414c20ca16deaea4340868033a4bb23d5e4e30c72290caf"},
{file = "pymssql-2.3.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:e34e8aa1d3da555dbf23141b02f401267c0be32104b4f030afd0bae62d26d735"},
{file = "pymssql-2.3.1-cp311-cp311-win32.whl", hash = "sha256:72e57e20802bf97399e050a0760a4541996fc27bc605a1a25e48ca6fe4913c48"},
{file = "pymssql-2.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:b5d3604bca2fa8d5ba2eed1582a3c8a83970a8d2edabfcfd87c1edecb7617d16"},
{file = "pymssql-2.3.1-cp312-cp312-macosx_13_0_universal2.whl", hash = "sha256:c28f1b9560b82fe1a1e51d8c56f6d36bca7c507a8cdf2caa2a0642503c220d5c"},
{file = "pymssql-2.3.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e3509b75747eb22ae89f3d47ae316a4b9eac7d952269e88b356ef117a1b8e3b8"},
{file = "pymssql-2.3.1-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cca3bed27e1ab867e482fa8b529d408489ad57e8b60452f75ef288da90573db6"},
{file = "pymssql-2.3.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4fe3276915e6040daec409203e3143aa2826984adb8d223c155dab91010110a4"},
{file = "pymssql-2.3.1-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:d36d566d0d6997c95442c3d2902800e6b072ccc017c6284e5b1bd4e17dc8fada"},
{file = "pymssql-2.3.1-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:3564df40a678623a769acd9677dc68228b2694170132c6f296eb62bf766d31e4"},
{file = "pymssql-2.3.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:3dbd4106faabf97f028d0ac59b30d132cfb5e48cf5314b0476f293123dbf3422"},
{file = "pymssql-2.3.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:acd1690d9b1b2ece9d0e1fd7d68571fc9fa56b6ba8697a3132446419ff7fb3f4"},
{file = "pymssql-2.3.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:126e0b78773975136e6385da7286c277e2e0320c1f4bee0e4dc61a5edcf98c41"},
{file = "pymssql-2.3.1-cp312-cp312-win32.whl", hash = "sha256:21803b731b8c8780fc974d9b4931fa8f1ca29c227502a4c317e12773c8bdef43"},
{file = "pymssql-2.3.1-cp312-cp312-win_amd64.whl", hash = "sha256:6b0224fc5ce4cf0703278859f145e3e921c04d9feb59739a104d3020bbf0c0c1"},
{file = "pymssql-2.3.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:709c1df3134e330ee9590437253be363b558154bde5bb54856fc5fe68a03c971"},
{file = "pymssql-2.3.1-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b9381eafaf529815f2d61f22b99e0538e744b31234f17d4384f5b0496bd1fbed"},
{file = "pymssql-2.3.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c3bf78789014f202855f5d00de982bbcd95177fe8bcf920f0ce730b72456c173"},
{file = "pymssql-2.3.1-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:4b44280eedd0a3f031e9464d4fc632a215fadcfb375bb479065b61a6337df402"},
{file = "pymssql-2.3.1-cp313-cp313-musllinux_1_1_i686.whl", hash = "sha256:922f536b925880c260968c8f2130b1c9d6315b83f300f18365b5421933f034a2"},
{file = "pymssql-2.3.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:f00f618d1c0f58617de548e5094f7d55ab6034b94068d7eebba60a034866b10b"},
{file = "pymssql-2.3.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:b363db86a1a3fe16df9b4253e17b02a268d0f2e2753679b8e85cee268e2fe8c4"},
{file = "pymssql-2.3.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:396a26cf576196cc4a3d77890b2b8eb62655ff02846288757dd8b587352cc4f5"},
{file = "pymssql-2.3.1-cp36-cp36m-win32.whl", hash = "sha256:5a1a1c697596f23058697709144d00a44e7af6ecab6a517f2ecf28dcf8fb4280"},
{file = "pymssql-2.3.1-cp36-cp36m-win_amd64.whl", hash = "sha256:4f92e8657d42341dce01f7f57d03f84b35c0ed00a7bef24533ff80a37ffcfb4e"},
{file = "pymssql-2.3.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:095b50e43bfbc4d6f953810175ba275bb3e6136206f3a7146bdd1031e3f0dd9b"},
{file = "pymssql-2.3.1-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:47ac89098732c327725b53464932c6a532367271a3d5c5a988f61e23e0e0e286"},
{file = "pymssql-2.3.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2f284fc052cf1dbc702a2f4d13442d87fc6847ba9054faccfc8d8446fcf00894"},
{file = "pymssql-2.3.1-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:40778b65c09eef9e7c25c444b96e76f81d8b5cf1828cb555123d052b7d3b5661"},
{file = "pymssql-2.3.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:22c8609bc7f8b13d383729ba09042b4d796a607c93779c616be51b37caa6b384"},
{file = "pymssql-2.3.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:ab2aea2ae8bc1aba0105fccbf9e4f6716648b2b8f9421fd3418c6cc798fca43e"},
{file = "pymssql-2.3.1-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:e594de69832ad13761412f4d5c981a6e5d931b22f25136c8cd3531d9c6cfdf63"},
{file = "pymssql-2.3.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:68f879b4ec4b2191a1d8b3bb24db04c3631737653785369c275bd5a574e54093"},
{file = "pymssql-2.3.1-cp37-cp37m-musllinux_1_2_i686.whl", hash = "sha256:9ef157e63a1c19e7ab4823237b5f03a3bca45e1e94a4d5ed73baab6d019830c7"},
{file = "pymssql-2.3.1-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:66afe6ee539e37cdfea0c6b2d596ec0d2a6223f09450c4df7cf872bad12691fe"},
{file = "pymssql-2.3.1-cp37-cp37m-win32.whl", hash = "sha256:b9cc14a9f63e632200f54311da9868ece2715fa9560f6272c9bb82c57edc0543"},
{file = "pymssql-2.3.1-cp37-cp37m-win_amd64.whl", hash = "sha256:54bc10f28c0acc1347d3c7056e702ad21f128e6bf7737b4edc8c267372db9ce8"},
{file = "pymssql-2.3.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c8d955e751fb125be2a8513b5a338457a3fe73e5daa094815f96a86e496f7149"},
{file = "pymssql-2.3.1-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4c13ca6eaf0d7f16af9edf87d58070329bfacb7f27b90e1de16318d64c7b873b"},
{file = "pymssql-2.3.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5ecb0cdea24e2c019fb403fd642c04a64e8767c79f8dd38451eb5d72ceffce34"},
{file = "pymssql-2.3.1-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:afd57a728e81d73a0f43f3d28216c402fea03bd06a382da881dfc8215fb4080d"},
{file = "pymssql-2.3.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:5e6f6d9de73309cda602bbb769cb707f08d6899664f3ac6e9ed3e3b1ad472cee"},
{file = "pymssql-2.3.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:02b808dbb86bbe751dd3fd117e83926b0a19ca9d9b833fae945bf2e31be66bf6"},
{file = "pymssql-2.3.1-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:b0f1ba9befe23e6c4e75c2a626ffe59d159ab3a425a0208515888ec8670bf5bf"},
{file = "pymssql-2.3.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:8eecb4f3b41b8b29a0cbe502ae37b6477063d690151f668c410328f101f6198b"},
{file = "pymssql-2.3.1-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:a36c8b089e2d7b606aee823eefdfd72f5df110241fc5d913094b0b9da2692794"},
{file = "pymssql-2.3.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:425de7d3f38cd1867c30b7c352d66020f38fdcdf804282ee232f5e25672930c1"},
{file = "pymssql-2.3.1-cp38-cp38-win32.whl", hash = "sha256:ce397eb6a2a90fcd2a83d8812c1b8752af3b5362e630da49aa556c947e32ce3d"},
{file = "pymssql-2.3.1-cp38-cp38-win_amd64.whl", hash = "sha256:02c4ab7a58bfb57edb2deee7e2aceed2512960e7c2c1fd2cb23c647471a36ba2"},
{file = "pymssql-2.3.1-cp39-cp39-macosx_13_0_x86_64.whl", hash = "sha256:750078568dafc1e0a24cf0f51eecfe548b13440976a2c8b19cc6e5d38e7b10bc"},
{file = "pymssql-2.3.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5a651dd98f67eef98f429c949fb50ea0a92fcf8668834cc35909237c24c1b906"},
{file = "pymssql-2.3.1-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a1ecedaeec8f4d8643d088b4985f0b742d9669bff701153a845b0d1900260b81"},
{file = "pymssql-2.3.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:015f6ccd1bcb53f22a3226653d0d8155da40f4afbc1fd0cec25de5fe8decf126"},
{file = "pymssql-2.3.1-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:da44761ca2f996d88f90c0f972b583dfe9c389db84888bd8209cdb83508f7c7a"},
{file = "pymssql-2.3.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9557b738475e06dfd53f97d8a2c2b259b9b9fd79bf1a4e084ae4e9f164be644d"},
{file = "pymssql-2.3.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:a1f3f2e2792364a50417f3c2dc0d8f125955c1b641f36eb313daf666045b9748"},
{file = "pymssql-2.3.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:be8af4dea025f171ffb1e5b17cb0c9cbc92b0e3c32d0517bc678fff6f660e5fb"},
{file = "pymssql-2.3.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a87950fb1a2b1c4028064fac971f3e191adebb58657ca985330f70e02f95223e"},
{file = "pymssql-2.3.1-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:9ea04bf8e13d567650631a944c88886c99a5622d9491e896a9b5a9ffbef2e352"},
{file = "pymssql-2.3.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:4d93a82f8ad7d3606354b81bbbe7e7832f70fd6e9ccb2e04a2975117da5df973"},
{file = "pymssql-2.3.1-cp39-cp39-win32.whl", hash = "sha256:6a2657152d4007314b66f353a25fc2742155c2770083320b5255fc576103661e"},
{file = "pymssql-2.3.1-cp39-cp39-win_amd64.whl", hash = "sha256:6c9ffb3ef110bf0fc2a41c845f231cf749162b1d71e02b0aceb6c0ebc603e2e9"},
{file = "pymssql-2.3.1.tar.gz", hash = "sha256:ddee15c4c193e14c92fe2cd720ca9be1dba1e0f4178240380b8f5f6f00da04c6"},
]
[[package]]
@@ -3536,43 +3631,61 @@ tests = ["hypothesis (>=3.27.0)", "pytest (>=3.2.1,!=3.3.0)"]
[[package]]
name = "pyodbc"
version = "4.0.28"
description = "DB API Module for ODBC"
version = "5.1.0"
description = "DB API module for ODBC"
optional = false
python-versions = "*"
python-versions = ">=3.8"
files = [
{file = "pyodbc-4.0.28-cp27-cp27m-win32.whl", hash = "sha256:2217eb01091a207a9ffa457c49a63a1d0eb8514c810a23b901518348422fcf65"},
{file = "pyodbc-4.0.28-cp27-cp27m-win_amd64.whl", hash = "sha256:ae35c455bfbadc631ee20df6657bfda0779bdc80badfd9d13741433dd78785e6"},
{file = "pyodbc-4.0.28-cp27-none-macosx_10_15_x86_64.whl", hash = "sha256:f37f26ae909101465a085ef51b9dde35afc93b7c7e38c25b61b124b110aa9998"},
{file = "pyodbc-4.0.28-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:5d1abca8f5bdab1515e300d05c63c25d072a123c7089a554290b5b9e83168eb6"},
{file = "pyodbc-4.0.28-cp36-cp36m-win32.whl", hash = "sha256:c25e525e0576b1dfa067d3a6530e046a24006d89715026d2d5dbf6d4290093b9"},
{file = "pyodbc-4.0.28-cp36-cp36m-win_amd64.whl", hash = "sha256:259b2554d2b8c9a6247871fec741b526f0b63a0e42676bd8f210e214a3015129"},
{file = "pyodbc-4.0.28-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8ad9aa2a851242109141e4275c2a9b4d4379e00288959acd877501ee90aa3955"},
{file = "pyodbc-4.0.28-cp37-cp37m-win32.whl", hash = "sha256:2908f73e5a374437fd7a38f14b09f2b96d742235bf2f819fb697f8922e35ddda"},
{file = "pyodbc-4.0.28-cp37-cp37m-win_amd64.whl", hash = "sha256:a1a1687edef4319ae533e1d789c6c8241459f04af9e4db76e6e4045c530239de"},
{file = "pyodbc-4.0.28-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:4f3c788d231907f45ea329cd245b398b165d9d28809f55814240eea775a6b1cd"},
{file = "pyodbc-4.0.28-cp38-cp38-win32.whl", hash = "sha256:93e495c51a5db027c2f7ee2c2c3fe9d6ea86b3a61392c7c8961a1818951868c8"},
{file = "pyodbc-4.0.28-cp38-cp38-win_amd64.whl", hash = "sha256:49ba851be2d9d07cc1472b43febc93e3362c1e09ceb3eac84693a6690d090165"},
{file = "pyodbc-4.0.28.tar.gz", hash = "sha256:510643354c4c687ed96bf7e7cec4d02d6c626ecf3e18696f5a0228dd6d11b769"},
{file = "pyodbc-5.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:02fe9821711a2d14415eaeb4deab471d2c8b7034b107e524e414c0e133c42248"},
{file = "pyodbc-5.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:2cbdbd019756285dc44bc35238a3ed8dfaa454e8c8b2c3462f1710cfeebfb290"},
{file = "pyodbc-5.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:84df3bbce9bafe65abd25788d55c9f1da304f6115d70f25758ff8c85f3ce0517"},
{file = "pyodbc-5.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:218bb75d4bc67075529a65ce8ec7daeed1d83c33dd7410450fbf68d43d184d28"},
{file = "pyodbc-5.1.0-cp310-cp310-win32.whl", hash = "sha256:eae576b3b67d21d6f237e18bb5f3df8323a2258f52c3e3afeef79269704072a9"},
{file = "pyodbc-5.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:c3b65343557f4c7753204e06f4c82c97ed212a636501f4bc27c5ce0e549eb3e8"},
{file = "pyodbc-5.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:aa6f46377da303bf79bcb4b559899507df4b2559f30dcfdf191358ee4b99f3ab"},
{file = "pyodbc-5.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:b19d7f44cfee89901e482f554a88177e83fae76b03c3f830e0023a195d840220"},
{file = "pyodbc-5.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3c36448322f8d6479d87c528cf52401a6ea4f509b9637750b67340382b4e1b40"},
{file = "pyodbc-5.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1c5e0cb79222aad4b31a3602e39b242683c29c6221a16ed43f45f18fd0b73659"},
{file = "pyodbc-5.1.0-cp311-cp311-win32.whl", hash = "sha256:92caed9d445815ed3f7e5a1249e29a4600ebc1e99404df81b6ed7671074c9227"},
{file = "pyodbc-5.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:a1bd14633e91b7a9814f4fd944c9ebb89fb7f1fd4710c4e3999b5ef041536347"},
{file = "pyodbc-5.1.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:d3d9cc4af703c4817b6e604315910b0cf5dcb68056d52b25ca072dd59c52dcbc"},
{file = "pyodbc-5.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:406b8fa2133a7b6a713aa5187dba2d08cf763b5884606bed77610a7660fdfabe"},
{file = "pyodbc-5.1.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f8488c3818f12207650836c5c6f7352f9ff9f56a05a05512145995e497c0bbb1"},
{file = "pyodbc-5.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b0df69e3a500791b70b5748c68a79483b24428e4c16027b56aa0305e95c143a4"},
{file = "pyodbc-5.1.0-cp312-cp312-win32.whl", hash = "sha256:aa4e02d3a9bf819394510b726b25f1566f8b3f0891ca400ad2d4c8b86b535b78"},
{file = "pyodbc-5.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:33f4984af38872e7bdec78007a34e4d43ae72bf9d0bae3344e79d9d0db157c0e"},
{file = "pyodbc-5.1.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:29425e2d366e7f5828b76c7993f412a3db4f18bd5bcee00186c00b5a5965e205"},
{file = "pyodbc-5.1.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:a2bbd2e75c77dee9f3cd100c3246110abaeb9af3f7fa304ccc2934ff9c6a4fa4"},
{file = "pyodbc-5.1.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3602136a936bc0c1bb9722eb2fbf2042b3ff1ddccdc4688e514b82d4b831563b"},
{file = "pyodbc-5.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bed1c843565d3a4fd8c332ebceaf33efe817657a0505eacb97dd1b786a985b0b"},
{file = "pyodbc-5.1.0-cp38-cp38-win32.whl", hash = "sha256:735f6da3762e5856b5580be0ed96bb946948346ebd1e526d5169a5513626a67a"},
{file = "pyodbc-5.1.0-cp38-cp38-win_amd64.whl", hash = "sha256:c5bb4e43f6c72f5fa2c634570e0d761767d8ea49f39205229b812fb4d3fe05aa"},
{file = "pyodbc-5.1.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:33f0f1d7764cefef6f787936bd6359670828a6086be67518ab951f1f7f503cda"},
{file = "pyodbc-5.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:be3b1c36c31ec7d73d0b34a8ad8743573763fadd8f2bceef1e84408252b48dce"},
{file = "pyodbc-5.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e71a51c252b503b4d753e21ed31e640015fc0d00202d42ea42f2396fcc924b4a"},
{file = "pyodbc-5.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:af5282cc8b667af97d76f4955250619a53f25486cbb6b1f45a06b781006ffa0b"},
{file = "pyodbc-5.1.0-cp39-cp39-win32.whl", hash = "sha256:96b2a8dc27693a517e3aad3944a7faa8be95d40d7ec1eda51a1885162eedfa33"},
{file = "pyodbc-5.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:e738c5eedb4a0cbab20cc008882f49b106054499db56864057c2530ff208cf32"},
{file = "pyodbc-5.1.0.tar.gz", hash = "sha256:397feee44561a6580be08cedbe986436859563f4bb378f48224655c8e987ea60"},
]
[[package]]
name = "pyopenssl"
version = "23.2.0"
version = "24.2.1"
description = "Python wrapper module around the OpenSSL library"
optional = false
python-versions = ">=3.6"
python-versions = ">=3.7"
files = [
{file = "pyOpenSSL-23.2.0-py3-none-any.whl", hash = "sha256:24f0dc5227396b3e831f4c7f602b950a5e9833d292c8e4a2e06b709292806ae2"},
{file = "pyOpenSSL-23.2.0.tar.gz", hash = "sha256:276f931f55a452e7dea69c7173e984eb2a4407ce413c918aa34b55f82f9b8bac"},
{file = "pyOpenSSL-24.2.1-py3-none-any.whl", hash = "sha256:967d5719b12b243588573f39b0c677637145c7a1ffedcd495a487e58177fbb8d"},
{file = "pyopenssl-24.2.1.tar.gz", hash = "sha256:4247f0dbe3748d560dcbb2ff3ea01af0f9a1a001ef5f7c4c647956ed8cbf0e95"},
]
[package.dependencies]
cryptography = ">=38.0.0,<40.0.0 || >40.0.0,<40.0.1 || >40.0.1,<42"
cryptography = ">=41.0.5,<44"
[package.extras]
docs = ["sphinx (!=5.2.0,!=5.2.0.post0)", "sphinx-rtd-theme"]
test = ["flaky", "pretend", "pytest (>=3.0.1)"]
docs = ["sphinx (!=5.2.0,!=5.2.0.post0,!=7.2.5)", "sphinx-rtd-theme"]
test = ["pretend", "pytest (>=3.0.1)", "pytest-rerunfailures"]
[[package]]
name = "pyparsing"
@@ -3766,12 +3879,84 @@ cli = ["click (>=5.0)"]
[[package]]
name = "python-rapidjson"
version = "1.1"
version = "1.20"
description = "Python wrapper around rapidjson"
optional = false
python-versions = ">=3.6"
files = [
{file = "python-rapidjson-1.1.tar.gz", hash = "sha256:9353a5eeb23a43556fa382ff94b3f6d67c663e31a2cfd220268c13e3f848fddc"},
{file = "python_rapidjson-1.20-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:eeaa8487fdd8db409bd2e0c41c59cee3b9f1d08401fc75520f7d35c7a22d8789"},
{file = "python_rapidjson-1.20-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:425c2bb8e778a04497953482c251944b2736f61012d897f17b73da3eca060c27"},
{file = "python_rapidjson-1.20-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4f7cbbff9696ea01dd8a29502cb314471c9a5d4239f2f3b7e35b6adbde2cc620"},
{file = "python_rapidjson-1.20-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:83a48f96d0abb8349a4d42f029259b755d8c6fd347f5de2d640e164c3f45e63b"},
{file = "python_rapidjson-1.20-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6cb3ad353ec083a6dcf0552f1fce3c490f92e2fccf9a81eac42835297a8431a1"},
{file = "python_rapidjson-1.20-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7f7b6574887d8828f34eb3384092d6e6c290e8fbb12703c409dbdde814612657"},
{file = "python_rapidjson-1.20-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:403e4986484f01f79fdce00b48c12a1b39d16e822cd37c60843ab26455ab0680"},
{file = "python_rapidjson-1.20-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e3f89a58d7709d5879586e9dbfd11be76a799e8fbdbb5eddaffaeba9b572fba3"},
{file = "python_rapidjson-1.20-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:b0d07d4f0ebbb2228d5140463f11ac519147b9d791f7e40b3edf518a806be3cc"},
{file = "python_rapidjson-1.20-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:a5fb413414b92763a54d53b732df3c9de1b114012c8881a3d1215a19b9fca494"},
{file = "python_rapidjson-1.20-cp310-cp310-win32.whl", hash = "sha256:9831430f17101a6a249e07db9c42d26c3263e6009450722cce0c14726421f434"},
{file = "python_rapidjson-1.20-cp310-cp310-win_amd64.whl", hash = "sha256:fbff5caf127c5bed4d6620f95a039dd9e293784d844af50782aaf278a743acb4"},
{file = "python_rapidjson-1.20-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:328095d6d558090c29d24d889482b10dcc3ade3b77c93a61ea86794623046628"},
{file = "python_rapidjson-1.20-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:fc7a095f77eb3bb6acff94acf868a100faaf06028c4b513428f161cd55030476"},
{file = "python_rapidjson-1.20-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ce4cee141c924300cbedba1e5bea05b13484598d1e550afc5b50209ba73c62f2"},
{file = "python_rapidjson-1.20-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4355bcfc8629d15f6246011b40e84cc368d842518a91adb15c5eba211305ee5b"},
{file = "python_rapidjson-1.20-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7dd9c5e661d17eafa44b2875f6ce55178cc87388575ce3cd3c606d5a33772b49"},
{file = "python_rapidjson-1.20-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bd978c7669cc844f669a48d2a6019fb9134a2385536f806fe265a1e374c3573a"},
{file = "python_rapidjson-1.20-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8fc52405435ce875aa000afa2637ea267eb0d4ab9622f9b97c92d92cb1a9c440"},
{file = "python_rapidjson-1.20-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:bef1eca712fb9fd5d2edd724dd1dd8a608215d6afcaee4f351b3e99e3f73f720"},
{file = "python_rapidjson-1.20-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:6355cb690bf64629767206524d4d00da909970d46d8fc0b367f339975e4eb419"},
{file = "python_rapidjson-1.20-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f974c4e11be833221062fc4c3129bed172082792b33ef9fc1b8104f49c514f1d"},
{file = "python_rapidjson-1.20-cp311-cp311-win32.whl", hash = "sha256:06ee7bcf660ebbdf1953aa7bf74214b722d934928c7b9f2a23b12e0713b61fa4"},
{file = "python_rapidjson-1.20-cp311-cp311-win_amd64.whl", hash = "sha256:9df543521fa4b69589c42772b2f32a6c334b3b5fc612cd6dc3705136d0788da3"},
{file = "python_rapidjson-1.20-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:6056fcc8caeb9b04775bf655568bba362c7670ab792c1b438671bb056db954cd"},
{file = "python_rapidjson-1.20-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:225bd4cbabfe7910261cbcebb8b811d4ff98e90cdd17c233b916c6aa71a9553f"},
{file = "python_rapidjson-1.20-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:026077b663acf93a3f2b1adb87282e611a30214b8ae8001b7e4863a3b978e646"},
{file = "python_rapidjson-1.20-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:884e1dd4c0770ed424737941af4d5dc9014995f9c33595f151af13f83ce282c3"},
{file = "python_rapidjson-1.20-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5f55531c8197cb7a21a5ef0ffa46f2b8fc8c5fe7c6fd08bdbd2063ae65d2ff65"},
{file = "python_rapidjson-1.20-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c60121d155562dc694c05ed7df4e39e42ee1d3adff2a060c64a004498e6451f7"},
{file = "python_rapidjson-1.20-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3a6620eed0b04196f37fab7048c1d672d03391bb29d7f09ee8fee8dea33f11f4"},
{file = "python_rapidjson-1.20-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:ddb63eff401ce7cf20cdd5e21942fc23fbe0e1dc1d96d7ae838645fb1f74fb47"},
{file = "python_rapidjson-1.20-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:05e28c3dbb4a0d74ec13af9668ef2b9f302edf83cf7ce1d8316a95364720eec0"},
{file = "python_rapidjson-1.20-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b733978ecd84fc5df9a778ce821dc1f3113f7bfc2493cac0bb17efb4ae0bb8fa"},
{file = "python_rapidjson-1.20-cp312-cp312-win32.whl", hash = "sha256:d87041448cec00e2db5d858625a76dc1b59eef6691a039acff6d92ad8581cfc1"},
{file = "python_rapidjson-1.20-cp312-cp312-win_amd64.whl", hash = "sha256:5d3be149ce5475f9605f01240487541057792abad94d3fd0cd56af363cf5a4dc"},
{file = "python_rapidjson-1.20-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:daee815b4c20ca6e4dbc6bde373dd3f65b53813d775f1c94b765b33b402513a7"},
{file = "python_rapidjson-1.20-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:083df379c769b30f9bc40041c91fd9d8f7bb8ca2b3c7170258842aced2098e05"},
{file = "python_rapidjson-1.20-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b9399ad75a2e3377f9e6208caabe73eb9354cd01b732407475ccadcd42c577df"},
{file = "python_rapidjson-1.20-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:599ab208ccf6172d6cfac1abe048c837e62612f91f97d198e32773c45346a0b4"},
{file = "python_rapidjson-1.20-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf3c0e2a5b97b0d07311f15f0dce4434e43dec865c3794ad1b10d968460fd665"},
{file = "python_rapidjson-1.20-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e8064b8edb57ddd9e3ffa539cf2ec2f03515751fb0698b40ba5cb66a2123af19"},
{file = "python_rapidjson-1.20-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:bc79d7f00f7538e027960ca6bcd1e03ed99fcf660d4d882d1c22f641155d0db0"},
{file = "python_rapidjson-1.20-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:87aa0b01b8c20984844f1440b8ff6bdb32de911a1750fed344b9daed33b4b52b"},
{file = "python_rapidjson-1.20-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:4099cb9eae8a0ce19c09e02729eb6d69d5180424f13a2641a6c407d053e47a82"},
{file = "python_rapidjson-1.20-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4c680cd2b4de760ff6875de71fe6a87bd610aa116593d62e4f81a563be86ae18"},
{file = "python_rapidjson-1.20-cp313-cp313-win32.whl", hash = "sha256:9e431a7afc77aa874fed537c9f6bf5fcecaef124ebeae2a2379d3b9e9adce74b"},
{file = "python_rapidjson-1.20-cp313-cp313-win_amd64.whl", hash = "sha256:7444bc7e6a04c03d6ed748b5dab0798fa2b3f2b303be8c38d3af405b2cac6d63"},
{file = "python_rapidjson-1.20-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:69e702fe74fe8c44c6253bb91364a270dc49f704920c90e01040155bd600a5fd"},
{file = "python_rapidjson-1.20-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b9496b1e9d6247e8802ac559b7eebb5f3cae426d1c1dbde4049c63dff0941370"},
{file = "python_rapidjson-1.20-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1446e902b6c781f271bf8556da636c1375cbb208e25f92e1af4cc2d92cf0cf15"},
{file = "python_rapidjson-1.20-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:368ecdf4031abbde9c94aac40981d9a1238e6bcfef9fbfee441047b4757d6033"},
{file = "python_rapidjson-1.20-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:924f9ea302494d4a4d540d3509f8f1f15622ea7d614c6f29df3188d52c6cb546"},
{file = "python_rapidjson-1.20-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:632acb2dfa29883723e24bb2ce47c726edd5f672341553a5184db68f78d3bd09"},
{file = "python_rapidjson-1.20-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:c2f85da53286e67778d4061ef32ff44ca9b5f945030463716e046ee8985319f8"},
{file = "python_rapidjson-1.20-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:c05c8602c019cc0db19601fdc4927755a9d33f21d01beb3d5767313d7a81360d"},
{file = "python_rapidjson-1.20-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:7d36aab758bfb1b59e0a849cd20e971eda951a04d3586bb5f6cb460bfc7c103d"},
{file = "python_rapidjson-1.20-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:e5774c905034362298312116f9b58c181e91a09800e4e5cede7b3d460a6a9fde"},
{file = "python_rapidjson-1.20-cp38-cp38-win32.whl", hash = "sha256:488d0c6155004b5177225eaf331bb1838616da05ae966dd24a7d442751c1d193"},
{file = "python_rapidjson-1.20-cp38-cp38-win_amd64.whl", hash = "sha256:00183c4938cd491b98b1a43626bc5a381842ceba87644cb91b25555f3fc3c0bf"},
{file = "python_rapidjson-1.20-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:f510ffe32fec319699f0c1ea9cee5bde47c33202b034b85c5d1b9ace682aa96a"},
{file = "python_rapidjson-1.20-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a2b624b3613fb7b8dfef4adc709bf39489be8c655cd9d24dc4e2cc16fc5def83"},
{file = "python_rapidjson-1.20-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6c9f813a37d1f708a221f1f7d8c97c437d10597261810c1d3b52cf8f248d66c0"},
{file = "python_rapidjson-1.20-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0c3f7085c52259c56af72462df7620c3b8bb95575fd9b8c3a073728855e93269"},
{file = "python_rapidjson-1.20-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:871f2eeb0907f3d7ab09efe04c5b5e2886c275ea568f7867c97468ae14cdd52f"},
{file = "python_rapidjson-1.20-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b7c0408e7f52f32cf4bdd5aa305f005914b0143cac69d42575e2d40e8678cd72"},
{file = "python_rapidjson-1.20-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:ec17a18df700e1f956fc5a0c41cbb3cc746c44c0fef38988efba9b2cb607ecfa"},
{file = "python_rapidjson-1.20-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:1c0303bd445312a78485a9adba06dfdb84561c5157a9cda7999fefb36df4c6cc"},
{file = "python_rapidjson-1.20-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:303b079ef268a996242be51ae80c8b563ee2d73489ab4f16199fef2216e80765"},
{file = "python_rapidjson-1.20-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5adcef7a27abafbb2b3d0b02c822dfd9b4b329769cb97810b7f9733e1fda0498"},
{file = "python_rapidjson-1.20-cp39-cp39-win32.whl", hash = "sha256:3e963e78fff6ab5ab2ae847b65683774c48b9b192307380f2175540d6423fd73"},
{file = "python_rapidjson-1.20-cp39-cp39-win_amd64.whl", hash = "sha256:1fc3bba6632ecffeb1897fdf98858dc50a677237f4241853444c70a041158a90"},
{file = "python_rapidjson-1.20.tar.gz", hash = "sha256:115f08c86d2df7543c02605e77c84727cdabc4b08310d2f097e953efeaaa73eb"},
]
[[package]]
@@ -3825,6 +4010,7 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
{file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
{file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
{file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
{file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
{file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
{file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
@@ -4011,13 +4197,13 @@ files = [
[[package]]
name = "requests"
version = "2.31.0"
version = "2.32.3"
description = "Python HTTP for Humans."
optional = false
python-versions = ">=3.7"
python-versions = ">=3.8"
files = [
{file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
{file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
{file = "requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6"},
{file = "requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760"},
]
[package.dependencies]
@@ -4079,13 +4265,13 @@ requests = ">=2.0.1,<3.0.0"
[[package]]
name = "restrictedpython"
version = "6.2"
version = "7.3"
description = "RestrictedPython is a defined subset of the Python language which allows to provide a program input into a trusted environment."
optional = false
python-versions = ">=3.6, <3.12"
python-versions = "<3.13,>=3.7"
files = [
{file = "RestrictedPython-6.2-py3-none-any.whl", hash = "sha256:7c2ffa4904300d67732f841d8a975dcdc53eba4c1cdc9d84b97684ef12304a3d"},
{file = "RestrictedPython-6.2.tar.gz", hash = "sha256:db73eb7e3b39650f0d21d10cc8dda9c0e2986e621c94b0c5de32fb0dee3a08af"},
{file = "RestrictedPython-7.3-py3-none-any.whl", hash = "sha256:40a6170bbcfc48b32962831d9281a61608c8e56e7c02fd8e2397225f516a6ed4"},
{file = "RestrictedPython-7.3.tar.gz", hash = "sha256:8888304c7858fdcfd86c50b58561797375ba40319d2b6ffb5d24b08b6a2dcd61"},
]
[package.extras]
@@ -4254,13 +4440,13 @@ files = [
[[package]]
name = "sentry-sdk"
version = "1.28.1"
version = "1.45.1"
description = "Python client for Sentry (https://sentry.io)"
optional = false
python-versions = "*"
files = [
{file = "sentry-sdk-1.28.1.tar.gz", hash = "sha256:dcd88c68aa64dae715311b5ede6502fd684f70d00a7cd4858118f0ba3153a3ae"},
{file = "sentry_sdk-1.28.1-py2.py3-none-any.whl", hash = "sha256:6bdb25bd9092478d3a817cb0d01fa99e296aea34d404eac3ca0037faa5c2aa0a"},
{file = "sentry_sdk-1.45.1-py2.py3-none-any.whl", hash = "sha256:608887855ccfe39032bfd03936e3a1c4f4fc99b3a4ac49ced54a4220de61c9c1"},
{file = "sentry_sdk-1.45.1.tar.gz", hash = "sha256:a16c997c0f4e3df63c0fc5e4207ccb1ab37900433e0f72fef88315d317829a26"},
]
[package.dependencies]
@@ -4270,10 +4456,13 @@ urllib3 = {version = ">=1.26.11", markers = "python_version >= \"3.6\""}
[package.extras]
aiohttp = ["aiohttp (>=3.5)"]
arq = ["arq (>=0.23)"]
asyncpg = ["asyncpg (>=0.23)"]
beam = ["apache-beam (>=2.12)"]
bottle = ["bottle (>=0.12.13)"]
celery = ["celery (>=3)"]
celery-redbeat = ["celery-redbeat (>=2)"]
chalice = ["chalice (>=1.16.0)"]
clickhouse-driver = ["clickhouse-driver (>=0.2.0)"]
django = ["django (>=1.8)"]
falcon = ["falcon (>=1.4)"]
fastapi = ["fastapi (>=0.79.0)"]
@@ -4282,7 +4471,9 @@ grpcio = ["grpcio (>=1.21.1)"]
httpx = ["httpx (>=0.16.0)"]
huey = ["huey (>=2)"]
loguru = ["loguru (>=0.5)"]
openai = ["openai (>=1.0.0)", "tiktoken (>=0.3.0)"]
opentelemetry = ["opentelemetry-distro (>=0.35b0)"]
opentelemetry-experimental = ["opentelemetry-distro (>=0.40b0,<1.0)", "opentelemetry-instrumentation-aiohttp-client (>=0.40b0,<1.0)", "opentelemetry-instrumentation-django (>=0.40b0,<1.0)", "opentelemetry-instrumentation-fastapi (>=0.40b0,<1.0)", "opentelemetry-instrumentation-flask (>=0.40b0,<1.0)", "opentelemetry-instrumentation-requests (>=0.40b0,<1.0)", "opentelemetry-instrumentation-sqlite3 (>=0.40b0,<1.0)", "opentelemetry-instrumentation-urllib (>=0.40b0,<1.0)"]
pure-eval = ["asttokens", "executing", "pure-eval"]
pymongo = ["pymongo (>=3.1)"]
pyspark = ["pyspark (>=2.4.4)"]
@@ -4296,19 +4487,18 @@ tornado = ["tornado (>=5)"]
[[package]]
name = "setuptools"
version = "69.0.3"
version = "70.0.0"
description = "Easily download, build, install, upgrade, and uninstall Python packages"
optional = false
python-versions = ">=3.8"
files = [
{file = "setuptools-69.0.3-py3-none-any.whl", hash = "sha256:385eb4edd9c9d5c17540511303e39a147ce2fc04bc55289c322b9e5904fe2c05"},
{file = "setuptools-69.0.3.tar.gz", hash = "sha256:be1af57fc409f93647f2e8e4573a142ed38724b8cdd389706a867bb4efcf1e78"},
{file = "setuptools-70.0.0-py3-none-any.whl", hash = "sha256:54faa7f2e8d2d11bcd2c07bed282eef1046b5c080d1c32add737d7b5817b1ad4"},
{file = "setuptools-70.0.0.tar.gz", hash = "sha256:f211a66637b8fa059bb28183da127d4e86396c991a942b028c6650d4319c3fd0"},
]
[package.extras]
docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (<7.2.5)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "pyproject-hooks (!=1.1)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
testing = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "importlib-metadata", "ini2toml[lite] (>=0.14)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "mypy (==1.9)", "packaging (>=23.2)", "pip (>=19.1)", "pyproject-hooks (!=1.1)", "pytest (>=6,!=8.1.1)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-home (>=0.5)", "pytest-mypy", "pytest-perf", "pytest-ruff (>=0.2.1)", "pytest-subprocess", "pytest-timeout", "pytest-xdist (>=3)", "tomli", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
[[package]]
name = "simple-salesforce"
@@ -4454,32 +4644,37 @@ files = [
[[package]]
name = "snowflake-connector-python"
version = "3.4.0"
version = "3.12.3"
description = "Snowflake Connector for Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "snowflake-connector-python-3.4.0.tar.gz", hash = "sha256:09939c300d4e40705db1388c9ba596dce7dd9ee4fa8eea0b6fd67b07756597cb"},
{file = "snowflake_connector_python-3.4.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1cfb5de0fdd1f08ce3046bcec31d6aad2de0fb5196e8c1c2ebf0960748f8bfcf"},
{file = "snowflake_connector_python-3.4.0-cp310-cp310-macosx_11_0_x86_64.whl", hash = "sha256:0b4ea9ffac7e8a7ef7f357116f59d2790c07f8bbc0650cf6a717ecaa275440bb"},
{file = "snowflake_connector_python-3.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:16c3ed32db6c2804413a766a4aa85eb6687f3e5334d5e1238a56be938ab0fe5e"},
{file = "snowflake_connector_python-3.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5c2683d8e0a0baf05bf946caafe2dcc525f57051869c45f9dcbc5ced5f5433b6"},
{file = "snowflake_connector_python-3.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:389226a12ac56a6b78264a258183580c18c0bd5628ae7c48198d7f239f72fc44"},
{file = "snowflake_connector_python-3.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:7ed10a5bad779383e099c6c8124e350718d02f48dc7abb48cd3983687d881132"},
{file = "snowflake_connector_python-3.4.0-cp311-cp311-macosx_11_0_x86_64.whl", hash = "sha256:93145ea700e548a1b5f7612ed9bd597b49dae85d1914fee62be165d1e8a6bb4f"},
{file = "snowflake_connector_python-3.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed168e53cb0ef09c0788095833f22a1590effbb1eb9167ed21edcedeb4c9faeb"},
{file = "snowflake_connector_python-3.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dc7a6aa3b205022beb286cdaa157c2ca3017f2536fbd7d5b6bd6750dbd7861d1"},
{file = "snowflake_connector_python-3.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:432a8b4d0c4194e346eea0bc9329747bb5f6e1a771177d0c33f917d2aef7e421"},
{file = "snowflake_connector_python-3.4.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:4434d1fd0c49c509c631830ca8abf3a3319e90c6993024702a5835991e97946b"},
{file = "snowflake_connector_python-3.4.0-cp38-cp38-macosx_11_0_x86_64.whl", hash = "sha256:dcbeac81489ae6a9aac3eb4d35a05147ae8e346a6d95bd5d740b30bbf5342970"},
{file = "snowflake_connector_python-3.4.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0aea046fb928afb86ccbd80ef8a65398044217172ccf82627f00e63316f10832"},
{file = "snowflake_connector_python-3.4.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b1fdc6ce9e6c21969cf4f8365b4aa93cc1622e8b14caf4b26d9d61b5551eda0d"},
{file = "snowflake_connector_python-3.4.0-cp38-cp38-win_amd64.whl", hash = "sha256:4d7ed67213b68e21ff87ae39068926a81dfd1a5d1b84fd6707163050a7c98801"},
{file = "snowflake_connector_python-3.4.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:9cb928b9e04ab5e3681b4f4aeefe0b68c0137aefb4b7363d204a29cc7e8341de"},
{file = "snowflake_connector_python-3.4.0-cp39-cp39-macosx_11_0_x86_64.whl", hash = "sha256:843e74bba8c7e73c5d946b244df3d86ae691bb144ed73f9a9be77cdbb892769b"},
{file = "snowflake_connector_python-3.4.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d71dcd83c9b97216622dc465dca2ed3f0a7e9e736b979d8798daa282f8a53b08"},
{file = "snowflake_connector_python-3.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fb7281f2481b924192ea3939f49d766122b59f58d4a9339536f1d2c1a8f86bd7"},
{file = "snowflake_connector_python-3.4.0-cp39-cp39-win_amd64.whl", hash = "sha256:9488e54f1ac2fea80d2a8d94e10552f19eb88db00a21c67a13e0ac4c79ca9a0b"},
{file = "snowflake_connector_python-3.12.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:497a096fc379ef0846b2f1cf11a8d7620f0d090f08a77d9e93473845014d57d1"},
{file = "snowflake_connector_python-3.12.3-cp310-cp310-macosx_11_0_x86_64.whl", hash = "sha256:055c5808d524497213e4cc9ae91ec3e46cb8342b314e78bc3e139d733dc16741"},
{file = "snowflake_connector_python-3.12.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4a5dc512d62ef693041ed2ad82931231caddc16e14ffc2842da3e3dd4240b83d"},
{file = "snowflake_connector_python-3.12.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a46448f7279d444084eb84a9cddea67662e80ccfaddf41713b9e9aab2b1242e9"},
{file = "snowflake_connector_python-3.12.3-cp310-cp310-win_amd64.whl", hash = "sha256:821b774b77129ce9f03729456ac1f21d69fedb50e5ce957178131c7bb3d8279f"},
{file = "snowflake_connector_python-3.12.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:82290134978d11628026b447052219ce8d880e36937204f1f0332dfc3f2e92e9"},
{file = "snowflake_connector_python-3.12.3-cp311-cp311-macosx_11_0_x86_64.whl", hash = "sha256:20b5c8000ee9cee11b0f9a6ae26640f0d498ce77f7e2ec649a2f0d306523792d"},
{file = "snowflake_connector_python-3.12.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca6500d16bdbd37da88e589cc3e82b90272471d3aabfe4a79ec1cf4696675acf"},
{file = "snowflake_connector_python-3.12.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b455ba117a68da436e253899674fae1a93669eaefdde8a903c03eb65b7e87c86"},
{file = "snowflake_connector_python-3.12.3-cp311-cp311-win_amd64.whl", hash = "sha256:205219fcaeee2d33db5d0d023d60518e3bd8272ce1679be2199d7f362d255054"},
{file = "snowflake_connector_python-3.12.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3d830ca32c864b730cba5d92900d850752199635c4fb0ae0a70ee677f62aee70"},
{file = "snowflake_connector_python-3.12.3-cp312-cp312-macosx_11_0_x86_64.whl", hash = "sha256:597b0c74ec57ba693191ae2de8db9536e349ee32cab152df657473e498b6fd87"},
{file = "snowflake_connector_python-3.12.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2215d8a4c5e25ea0d2183fe693c3fdf058cd6035e5c84710d532dc04ab4ffd31"},
{file = "snowflake_connector_python-3.12.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f8ba9c261904c1ba7cae6035c7881224cf979da39c8b7c7cb10236fdfc57e505"},
{file = "snowflake_connector_python-3.12.3-cp312-cp312-win_amd64.whl", hash = "sha256:f0d0fcb948ef0812ab162ec9767622f345554043a07439c0c1a9474c86772320"},
{file = "snowflake_connector_python-3.12.3-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:fe742a0b2fb1c79a21e95b97c49a05783bc00314d1184d227c5fe5b57688af12"},
{file = "snowflake_connector_python-3.12.3-cp38-cp38-macosx_11_0_x86_64.whl", hash = "sha256:a8584a44a6bb41d2056cf1b833e629c76e28c5303d2c875c1a23bda46a1cd43a"},
{file = "snowflake_connector_python-3.12.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dd990db8e4886c32ba5c63758e8dc4814e2e75f5fd3fe79d43f7e5ee0fc46793"},
{file = "snowflake_connector_python-3.12.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4fe7f91f6e44bda877e77403a586d7487ca2c52dc1a32a705b2fea33f9c763a"},
{file = "snowflake_connector_python-3.12.3-cp38-cp38-win_amd64.whl", hash = "sha256:4994e95eff593dc44c28243ef0ae8d27b8b1aeb96dd64cbcea5bcf0e4dfb77fb"},
{file = "snowflake_connector_python-3.12.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:ac33a7dd54b35f94c4b91369971dbd6467a914dff4b01c46e77e7e6901d7eca4"},
{file = "snowflake_connector_python-3.12.3-cp39-cp39-macosx_11_0_x86_64.whl", hash = "sha256:a26876322811fe2b93f6d814dcfe016f1df680a12624026ecf57a6bcdf20f969"},
{file = "snowflake_connector_python-3.12.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6c0bb390be2e15b6b7cccab7fbe1ef94e1e9ab13790c974aa44761298cdc2641"},
{file = "snowflake_connector_python-3.12.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e7340f73af4ae72e6af8fe28a1b8e196a0c99943071afc96ce419efb4da80035"},
{file = "snowflake_connector_python-3.12.3-cp39-cp39-win_amd64.whl", hash = "sha256:c314749bd0151218b654a7d4646a39067ab650bdc86dfebb1884b056b0bdb4b4"},
{file = "snowflake_connector_python-3.12.3.tar.gz", hash = "sha256:02873c7f7a3b10322e28dddc2be6907f8ab8ecad93d6d6af14c77c2f53091b88"},
]
[package.dependencies]
@@ -4487,24 +4682,24 @@ asn1crypto = ">0.24.0,<2.0.0"
certifi = ">=2017.4.17"
cffi = ">=1.9,<2.0.0"
charset-normalizer = ">=2,<4"
cryptography = ">=3.1.0,<42.0.0"
cryptography = ">=3.1.0"
filelock = ">=3.5,<4"
idna = ">=2.5,<4"
packaging = "*"
platformdirs = ">=2.6.0,<4.0.0"
platformdirs = ">=2.6.0,<5.0.0"
pyjwt = "<3.0.0"
pyOpenSSL = ">=16.2.0,<24.0.0"
pyOpenSSL = ">=16.2.0,<25.0.0"
pytz = "*"
requests = "<3.0.0"
sortedcontainers = ">=2.4.0"
tomlkit = "*"
typing-extensions = ">=4.3,<5"
urllib3 = ">=1.21.1,<1.27"
urllib3 = {version = ">=1.21.1,<2.0.0", markers = "python_version < \"3.10\""}
[package.extras]
development = ["Cython", "coverage", "more-itertools", "numpy (<1.27.0)", "pendulum (!=2.1.1)", "pexpect", "pytest (<7.5.0)", "pytest-cov", "pytest-rerunfailures", "pytest-timeout", "pytest-xdist", "pytzdata"]
pandas = ["pandas (>=1.0.0,<2.1.0)", "pyarrow (>=10.0.1,<10.1.0)"]
secure-local-storage = ["keyring (!=16.1.0,<25.0.0)"]
pandas = ["pandas (>=1.0.0,<3.0.0)", "pyarrow"]
secure-local-storage = ["keyring (>=23.1.0,<26.0.0)"]
[[package]]
name = "sortedcontainers"
@@ -4592,31 +4787,29 @@ test = ["flake8 (>=2.4.0)", "isort (>=3.9.6)", "psycopg2 (>=2.4.6)", "pytest (>=
[[package]]
name = "sqlalchemy-utils"
version = "0.34.2"
version = "0.38.3"
description = "Various utility functions for SQLAlchemy."
optional = false
python-versions = "*"
python-versions = "~=3.6"
files = [
{file = "SQLAlchemy-Utils-0.34.2.tar.gz", hash = "sha256:6689b29d7951c5c7c4d79fa6b8c95f9ff9ec708b07aa53f82060599bd14dcc88"},
{file = "SQLAlchemy-Utils-0.38.3.tar.gz", hash = "sha256:9f9afba607a40455cf703adfa9846584bf26168a0c5a60a70063b70d65051f4d"},
{file = "SQLAlchemy_Utils-0.38.3-py3-none-any.whl", hash = "sha256:5c13b5d08adfaa85f3d4e8ec09a75136216fad41346980d02974a70a77988bf9"},
]
[package.dependencies]
six = "*"
SQLAlchemy = ">=1.0"
SQLAlchemy = ">=1.3"
[package.extras]
anyjson = ["anyjson (>=0.3.3)"]
arrow = ["arrow (>=0.3.4)"]
babel = ["Babel (>=1.3)"]
color = ["colour (>=0.0.4)"]
encrypted = ["cryptography (>=0.6)"]
enum = ["enum34"]
intervals = ["intervals (>=0.7.1)"]
ipaddress = ["ipaddr"]
password = ["passlib (>=1.6,<2.0)"]
pendulum = ["pendulum (>=2.0.5)"]
phone = ["phonenumbers (>=5.9.2)"]
test = ["Jinja2 (>=2.3)", "Pygments (>=1.2)", "docutils (>=0.10)", "flake8 (>=2.4.0)", "flexmock (>=0.9.7)", "isort (>=4.2.2)", "mock (==2.0.0)", "pg8000 (>=1.12.4)", "psycopg2 (>=2.5.1)", "pymysql", "pyodbc", "pytest (>=2.7.1)", "python-dateutil (>=2.6)", "pytz (>=2014.2)"]
test-all = ["Babel (>=1.3)", "Jinja2 (>=2.3)", "Pygments (>=1.2)", "anyjson (>=0.3.3)", "arrow (>=0.3.4)", "colour (>=0.0.4)", "cryptography (>=0.6)", "docutils (>=0.10)", "enum34", "flake8 (>=2.4.0)", "flexmock (>=0.9.7)", "furl (>=0.4.1)", "intervals (>=0.7.1)", "ipaddr", "isort (>=4.2.2)", "mock (==2.0.0)", "passlib (>=1.6,<2.0)", "pg8000 (>=1.12.4)", "phonenumbers (>=5.9.2)", "psycopg2 (>=2.5.1)", "pymysql", "pyodbc", "pytest (>=2.7.1)", "python-dateutil", "python-dateutil (>=2.6)", "pytz (>=2014.2)"]
test = ["Jinja2 (>=2.3)", "Pygments (>=1.2)", "backports.zoneinfo", "docutils (>=0.10)", "flake8 (>=2.4.0)", "flexmock (>=0.9.7)", "isort (>=4.2.2)", "pg8000 (>=1.12.4)", "psycopg2 (>=2.5.1)", "psycopg2cffi (>=2.8.1)", "pymysql", "pyodbc", "pytest (>=2.7.1)", "python-dateutil (>=2.6)", "pytz (>=2014.2)"]
test-all = ["Babel (>=1.3)", "Jinja2 (>=2.3)", "Pygments (>=1.2)", "arrow (>=0.3.4)", "backports.zoneinfo", "colour (>=0.0.4)", "cryptography (>=0.6)", "docutils (>=0.10)", "flake8 (>=2.4.0)", "flexmock (>=0.9.7)", "furl (>=0.4.1)", "intervals (>=0.7.1)", "isort (>=4.2.2)", "passlib (>=1.6,<2.0)", "pendulum (>=2.0.5)", "pg8000 (>=1.12.4)", "phonenumbers (>=5.9.2)", "psycopg2 (>=2.5.1)", "psycopg2cffi (>=2.8.1)", "pymysql", "pyodbc", "pytest (>=2.7.1)", "python-dateutil", "python-dateutil (>=2.6)", "pytz (>=2014.2)"]
timezone = ["python-dateutil"]
url = ["furl (>=0.4.1)"]
@@ -4795,13 +4988,13 @@ files = [
[[package]]
name = "tomlkit"
version = "0.12.3"
version = "0.13.0"
description = "Style preserving TOML library"
optional = false
python-versions = ">=3.7"
python-versions = ">=3.8"
files = [
{file = "tomlkit-0.12.3-py3-none-any.whl", hash = "sha256:b0a645a9156dc7cb5d3a1f0d4bab66db287fcb8e0430bdd4664a095ea16414ba"},
{file = "tomlkit-0.12.3.tar.gz", hash = "sha256:75baf5012d06501f07bee5bf8e801b9f343e7aac5a92581f20f80ce632e6b5a4"},
{file = "tomlkit-0.13.0-py3-none-any.whl", hash = "sha256:7075d3042d03b80f603482d69bf0c8f345c2b30e41699fd8883227f89972b264"},
{file = "tomlkit-0.13.0.tar.gz", hash = "sha256:08ad192699734149f5b97b45f1f18dad7eb1b6d16bc72ad0c2335772650d7b72"},
]
[[package]]
@@ -4894,13 +5087,13 @@ files = [
[[package]]
name = "urllib3"
version = "1.26.18"
version = "1.26.19"
description = "HTTP library with thread-safe connection pooling, file post, and more."
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*"
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
files = [
{file = "urllib3-1.26.18-py2.py3-none-any.whl", hash = "sha256:34b97092d7e0a3a8cf7cd10e386f401b3737364026c45e622aa02903dffe0f07"},
{file = "urllib3-1.26.18.tar.gz", hash = "sha256:f8ecc1bba5667413457c529ab955bf8c67b45db799d159066261719e328580a0"},
{file = "urllib3-1.26.19-py2.py3-none-any.whl", hash = "sha256:37a0344459b199fce0e80b0d3569837ec6b6937435c5244e7fd73fa6006830f3"},
{file = "urllib3-1.26.19.tar.gz", hash = "sha256:3e3d753a8618b86d7de333b4223005f68720bcd6a7d2bcb9fbd2229ec7c1e429"},
]
[package.extras]
@@ -4961,13 +5154,13 @@ six = ">=1.10.0"
[[package]]
name = "virtualenv"
version = "20.25.0"
version = "20.26.6"
description = "Virtual Python Environment builder"
optional = false
python-versions = ">=3.7"
files = [
{file = "virtualenv-20.25.0-py3-none-any.whl", hash = "sha256:4238949c5ffe6876362d9c0180fc6c3a824a7b12b80604eeb8085f2ed7460de3"},
{file = "virtualenv-20.25.0.tar.gz", hash = "sha256:bf51c0d9c7dd63ea8e44086fa1e4fb1093a31e963b86959257378aef020e1f1b"},
{file = "virtualenv-20.26.6-py3-none-any.whl", hash = "sha256:7345cc5b25405607a624d8418154577459c3e0277f5466dd79c49d5e492995f2"},
{file = "virtualenv-20.26.6.tar.gz", hash = "sha256:280aede09a2a5c317e409a00102e7077c6432c5a38f0ef938e643805a7ad2c48"},
]
[package.dependencies]
@@ -4976,7 +5169,7 @@ filelock = ">=3.12.2,<4"
platformdirs = ">=3.9.1,<5"
[package.extras]
docs = ["furo (>=2023.7.26)", "proselint (>=0.13)", "sphinx (>=7.1.2)", "sphinx-argparse (>=0.4)", "sphinxcontrib-towncrier (>=0.2.1a0)", "towncrier (>=23.6)"]
docs = ["furo (>=2023.7.26)", "proselint (>=0.13)", "sphinx (>=7.1.2,!=7.3)", "sphinx-argparse (>=0.4)", "sphinxcontrib-towncrier (>=0.2.1a0)", "towncrier (>=23.6)"]
test = ["covdefaults (>=2.3)", "coverage (>=7.2.7)", "coverage-enable-subprocess (>=1)", "flaky (>=3.7)", "packaging (>=23.1)", "pytest (>=7.4)", "pytest-env (>=0.8.2)", "pytest-freezer (>=0.4.8)", "pytest-mock (>=3.11.1)", "pytest-randomly (>=3.12)", "pytest-timeout (>=2.1)", "setuptools (>=68)", "time-machine (>=2.10)"]
[[package]]
@@ -5213,18 +5406,18 @@ docs = ["Sphinx", "elementpath (>=4.1.5,<5.0.0)", "jinja2", "sphinx-rtd-theme"]
[[package]]
name = "zipp"
version = "3.17.0"
version = "3.19.1"
description = "Backport of pathlib-compatible object wrapper for zip files"
optional = false
python-versions = ">=3.8"
files = [
{file = "zipp-3.17.0-py3-none-any.whl", hash = "sha256:0e923e726174922dce09c53c59ad483ff7bbb8e572e00c7f7c46b88556409f31"},
{file = "zipp-3.17.0.tar.gz", hash = "sha256:84e64a1c28cf7e91ed2078bb8cc8c259cb19b76942096c8d7b84947690cabaf0"},
{file = "zipp-3.19.1-py3-none-any.whl", hash = "sha256:2828e64edb5386ea6a52e7ba7cdb17bb30a73a858f5eb6eb93d8d36f5ea26091"},
{file = "zipp-3.19.1.tar.gz", hash = "sha256:35427f6d5594f4acf82d25541438348c26736fa9b3afa2754bcd63cdb99d8e8f"},
]
[package.extras]
docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (<7.2.5)", "sphinx (>=3.5)", "sphinx-lint"]
testing = ["big-O", "jaraco.functools", "jaraco.itertools", "more-itertools", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-ignore-flaky", "pytest-mypy (>=0.9.1)", "pytest-ruff"]
doc = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"]
test = ["big-O", "jaraco.functools", "jaraco.itertools", "jaraco.test", "more-itertools", "pytest (>=6,!=8.1.*)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-ignore-flaky", "pytest-mypy", "pytest-ruff (>=0.2.1)"]
[[package]]
name = "zope-event"
@@ -5300,4 +5493,4 @@ testing = ["coverage (>=5.0.3)", "zope.event", "zope.testing"]
[metadata]
lock-version = "2.0"
python-versions = ">=3.8,<3.11"
content-hash = "e7985ee5c3ca3a4389b4e85fda033a9b3b867dbbe4b4a7fca8ea5c35fc401148"
content-hash = "93b13c8a960e148463fba93cfd826c0f3e7bd822bbda55af7ba708baead293df"

View File

@@ -12,7 +12,7 @@ force-exclude = '''
[tool.poetry]
name = "redash"
version = "24.05.0-dev"
version = "25.03.0-dev"
description = "Make Your Company Data Driven. Connect to any data source, easily visualize, dashboard and share your data."
authors = ["Arik Fraimovich <arik@redash.io>"]
# to be added to/removed from the mailing list, please reach out to Arik via the above email or Discord
@@ -29,7 +29,7 @@ authlib = "0.15.5"
backoff = "2.2.1"
blinker = "1.6.2"
click = "8.1.3"
cryptography = "41.0.6"
cryptography = "43.0.1"
disposable-email-domains = ">=0.0.52"
flask = "2.3.2"
flask-limiter = "3.3.1"
@@ -46,7 +46,7 @@ greenlet = "2.0.2"
gunicorn = "22.0.0"
httplib2 = "0.19.0"
itsdangerous = "2.1.2"
jinja2 = "3.1.3"
jinja2 = "3.1.5"
jsonschema = "3.1.1"
markupsafe = "2.1.1"
maxminddb-geolite2 = "2018.703"
@@ -54,7 +54,7 @@ parsedatetime = "2.4"
passlib = "1.7.3"
psycopg2-binary = "2.9.6"
pyjwt = "2.4.0"
pyopenssl = "23.2.0"
pyopenssl = "24.2.1"
pypd = "1.1.0"
pysaml2 = "7.3.1"
pystache = "0.6.0"
@@ -64,27 +64,31 @@ pytz = ">=2019.3"
pyyaml = "6.0.1"
redis = "4.6.0"
regex = "2023.8.8"
requests = "2.31.0"
restrictedpython = "6.2"
requests = "2.32.3"
restrictedpython = "7.3"
rq = "1.16.1"
rq-scheduler = "0.13.1"
semver = "2.8.1"
sentry-sdk = "1.28.1"
sentry-sdk = "1.45.1"
sqlalchemy = "1.3.24"
sqlalchemy-searchable = "1.2.0"
sqlalchemy-utils = "0.34.2"
sqlalchemy-utils = "0.38.3"
sqlparse = "0.5.0"
sshtunnel = "0.1.5"
statsd = "3.3.0"
supervisor = "4.1.0"
supervisor-checks = "0.8.1"
ua-parser = "0.18.0"
urllib3 = "1.26.18"
urllib3 = "1.26.19"
user-agents = "2.0"
werkzeug = "2.3.8"
wtforms = "2.2.1"
xlsxwriter = "1.2.2"
tzlocal = "4.3.1"
pyodbc = "5.1.0"
debugpy = "^1.8.9"
paramiko = "3.4.1"
oracledb = "2.5.1"
[tool.poetry.group.all_ds]
optional = true
@@ -110,26 +114,25 @@ nzalchemy = "^11.0.2"
nzpy = ">=1.15"
oauth2client = "4.1.3"
openpyxl = "3.0.7"
oracledb = "2.1.2"
pandas = "1.3.4"
phoenixdb = "0.7"
pinotdb = ">=0.4.5"
protobuf = "3.20.2"
pyathena = ">=1.5.0,<=1.11.5"
pyathena = "2.25.2"
pydgraph = "2.0.2"
pydruid = "0.5.7"
pyexasol = "0.12.0"
pyhive = "0.6.1"
pyignite = "0.6.1"
pymongo = { version = "4.6.3", extras = ["srv", "tls"] }
pymssql = "2.2.8"
pyodbc = "4.0.28"
pymssql = "^2.3.1"
pyodbc = "5.1.0"
python-arango = "6.1.0"
python-rapidjson = "1.1.0"
python-rapidjson = "1.20"
requests-aws-sign = "0.1.5"
sasl = ">=0.1.3"
simple-salesforce = "0.74.3"
snowflake-connector-python = "3.4.0"
snowflake-connector-python = "3.12.3"
td-client = "1.0.0"
thrift = ">=0.8.0"
thrift-sasl = ">=0.1.0"
@@ -155,7 +158,6 @@ jwcrypto = "1.5.6"
mock = "5.0.2"
pre-commit = "3.3.3"
ptpython = "3.0.23"
ptvsd = "4.3.2"
pytest-cov = "4.1.0"
watchdog = "3.0.0"
ruff = "0.0.289"

View File

@@ -14,13 +14,14 @@ from redash.app import create_app # noqa
from redash.destinations import import_destinations
from redash.query_runner import import_query_runners
__version__ = "24.05.0-dev"
__version__ = "25.03.0-dev"
if os.environ.get("REMOTE_DEBUG"):
import ptvsd
import debugpy
ptvsd.enable_attach(address=("0.0.0.0", 5678))
debugpy.listen(("0.0.0.0", 5678))
debugpy.wait_for_client()
def setup_logging():

View File

@@ -36,10 +36,14 @@ def create_app():
from .metrics import request as request_metrics
from .models import db, users
from .utils import sentry
from .version_check import reset_new_version_status
sentry.init()
app = Redash()
# Check and update the cached version for use by the client
reset_new_version_status()
security.init_app(app)
request_metrics.init_app(app)
db.init_app(app)

View File

@@ -8,6 +8,7 @@ from redash import settings
try:
from ldap3 import Connection, Server
from ldap3.utils.conv import escape_filter_chars
except ImportError:
if settings.LDAP_LOGIN_ENABLED:
sys.exit(
@@ -69,6 +70,7 @@ def login(org_slug=None):
def auth_ldap_user(username, password):
clean_username = escape_filter_chars(username)
server = Server(settings.LDAP_HOST_URL, use_ssl=settings.LDAP_SSL)
if settings.LDAP_BIND_DN is not None:
conn = Connection(
@@ -83,7 +85,7 @@ def auth_ldap_user(username, password):
conn.search(
settings.LDAP_SEARCH_DN,
settings.LDAP_SEARCH_TEMPLATE % {"username": username},
settings.LDAP_SEARCH_TEMPLATE % {"username": clean_username},
attributes=[settings.LDAP_DISPLAY_NAME_KEY, settings.LDAP_EMAIL_KEY],
)

View File

@@ -5,6 +5,22 @@ from sqlalchemy.orm.exc import NoResultFound
manager = AppGroup(help="Queries management commands.")
@manager.command(name="rehash")
def rehash():
from redash import models
for q in models.Query.query.all():
old_hash = q.query_hash
q.update_query_hash()
new_hash = q.query_hash
if old_hash != new_hash:
print(f"Query {q.id} has changed hash from {old_hash} to {new_hash}")
models.db.session.add(q)
models.db.session.commit()
@manager.command(name="add_tag")
@argument("query_id")
@argument("tag")

View File

@@ -1,3 +1,5 @@
import html
import json
import logging
from copy import deepcopy
@@ -37,31 +39,83 @@ class Webex(BaseDestination):
@staticmethod
def formatted_attachments_template(subject, description, query_link, alert_link):
return [
{
"contentType": "application/vnd.microsoft.card.adaptive",
"content": {
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"type": "AdaptiveCard",
"version": "1.0",
"body": [
# Attempt to parse the description to find a 2D array
try:
# Extract the part of the description that looks like a JSON array
start_index = description.find("[")
end_index = description.rfind("]") + 1
json_array_str = description[start_index:end_index]
# Decode HTML entities
json_array_str = html.unescape(json_array_str)
# Replace single quotes with double quotes for valid JSON
json_array_str = json_array_str.replace("'", '"')
# Load the JSON array
data_array = json.loads(json_array_str)
# Check if it's a 2D array
if isinstance(data_array, list) and all(isinstance(i, list) for i in data_array):
# Create a table for the Adaptive Card
table_rows = []
for row in data_array:
table_rows.append(
{
"type": "ColumnSet",
"columns": [
{
"type": "Column",
"width": 4,
"items": [
{"type": "Column", "items": [{"type": "TextBlock", "text": str(item), "wrap": True}]}
for item in row
],
}
)
# Create the body of the card with the table
body = (
[
{
"type": "TextBlock",
"text": {subject},
"text": f"{subject}",
"weight": "bolder",
"size": "medium",
"wrap": True,
},
{
"type": "TextBlock",
"text": {description},
"text": f"{description[:start_index]}",
"isSubtle": True,
"wrap": True,
},
]
+ table_rows
+ [
{
"type": "TextBlock",
"text": f"Click [here]({query_link}) to check your query!",
"wrap": True,
"isSubtle": True,
},
{
"type": "TextBlock",
"text": f"Click [here]({alert_link}) to check your alert!",
"wrap": True,
"isSubtle": True,
},
]
)
else:
# Fallback to the original description if no valid 2D array is found
body = [
{
"type": "TextBlock",
"text": f"{subject}",
"weight": "bolder",
"size": "medium",
"wrap": True,
},
{
"type": "TextBlock",
"text": f"{description}",
"isSubtle": True,
"wrap": True,
},
@@ -77,11 +131,45 @@ class Webex(BaseDestination):
"wrap": True,
"isSubtle": True,
},
],
]
except json.JSONDecodeError:
# If parsing fails, fallback to the original description
body = [
{
"type": "TextBlock",
"text": f"{subject}",
"weight": "bolder",
"size": "medium",
"wrap": True,
},
],
}
],
{
"type": "TextBlock",
"text": f"{description}",
"isSubtle": True,
"wrap": True,
},
{
"type": "TextBlock",
"text": f"Click [here]({query_link}) to check your query!",
"wrap": True,
"isSubtle": True,
},
{
"type": "TextBlock",
"text": f"Click [here]({alert_link}) to check your alert!",
"wrap": True,
"isSubtle": True,
},
]
return [
{
"contentType": "application/vnd.microsoft.card.adaptive",
"content": {
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"type": "AdaptiveCard",
"version": "1.0",
"body": body,
},
}
]
@@ -116,6 +204,10 @@ class Webex(BaseDestination):
# destinations is guaranteed to be a comma-separated string
for destination_id in destinations.split(","):
destination_id = destination_id.strip() # Remove any leading or trailing whitespace
if not destination_id: # Check if the destination_id is empty or blank
continue # Skip to the next iteration if it's empty or blank
payload = deepcopy(template_payload)
payload[payload_tag] = destination_id
self.post_message(payload, headers)
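
The table formatting above works by pulling an encoded QUERY_RESULT_TABLE out of the alert description: the bracketed portion is HTML-unescaped, single quotes are normalized to double quotes, and the result is rendered as a card table only when it parses as a list of lists. A minimal standalone sketch of that detection step (extract_table is an illustrative name, not part of the change):

    import html
    import json

    def extract_table(description):
        # Slice out the bracketed portion of the description, if any.
        start, end = description.find("["), description.rfind("]") + 1
        if start == -1 or end == 0:
            return None
        candidate = html.unescape(description[start:end])
        # The encoded table uses single quotes, so normalize to valid JSON first.
        try:
            data = json.loads(candidate.replace("'", '"'))
        except json.JSONDecodeError:
            return None
        # Only treat the payload as a table when it is a 2D array.
        if isinstance(data, list) and all(isinstance(row, list) for row in data):
            return data
        return None

    print(extract_table("Rows: [['a', 1], ['b', 2]]"))  # [['a', 1], ['b', 2]]
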

View File

@@ -1,7 +1,7 @@
from flask import request
from funcy import project
from redash import models
from redash import models, utils
from redash.handlers.base import (
BaseResource,
get_object_or_404,
@@ -14,6 +14,10 @@ from redash.permissions import (
view_only,
)
from redash.serializers import serialize_alert
from redash.tasks.alerts import (
notify_subscriptions,
should_notify,
)
class AlertResource(BaseResource):
@@ -43,6 +47,21 @@ class AlertResource(BaseResource):
models.db.session.commit()
class AlertEvaluateResource(BaseResource):
def post(self, alert_id):
alert = get_object_or_404(models.Alert.get_by_id_and_org, alert_id, self.current_org)
require_admin_or_owner(alert.user.id)
new_state = alert.evaluate()
if should_notify(alert, new_state):
alert.state = new_state
alert.last_triggered_at = utils.utcnow()
models.db.session.commit()
notify_subscriptions(alert, new_state, {})
self.record_event({"action": "evaluate", "object_id": alert.id, "object_type": "alert"})
class AlertMuteResource(BaseResource):
def post(self, alert_id):
alert = get_object_or_404(models.Alert.get_by_id_and_org, alert_id, self.current_org)

View File

@@ -3,6 +3,7 @@ from flask_restful import Api
from werkzeug.wrappers import Response
from redash.handlers.alerts import (
AlertEvaluateResource,
AlertListResource,
AlertMuteResource,
AlertResource,
@@ -117,6 +118,7 @@ def json_representation(data, code, headers=None):
api.add_org_resource(AlertResource, "/api/alerts/<alert_id>", endpoint="alert")
api.add_org_resource(AlertMuteResource, "/api/alerts/<alert_id>/mute", endpoint="alert_mute")
api.add_org_resource(AlertEvaluateResource, "/api/alerts/<alert_id>/eval", endpoint="alert_eval")
api.add_org_resource(
AlertSubscriptionListResource,
"/api/alerts/<alert_id>/subscriptions",
@@ -236,11 +238,11 @@ api.add_org_resource(
)
api.add_org_resource(
QueryResultResource,
"/api/query_results/<result_id>.<filetype>",
"/api/query_results/<result_id>",
"/api/query_results/<query_result_id>.<filetype>",
"/api/query_results/<query_result_id>",
"/api/queries/<query_id>/results",
"/api/queries/<query_id>/results.<filetype>",
"/api/queries/<query_id>/results/<result_id>.<filetype>",
"/api/queries/<query_id>/results/<query_result_id>.<filetype>",
endpoint="query_result",
)
api.add_org_resource(

View File

@@ -15,6 +15,7 @@ from redash.authentication.account import (
)
from redash.handlers import routes
from redash.handlers.base import json_response, org_scoped_rule
from redash.version_check import get_latest_version
logger = logging.getLogger(__name__)
@@ -28,6 +29,7 @@ def get_google_auth_url(next_path):
def render_token_login_page(template, org_slug, token, invite):
error_message = None
try:
user_id = validate_token(token)
org = current_org._get_current_object()
@@ -39,19 +41,19 @@ def render_token_login_page(template, org_slug, token, invite):
user_id,
org_slug,
)
error_message = "Your invite link is invalid. Bad user id in token. Please ask for a new one."
except SignatureExpired:
logger.exception("Token signature has expired. Token: %s, org=%s", token, org_slug)
error_message = "Your invite link has expired. Please ask for a new one."
except BadSignature:
logger.exception("Bad signature for the token: %s, org=%s", token, org_slug)
error_message = "Your invite link is invalid. Bad signature. Please double-check the token."
if error_message:
return (
render_template(
"error.html",
error_message="Invalid invite link. Please ask for a new one.",
),
400,
)
except (SignatureExpired, BadSignature):
logger.exception("Failed to verify invite token: %s, org=%s", token, org_slug)
return (
render_template(
"error.html",
error_message="Your invite link has expired. Please ask for a new one.",
error_message=error_message,
),
400,
)
@@ -255,11 +257,15 @@ def number_format_config():
def client_config():
if not current_user.is_api_user() and current_user.is_authenticated:
client_config_inner = {
client_config = {
"newVersionAvailable": bool(get_latest_version()),
"version": __version__,
}
else:
client_config_inner = {}
client_config = {}
if current_user.has_permission("admin") and current_org.get_setting("beacon_consent") is None:
client_config["showBeaconConsentMessage"] = True
defaults = {
"allowScriptsInUserInput": settings.ALLOW_SCRIPTS_IN_USER_INPUT,
@@ -279,12 +285,12 @@ def client_config():
"tableCellMaxJSONSize": settings.TABLE_CELL_MAX_JSON_SIZE,
}
client_config_inner.update(defaults)
client_config_inner.update({"basePath": base_href()})
client_config_inner.update(date_time_format_config())
client_config_inner.update(number_format_config())
client_config.update(defaults)
client_config.update({"basePath": base_href()})
client_config.update(date_time_format_config())
client_config.update(number_format_config())
return client_config_inner
return client_config
def messages():

View File

@@ -7,13 +7,13 @@ from flask_restful import Resource, abort
from sqlalchemy import cast
from sqlalchemy.dialects.postgresql import ARRAY
from sqlalchemy.orm.exc import NoResultFound
from sqlalchemy_utils.functions import sort_query
from redash import settings
from redash.authentication import current_org
from redash.models import db
from redash.tasks import record_event as record_event_task
from redash.utils import json_dumps
from redash.utils.query_order import sort_query
routes = Blueprint("redash", __name__, template_folder=settings.fix_assets_path("templates"))

View File

@@ -5,7 +5,6 @@ import regex
from flask import make_response, request
from flask_login import current_user
from flask_restful import abort
from rq.job import JobStatus
from redash import models, settings
from redash.handlers.base import BaseResource, get_object_or_404, record_event
@@ -39,7 +38,7 @@ from redash.utils import (
def error_response(message, http_status=400):
return {"job": {"status": JobStatus.FAILED, "error": message}}, http_status
return {"job": {"status": 4, "error": message}}, http_status
error_messages = {
@@ -226,7 +225,7 @@ class QueryResultResource(BaseResource):
headers["Access-Control-Allow-Credentials"] = str(settings.ACCESS_CONTROL_ALLOW_CREDENTIALS).lower()
@require_any_of_permission(("view_query", "execute_query"))
def options(self, query_id=None, result_id=None, filetype="json"):
def options(self, query_id=None, query_result_id=None, filetype="json"):
headers = {}
self.add_cors_headers(headers)
@@ -286,12 +285,12 @@ class QueryResultResource(BaseResource):
return error_messages["no_permission"]
@require_any_of_permission(("view_query", "execute_query"))
def get(self, query_id=None, result_id=None, filetype="json"):
def get(self, query_id=None, query_result_id=None, filetype="json"):
"""
Retrieve query results.
:param number query_id: The ID of the query whose results should be fetched
:param number result_id: the ID of the query result to fetch
:param number query_result_id: the ID of the query result to fetch
:param string filetype: Format to return. One of 'json', 'xlsx', or 'csv'. Defaults to 'json'.
:<json number id: Query result ID
@@ -306,13 +305,13 @@ class QueryResultResource(BaseResource):
# This method handles two cases: retrieving result by id & retrieving result by query id.
# They need to be split, as they have different logic (for example, retrieving by query id
# should check for query parameters and shouldn't cache the result).
should_cache = result_id is not None
should_cache = query_result_id is not None
query_result = None
query = None
if result_id:
query_result = get_object_or_404(models.QueryResult.get_by_id_and_org, result_id, self.current_org)
if query_result_id:
query_result = get_object_or_404(models.QueryResult.get_by_id_and_org, query_result_id, self.current_org)
if query_id is not None:
query = get_object_or_404(models.Query.get_by_id_and_org, query_id, self.current_org)
@@ -347,7 +346,7 @@ class QueryResultResource(BaseResource):
event["object_id"] = query_id
else:
event["object_type"] = "query_result"
event["object_id"] = result_id
event["object_id"] = query_result_id
self.record_event(event)

View File

@@ -1,12 +1,13 @@
from flask import g, redirect, render_template, request, url_for
from flask_login import login_user
from wtforms import Form, PasswordField, StringField, validators
from wtforms import BooleanField, Form, PasswordField, StringField, validators
from wtforms.fields.html5 import EmailField
from redash import settings
from redash.authentication.org_resolving import current_org
from redash.handlers.base import routes
from redash.models import Group, Organization, User, db
from redash.tasks.general import subscribe
class SetupForm(Form):
@@ -14,6 +15,8 @@ class SetupForm(Form):
email = EmailField("Email Address", validators=[validators.Email()])
password = PasswordField("Password", validators=[validators.Length(6)])
org_name = StringField("Organization Name", validators=[validators.InputRequired()])
security_notifications = BooleanField()
newsletter = BooleanField()
def create_org(org_name, user_name, email, password):
@@ -54,6 +57,8 @@ def setup():
return redirect("/")
form = SetupForm(request.form)
form.newsletter.data = True
form.security_notifications.data = True
if request.method == "POST" and form.validate():
default_org, user = create_org(form.org_name.data, form.name.data, form.email.data, form.password.data)
@@ -61,6 +66,10 @@ def setup():
g.org = default_org
login_user(user)
# signup to newsletter if needed
if form.newsletter.data or form.security_notifications:
subscribe.delay(form.data)
return redirect(url_for("redash.index", org_slug=None))
return render_template("setup.html", form=form)

View File

@@ -5,7 +5,7 @@ from flask import g, has_request_context
from sqlalchemy.engine import Engine
from sqlalchemy.event import listens_for
from sqlalchemy.orm.util import _ORMJoin
from sqlalchemy.sql.selectable import Alias
from sqlalchemy.sql.selectable import Alias, Join
from redash import statsd_client
@@ -18,7 +18,7 @@ def _table_name_from_select_element(elt):
if isinstance(t, Alias):
t = t.original.froms[0]
while isinstance(t, _ORMJoin):
while isinstance(t, _ORMJoin) or isinstance(t, Join):
t = t.left
return t.name

View File

@@ -46,6 +46,7 @@ from redash.models.parameterized_query import (
QueryDetachedFromDataSourceError,
)
from redash.models.types import (
Configuration,
EncryptedConfiguration,
JSONText,
MutableDict,
@@ -227,16 +228,7 @@ class DataSource(BelongsToOrgMixin, db.Model):
def _sort_schema(self, schema):
return [
{
"name": i["name"],
"description": i.get("description"),
"columns": sorted(
i["columns"],
key=lambda col: (
("partition" in col["type"], col.get("idx", 0), col["name"]) if isinstance(col, dict) else col
),
),
}
{"name": i["name"], "columns": sorted(i["columns"], key=lambda x: x["name"] if isinstance(x, dict) else x)}
for i in sorted(schema, key=lambda x: x["name"])
]
@@ -395,6 +387,10 @@ class QueryResult(db.Model, BelongsToOrgMixin):
def should_schedule_next(previous_iteration, now, interval, time=None, day_of_week=None, failures=0):
# if previous_iteration is None, it means the query has never been run before
# so we should schedule it immediately
if previous_iteration is None:
return True
# if time exists then interval > 23 hours (82800s)
# if day_of_week exists then interval > 6 days (518400s)
if time is None:
@@ -610,6 +606,11 @@ class Query(ChangeTrackingMixin, TimestampMixin, BelongsToOrgMixin, db.Model):
if query.schedule.get("disabled"):
continue
# Skip queries that have None for all schedule values. It's unclear whether this is
# something that can happen in practice, but we have a test case for it.
if all(value is None for value in query.schedule.values()):
continue
if query.schedule["until"]:
schedule_until = pytz.utc.localize(datetime.datetime.strptime(query.schedule["until"], "%Y-%m-%d"))
@@ -621,7 +622,7 @@ class Query(ChangeTrackingMixin, TimestampMixin, BelongsToOrgMixin, db.Model):
)
if should_schedule_next(
retrieved_at or now,
retrieved_at,
now,
query.schedule["interval"],
query.schedule["time"],
@@ -907,6 +908,7 @@ def next_state(op, value, threshold):
# boolean value is Python specific and most likely will be confusing to
# users.
value = str(value).lower()
value_is_number = False
else:
try:
value = float(value)
@@ -924,6 +926,8 @@ def next_state(op, value, threshold):
if op(value, threshold):
new_state = Alert.TRIGGERED_STATE
elif not value_is_number and op not in [OPERATORS.get("!="), OPERATORS.get("=="), OPERATORS.get("equals")]:
new_state = Alert.UNKNOWN_STATE
else:
new_state = Alert.OK_STATE
@@ -935,6 +939,7 @@ class Alert(TimestampMixin, BelongsToOrgMixin, db.Model):
UNKNOWN_STATE = "unknown"
OK_STATE = "ok"
TRIGGERED_STATE = "triggered"
TEST_STATE = "test"
id = primary_key("Alert")
name = Column(db.String(255))
@@ -964,17 +969,38 @@ class Alert(TimestampMixin, BelongsToOrgMixin, db.Model):
return super(Alert, cls).get_by_id_and_org(object_id, org, Query)
def evaluate(self):
data = self.query_rel.latest_query_data.data
data = self.query_rel.latest_query_data.data if self.query_rel.latest_query_data else None
new_state = self.UNKNOWN_STATE
if data["rows"] and self.options["column"] in data["rows"][0]:
if data and data["rows"] and self.options["column"] in data["rows"][0]:
op = OPERATORS.get(self.options["op"], lambda v, t: False)
if "selector" not in self.options:
selector = "first"
else:
selector = self.options["selector"]
try:
if selector == "max":
max_val = float("-inf")
for i in range(len(data["rows"])):
max_val = max(max_val, float(data["rows"][i][self.options["column"]]))
value = max_val
elif selector == "min":
min_val = float("inf")
for i in range(len(data["rows"])):
min_val = min(min_val, float(data["rows"][i][self.options["column"]]))
value = min_val
else:
value = data["rows"][0][self.options["column"]]
except ValueError:
return self.UNKNOWN_STATE
threshold = self.options["value"]
if value is not None:
new_state = next_state(op, value, threshold)
else:
new_state = self.UNKNOWN_STATE
return new_state
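
The selector branch above reduces the result rows to a single value before the threshold comparison: "min" and "max" scan every row as floats, while the default "first" takes the first row's value unchanged (a ValueError from the float conversion short-circuits to the unknown state). A small sketch of the same selection logic, with select_value as an illustrative helper name:

    def select_value(rows, column, selector="first"):
        # Mirrors the min/max/first branches in Alert.evaluate above.
        if selector == "max":
            return max(float(row[column]) for row in rows)
        if selector == "min":
            return min(float(row[column]) for row in rows)
        return rows[0][column]

    rows = [{"errors": "3"}, {"errors": "7"}, {"errors": "5"}]
    print(select_value(rows, "errors", "max"))    # 7.0
    print(select_value(rows, "errors", "first"))  # 3
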
@@ -997,11 +1023,11 @@ class Alert(TimestampMixin, BelongsToOrgMixin, db.Model):
result_table = [] # A two-dimensional array which can rendered as a table in Mustache
for row in data["rows"]:
result_table.append([row[col["name"]] for col in data["columns"]])
context = {
"ALERT_NAME": self.name,
"ALERT_URL": "{host}/alerts/{alert_id}".format(host=host, alert_id=self.id),
"ALERT_STATUS": self.state.upper(),
"ALERT_SELECTOR": self.options["selector"],
"ALERT_CONDITION": self.options["op"],
"ALERT_THRESHOLD": self.options["value"],
"QUERY_NAME": self.query_rel.name,

View File

@@ -1,3 +1,4 @@
import re
from functools import partial
from numbers import Number
@@ -88,6 +89,16 @@ def _is_number(string):
return True
def _is_regex_pattern(value, regex):
try:
if re.compile(regex).fullmatch(value):
return True
else:
return False
except re.error:
return False
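
_is_regex_pattern backs the new "text-pattern" parameter type: a value passes only when the definition's regex matches it in full, and an invalid regex never raises. Illustrative calls (not part of the change):

    print(_is_regex_pattern("2024-01-31", r"\d{4}-\d{2}-\d{2}"))  # True
    print(_is_regex_pattern("31/01/2024", r"\d{4}-\d{2}-\d{2}"))  # False
    print(_is_regex_pattern("anything", "(unbalanced"))           # False: invalid regex
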
def _is_date(string):
parse(string)
return True
@@ -135,6 +146,7 @@ class ParameterizedQuery:
enum_options = definition.get("enumOptions")
query_id = definition.get("queryId")
regex = definition.get("regex")
allow_multiple_values = isinstance(definition.get("multiValuesOptions"), dict)
if isinstance(enum_options, str):
@@ -142,6 +154,7 @@ class ParameterizedQuery:
validators = {
"text": lambda value: isinstance(value, str),
"text-pattern": lambda value: _is_regex_pattern(value, regex),
"number": _is_number,
"enum": lambda value: _is_value_within_options(value, enum_options, allow_multiple_values),
"query": lambda value: _is_value_within_options(

View File

@@ -3,10 +3,21 @@ from sqlalchemy.ext.mutable import Mutable
from sqlalchemy.types import TypeDecorator
from sqlalchemy_utils import EncryptedType
from redash.models.base import db
from redash.utils import json_dumps, json_loads
from redash.utils.configuration import ConfigurationContainer
from .base import db
class Configuration(TypeDecorator):
impl = db.Text
def process_bind_param(self, value, dialect):
return value.to_json()
def process_result_value(self, value, dialect):
return ConfigurationContainer.from_json(value)
class EncryptedConfiguration(EncryptedType):
def process_bind_param(self, value, dialect):

View File

@@ -166,7 +166,7 @@ class User(TimestampMixin, db.Model, BelongsToOrgMixin, UserMixin, PermissionsCh
if self._profile_image_url:
return self._profile_image_url
email_md5 = hashlib.md5(self.email.lower().encode()).hexdigest()
email_md5 = hashlib.md5(self.email.lower().encode(), usedforsecurity=False).hexdigest()
return "https://www.gravatar.com/avatar/{}?s=40&d=identicon".format(email_md5)
@property
@@ -233,7 +233,9 @@ class User(TimestampMixin, db.Model, BelongsToOrgMixin, UserMixin, PermissionsCh
return AccessPermission.exists(obj, access_type, grantee=self)
def get_id(self):
identity = hashlib.md5("{},{}".format(self.email, self.password_hash).encode()).hexdigest()
identity = hashlib.md5(
"{},{}".format(self.email, self.password_hash).encode(), usedforsecurity=False
).hexdigest()
return "{0}-{1}".format(self.id, identity)
def get_actual_user(self):

View File

@@ -59,7 +59,7 @@ def get_status():
def rq_job_ids():
queues = Queue.all(connection=redis_connection)
queues = Queue.all(connection=rq_redis_connection)
started_jobs = [StartedJobRegistry(queue=q).get_job_ids() for q in queues]
queued_jobs = [q.job_ids for q in queues]

View File

@@ -21,9 +21,7 @@ OPTIONAL_CREDENTIALS = parse_boolean(os.environ.get("ATHENA_OPTIONAL_CREDENTIALS
try:
import boto3
import pandas as pd
import pyathena
from pyathena.pandas_cursor import PandasCursor
enabled = True
except ImportError:
@@ -78,6 +76,10 @@ class Athena(BaseQueryRunner):
"default": "default",
},
"glue": {"type": "boolean", "title": "Use Glue Data Catalog"},
"catalog_ids": {
"type": "string",
"title": "Enter Glue Data Catalog IDs, separated by commas (leave blank for default catalog)",
},
"work_group": {
"type": "string",
"title": "Athena Work Group",
@@ -88,15 +90,26 @@ class Athena(BaseQueryRunner):
"title": "Athena cost per Tb scanned (USD)",
"default": 5,
},
"result_reuse_enable": {
"type": "boolean",
"title": "Reuse Athena query results",
},
"result_reuse_minutes": {
"type": "number",
"title": "Minutes to reuse Athena query results",
"default": 60,
},
},
"required": ["region", "s3_staging_dir"],
"extra_options": ["glue", "cost_per_tb"],
"extra_options": ["glue", "catalog_ids", "cost_per_tb", "result_reuse_enable", "result_reuse_minutes"],
"order": [
"region",
"s3_staging_dir",
"schema",
"work_group",
"cost_per_tb",
"result_reuse_enable",
"result_reuse_minutes",
],
"secret": ["aws_secret_key"],
}
@@ -174,60 +187,53 @@ class Athena(BaseQueryRunner):
"region_name": self.configuration["region"],
}
def __get_schema_from_glue(self):
def __get_schema_from_glue(self, catalog_id=""):
client = boto3.client("glue", **self._get_iam_credentials())
schema = {}
database_paginator = client.get_paginator("get_databases")
table_paginator = client.get_paginator("get_tables")
for databases in database_paginator.paginate():
databases_iterator = database_paginator.paginate(
**({"CatalogId": catalog_id} if catalog_id != "" else {}),
)
for databases in databases_iterator:
for database in databases["DatabaseList"]:
iterator = table_paginator.paginate(DatabaseName=database["Name"])
iterator = table_paginator.paginate(
DatabaseName=database["Name"],
**({"CatalogId": catalog_id} if catalog_id != "" else {}),
)
for table in iterator.search("TableList[]"):
table_name = "%s.%s" % (database["Name"], table["Name"])
if "StorageDescriptor" not in table:
logger.warning("Glue table doesn't have StorageDescriptor: %s", table_name)
continue
if table_name not in schema:
columns = []
for cols in table["StorageDescriptor"]["Columns"]:
c = {
"name": cols["Name"],
}
if "Type" in cols:
c["type"] = cols["Type"]
if "Comment" in cols:
c["comment"] = cols["Comment"]
columns.append(c)
schema[table_name] = {"name": table_name, "columns": []}
schema[table_name] = {
"name": table_name,
"columns": columns,
"description": table.get("Description"),
for column_data in table["StorageDescriptor"]["Columns"]:
column = {
"name": column_data["Name"],
"type": column_data["Type"] if "Type" in column_data else None,
}
for idx, partition in enumerate(table.get("PartitionKeys", [])):
schema[table_name]["columns"].append(
{
schema[table_name]["columns"].append(column)
for partition in table.get("PartitionKeys", []):
partition_column = {
"name": partition["Name"],
"type": "partition",
"idx": idx,
"type": partition["Type"] if "Type" in partition else None,
}
)
if "Type" in partition:
_type = partition["Type"]
c["type"] = f"partition ({_type})"
if "Comment" in partition:
c["comment"] = partition["Comment"]
schema[table_name]["columns"].append(partition_column)
return list(schema.values())
def get_schema(self, get_stats=False):
if self.configuration.get("glue", False):
return self.__get_schema_from_glue()
catalog_ids = [id.strip() for id in self.configuration.get("catalog_ids", "").split(",")]
return sum([self.__get_schema_from_glue(catalog_id) for catalog_id in catalog_ids], [])
schema = {}
query = """
SELECT table_schema, table_name, column_name
SELECT table_schema, table_name, column_name, data_type
FROM information_schema.columns
WHERE table_schema NOT IN ('information_schema')
"""
@@ -240,7 +246,7 @@ class Athena(BaseQueryRunner):
table_name = "{0}.{1}".format(row["table_schema"], row["table_name"])
if table_name not in schema:
schema[table_name] = {"name": table_name, "columns": []}
schema[table_name]["columns"].append(row["column_name"])
schema[table_name]["columns"].append({"name": row["column_name"], "type": row["data_type"]})
return list(schema.values())
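
With the new catalog_ids setting, the Glue branch of get_schema splits the comma-separated value and concatenates one schema listing per catalog (a blank setting yields a single empty id, i.e. the default catalog). A rough sketch of that fan-out, assuming a configuration dict shaped like the runner's:

    configuration = {"glue": True, "catalog_ids": "111111111111, 222222222222"}

    catalog_ids = [cid.strip() for cid in configuration.get("catalog_ids", "").split(",")]
    print(catalog_ids)  # ['111111111111', '222222222222']

    # Each id would then be passed to __get_schema_from_glue and the per-catalog
    # lists flattened with sum(..., []), as in get_schema above.
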
@@ -252,7 +258,8 @@ class Athena(BaseQueryRunner):
kms_key=self.configuration.get("kms_key", None),
work_group=self.configuration.get("work_group", "primary"),
formatter=SimpleFormatter(),
cursor_class=PandasCursor,
result_reuse_enable=self.configuration.get("result_reuse_enable", False),
result_reuse_minutes=self.configuration.get("result_reuse_minutes", 60),
**self._get_iam_credentials(user=user),
).cursor()
@@ -260,8 +267,7 @@ class Athena(BaseQueryRunner):
cursor.execute(query)
column_tuples = [(i[0], _TYPE_MAPPINGS.get(i[1], None)) for i in cursor.description]
columns = self.fetch_columns(column_tuples)
df = cursor.as_pandas().replace({pd.NA: None})
rows = df.to_dict(orient="records")
rows = [dict(zip(([c["name"] for c in columns]), r)) for i, r in enumerate(cursor.fetchall())]
qbytes = None
athena_query_id = None
try:

View File

@@ -7,6 +7,7 @@ from base64 import b64decode
from redash import settings
from redash.query_runner import (
TYPE_BOOLEAN,
TYPE_DATE,
TYPE_DATETIME,
TYPE_FLOAT,
TYPE_INTEGER,
@@ -37,6 +38,8 @@ types_map = {
"BOOLEAN": TYPE_BOOLEAN,
"STRING": TYPE_STRING,
"TIMESTAMP": TYPE_DATETIME,
"DATETIME": TYPE_DATETIME,
"DATE": TYPE_DATE,
}
@@ -301,7 +304,7 @@ class BigQuery(BaseQueryRunner):
datasets = self._get_project_datasets(project_id)
query_base = """
SELECT table_schema, table_name, field_path
SELECT table_schema, table_name, field_path, data_type
FROM `{dataset_id}`.INFORMATION_SCHEMA.COLUMN_FIELD_PATHS
WHERE table_schema NOT IN ('information_schema')
"""
@@ -322,7 +325,7 @@ class BigQuery(BaseQueryRunner):
table_name = "{0}.{1}".format(row["table_schema"], row["table_name"])
if table_name not in schema:
schema[table_name] = {"name": table_name, "columns": []}
schema[table_name]["columns"].append(row["field_path"])
schema[table_name]["columns"].append({"name": row["field_path"], "type": row["data_type"]})
return list(schema.values())
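The BigQuery runner now selects `data_type` from `INFORMATION_SCHEMA.COLUMN_FIELD_PATHS` and stores each column as a name/type pair rather than a bare `field_path`, matching the typed schema-browser entries added elsewhere in this compare. A small sketch of the resulting structure, using made-up rows in place of a real query result:

```
def build_bigquery_schema(rows):
    # Each row mimics a record from INFORMATION_SCHEMA.COLUMN_FIELD_PATHS,
    # now including data_type as selected in the diff above.
    schema = {}
    for row in rows:
        table_name = "{0}.{1}".format(row["table_schema"], row["table_name"])
        if table_name not in schema:
            schema[table_name] = {"name": table_name, "columns": []}
        schema[table_name]["columns"].append({"name": row["field_path"], "type": row["data_type"]})
    return list(schema.values())


rows = [
    {"table_schema": "analytics", "table_name": "events", "field_path": "user.id", "data_type": "STRING"},
    {"table_schema": "analytics", "table_name": "events", "field_path": "ts", "data_type": "TIMESTAMP"},
]
print(build_bigquery_schema(rows))
```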

View File

@@ -91,8 +91,8 @@ class BaseElasticSearch(BaseQueryRunner):
logger.setLevel(logging.DEBUG)
self.server_url = self.configuration["server"]
if self.server_url[-1] == "/":
self.server_url = self.configuration.get("server", "")
if self.server_url and self.server_url[-1] == "/":
self.server_url = self.server_url[:-1]
basic_auth_user = self.configuration.get("basic_auth_user", None)
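The Elasticsearch change above reads the option with `configuration.get("server", "")` and guards the trailing-slash strip, so a missing or empty server URL no longer raises `KeyError` or `IndexError` (`""[-1]`) while the runner is being constructed. A standalone illustration of the same normalization (the helper name is ours):

```
def normalize_server_url(configuration: dict) -> str:
    # Mirrors the guarded form in the diff: tolerate a missing or empty value,
    # strip at most one trailing slash.
    server_url = configuration.get("server", "")
    if server_url and server_url[-1] == "/":
        server_url = server_url[:-1]
    return server_url


print(normalize_server_url({"server": "http://es:9200/"}))  # http://es:9200
print(normalize_server_url({}))                             # "" (no exception)
```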

View File

@@ -1,3 +1,4 @@
import json
import logging
from typing import Optional, Tuple
@@ -64,6 +65,7 @@ class ElasticSearch2(BaseHTTPQueryRunner):
return data, error
def _build_query(self, query: str) -> Tuple[dict, str, Optional[list]]:
query = json.loads(query)
index_name = query.pop("index", "")
result_fields = query.pop("result_fields", None)
url = "/{}/_search".format(index_name)
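With `json` imported at module level, `_build_query` parses the raw query text and pops the runner's control keys out of it before the remaining body is sent to the `_search` endpoint. A hedged sketch of that parsing step as a free function; the sample query is illustrative:

```
import json
from typing import Optional, Tuple


def build_query(query_text: str) -> Tuple[dict, str, Optional[list]]:
    # Mirrors the diff: "index" and "result_fields" are control keys for the
    # runner; everything else is passed through as the Elasticsearch body.
    query = json.loads(query_text)
    index_name = query.pop("index", "")
    result_fields = query.pop("result_fields", None)
    url = "/{}/_search".format(index_name)
    return query, url, result_fields


body, url, fields = build_query('{"index": "logs-*", "result_fields": ["host"], "query": {"match_all": {}}}')
print(url)     # /logs-*/_search
print(body)    # {'query': {'match_all': {}}}
print(fields)  # ['host']
```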

File diff suppressed because it is too large.


View File

@@ -117,6 +117,7 @@ def parse_results(results: list, flatten: bool = False) -> list:
parsed_row = _parse_dict(row, flatten)
for column_name, value in parsed_row.items():
if _get_column_by_name(columns, column_name) is None:
columns.append(
{
"name": column_name,
@@ -130,6 +131,17 @@ def parse_results(results: list, flatten: bool = False) -> list:
return rows, columns
def _sorted_fields(fields):
ord = {}
for k, v in fields.items():
if isinstance(v, int):
ord[k] = v
else:
ord[k] = len(fields)
return sorted(ord, key=ord.get)
class MongoDB(BaseQueryRunner):
should_annotate_query = False
@@ -176,7 +188,7 @@ class MongoDB(BaseQueryRunner):
self.syntax = "json"
self.db_name = self.configuration["dbName"]
self.db_name = self.configuration.get("dbName", "")
self.is_replica_set = (
True if "replicaSetName" in self.configuration and self.configuration["replicaSetName"] else False
@@ -364,7 +376,7 @@ class MongoDB(BaseQueryRunner):
if f:
ordered_columns = []
for k in sorted(f, key=f.get):
for k in _sorted_fields(f):
column = _get_column_by_name(columns, k)
if column:
ordered_columns.append(column)
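The new `_sorted_fields` helper replaces `sorted(f, key=f.get)`, which raises `TypeError` in Python 3 when a projection mixes integer flags with expression values; non-integer values are ranked after every explicit integer position, and ties keep insertion order because Python's sort is stable. For example (the helper body is copied from the diff above so the snippet runs on its own; the sample projection is made up):

```
def _sorted_fields(fields):
    # Non-integer projection values (e.g. aggregation expressions) are ranked
    # after every explicit integer position; ties keep insertion order.
    ord = {}
    for k, v in fields.items():
        if isinstance(v, int):
            ord[k] = v
        else:
            ord[k] = len(fields)
    return sorted(ord, key=ord.get)


fields = {"date": 1, "total": {"$sum": "$amount"}, "name": 2}
print(_sorted_fields(fields))  # ['date', 'name', 'total']
```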

View File

@@ -152,7 +152,7 @@ class Mysql(BaseSQLQueryRunner):
col.table_name as table_name,
col.column_name as column_name
FROM `information_schema`.`columns` col
WHERE col.table_schema NOT IN ('information_schema', 'performance_schema', 'mysql', 'sys');
WHERE LOWER(col.table_schema) NOT IN ('information_schema', 'performance_schema', 'mysql', 'sys');
"""
results, error = self.run_query(query, None)

View File

@@ -108,8 +108,6 @@ def build_schema(query_result, schema):
column = row["column_name"]
if row.get("data_type") is not None:
column = {"name": row["column_name"], "type": row["data_type"]}
if "column_comment" in row:
column["comment"] = row["column_comment"]
schema[table_name]["columns"].append(column)
@@ -224,9 +222,7 @@ class PostgreSQL(BaseSQLQueryRunner):
SELECT s.nspname as table_schema,
c.relname as table_name,
a.attname as column_name,
null as data_type,
null as column_comment,
null as idx
null as data_type
FROM pg_class c
JOIN pg_namespace s
ON c.relnamespace = s.oid
@@ -235,23 +231,17 @@ class PostgreSQL(BaseSQLQueryRunner):
ON a.attrelid = c.oid
AND a.attnum > 0
AND NOT a.attisdropped
WHERE c.relkind IN ('m', 'f', 'p') AND has_table_privilege(s.nspname || '.' || c.relname, 'select')
WHERE c.relkind IN ('m', 'f', 'p')
AND has_table_privilege(s.nspname || '.' || c.relname, 'select')
AND has_schema_privilege(s.nspname, 'usage')
UNION
SELECT table_schema,
table_name,
column_name,
data_type,
pgd.description,
isc.ordinal_position
FROM information_schema.columns as isc
LEFT JOIN pg_catalog.pg_statio_all_tables as st
ON isc.table_schema = st.schemaname
AND isc.table_name = st.relname
LEFT JOIN pg_catalog.pg_description pgd
ON pgd.objoid=st.relid
AND pgd.objsubid=isc.ordinal_position
data_type
FROM information_schema.columns
WHERE table_schema NOT IN ('pg_catalog', 'information_schema')
"""
@@ -398,12 +388,13 @@ class Redshift(PostgreSQL):
SELECT DISTINCT table_name,
table_schema,
column_name,
data_type,
ordinal_position AS pos
FROM svv_columns
WHERE table_schema NOT IN ('pg_internal','pg_catalog','information_schema')
AND table_schema NOT LIKE 'pg_temp_%'
)
SELECT table_name, table_schema, column_name
SELECT table_name, table_schema, column_name, data_type
FROM tables
WHERE
HAS_SCHEMA_PRIVILEGE(table_schema, 'USAGE') AND

View File

@@ -90,7 +90,9 @@ def create_tables_from_query_ids(user, connection, query_ids, query_params, cach
for query in set(query_params):
results = get_query_results(user, query[0], False, query[1])
table_hash = hashlib.md5("query_{query}_{hash}".format(query=query[0], hash=query[1]).encode()).hexdigest()
table_hash = hashlib.md5(
"query_{query}_{hash}".format(query=query[0], hash=query[1]).encode(), usedforsecurity=False
).hexdigest()
table_name = "query_{query_id}_{param_hash}".format(query_id=query[0], param_hash=table_hash)
create_table(connection, table_name, results)
@@ -142,7 +144,9 @@ def create_table(connection, table_name, query_results):
def prepare_parameterized_query(query, query_params):
for params in query_params:
table_hash = hashlib.md5("query_{query}_{hash}".format(query=params[0], hash=params[1]).encode()).hexdigest()
table_hash = hashlib.md5(
"query_{query}_{hash}".format(query=params[0], hash=params[1]).encode(), usedforsecurity=False
).hexdigest()
key = "param_query_{query_id}_{{{param_string}}}".format(query_id=params[0], param_string=params[1])
value = "query_{query_id}_{param_hash}".format(query_id=params[0], param_hash=table_hash)
query = query.replace(key, value)
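Both hashing sites now pass `usedforsecurity=False`, a flag `hashlib.md5` accepts on Python 3.9+; it marks the digest as non-cryptographic so the query_results runner keeps working on FIPS-restricted builds where plain MD5 construction is rejected. The derived table name looks like this (the query id and parameter hash below are made up):

```
import hashlib


def cached_table_name(query_id: int, params_hash: str) -> str:
    # Same construction as the diff: an MD5 digest flagged as non-security
    # use, embedded in the temporary table name.
    table_hash = hashlib.md5(
        "query_{query}_{hash}".format(query=query_id, hash=params_hash).encode(), usedforsecurity=False
    ).hexdigest()
    return "query_{query_id}_{param_hash}".format(query_id=query_id, param_hash=table_hash)


print(cached_table_name(42, "d41d8cd9"))
```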

View File

@@ -55,12 +55,13 @@ class Script(BaseQueryRunner):
def __init__(self, configuration):
super(Script, self).__init__(configuration)
path = self.configuration.get("path", "")
# If path is * allow any execution path
if self.configuration["path"] == "*":
if path == "*":
return
# Poor man's protection against running scripts from outside the scripts directory
if self.configuration["path"].find("../") > -1:
if path.find("../") > -1:
raise ValueError("Scripts can only be run from the configured scripts directory")
def test_connection(self):
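Reading the option once via `configuration.get("path", "")` means a missing path no longer raises `KeyError`, while the existing checks are kept: `*` allows any execution path and any `../` component is rejected. A standalone sketch of the same guard (the function name is ours):

```
def validate_script_path(path: str) -> None:
    # Mirrors the diff: "*" disables the check entirely; otherwise refuse
    # anything that tries to escape the configured scripts directory.
    if path == "*":
        return
    if path.find("../") > -1:
        raise ValueError("Scripts can only be run from the configured scripts directory")


validate_script_path("/opt/redash/scripts")  # ok
validate_script_path("*")                    # ok, wildcard
try:
    validate_script_path("/opt/redash/../etc")
except ValueError as exc:
    print(exc)
```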

View File

@@ -28,7 +28,7 @@ class Sqlite(BaseSQLQueryRunner):
def __init__(self, configuration):
super(Sqlite, self).__init__(configuration)
self._dbpath = self.configuration["dbpath"]
self._dbpath = self.configuration.get("dbpath", "")
def _get_tables(self, schema):
query_table = "select tbl_name from sqlite_master where type='table'"

View File

@@ -7,7 +7,6 @@ separation of concerns.
from flask_login import current_user
from funcy import project
from rq.job import JobStatus
from rq.results import Result
from rq.timeouts import JobTimeoutException
from redash import models
@@ -272,19 +271,38 @@ class DashboardSerializer(Serializer):
def serialize_job(job):
# TODO: this is mapping to the old Job class statuses. Need to update the client side and remove this
STATUSES = {
JobStatus.QUEUED: 1,
JobStatus.STARTED: 2,
JobStatus.FINISHED: 3,
JobStatus.FAILED: 4,
JobStatus.CANCELED: 5,
JobStatus.DEFERRED: 6,
JobStatus.SCHEDULED: 7,
}
job_status = job.get_status()
if job.is_started:
updated_at = job.started_at or 0
else:
updated_at = 0
status = job.get_status()
error = result_id = None
job_result = job.latest_result()
if job_result:
if job_result.type == Result.Type.SUCCESSFUL:
result_id = job_result.return_value
status = STATUSES[job_status]
result = query_result_id = None
if job.is_cancelled:
error = "Query cancelled by user."
status = 4
elif isinstance(job.result, Exception):
error = str(job.result)
status = 4
elif isinstance(job.result, dict) and "error" in job.result:
error = job.result["error"]
status = 4
else:
error = job_result.exc_string
error = ""
result = query_result_id = job.result
return {
"job": {
@@ -292,6 +310,7 @@ def serialize_job(job):
"updated_at": updated_at,
"status": status,
"error": error,
"result_id": result_id,
"result": result,
"query_result_id": query_result_id,
}
}
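Whichever side of this hunk is in effect, `serialize_job` still has to collapse RQ's string job statuses into the numeric codes the legacy Job API exposed to the frontend (1 queued, 2 started, 3 finished, 4 failed, per the STATUSES table above), reporting any error path as the failed code. A minimal sketch of that mapping using plain status strings; the helper is ours, not Redash's:

```
# Legacy numeric codes expected by the frontend, per the STATUSES table above.
LEGACY_STATUS = {
    "queued": 1,
    "started": 2,
    "finished": 3,
    "failed": 4,
    "canceled": 5,
    "deferred": 6,
    "scheduled": 7,
}


def legacy_status(rq_status: str, error: str = "") -> int:
    # Any error (cancellation, exception, {"error": ...} payload) is reported
    # as the legacy "failed" code, matching the branches in the diff.
    if error:
        return 4
    return LEGACY_STATUS.get(rq_status, 1)


print(legacy_status("finished"))                              # 3
print(legacy_status("started", "Query cancelled by user."))   # 4
```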

View File

@@ -50,6 +50,7 @@ QUERY_RESULTS_EXPIRED_TTL_ENABLED = parse_boolean(os.environ.get("REDASH_QUERY_R
QUERY_RESULTS_EXPIRED_TTL = int(os.environ.get("REDASH_QUERY_RESULTS_EXPIRED_TTL", "86400"))
SCHEMAS_REFRESH_SCHEDULE = int(os.environ.get("REDASH_SCHEMAS_REFRESH_SCHEDULE", 30))
SCHEMAS_REFRESH_TIMEOUT = int(os.environ.get("REDASH_SCHEMAS_REFRESH_TIMEOUT", 300))
AUTH_TYPE = os.environ.get("REDASH_AUTH_TYPE", "api_key")
INVITATION_TOKEN_MAX_AGE = int(os.environ.get("REDASH_INVITATION_TOKEN_MAX_AGE", 60 * 60 * 24 * 7))
@@ -412,6 +413,7 @@ PAGE_SIZE_OPTIONS = list(
TABLE_CELL_MAX_JSON_SIZE = int(os.environ.get("REDASH_TABLE_CELL_MAX_JSON_SIZE", 50000))
# Features:
VERSION_CHECK = parse_boolean(os.environ.get("REDASH_VERSION_CHECK", "true"))
FEATURE_DISABLE_REFRESH_QUERIES = parse_boolean(os.environ.get("REDASH_FEATURE_DISABLE_REFRESH_QUERIES", "false"))
FEATURE_SHOW_QUERY_RESULTS_COUNT = parse_boolean(os.environ.get("REDASH_FEATURE_SHOW_QUERY_RESULTS_COUNT", "true"))
FEATURE_ALLOW_CUSTOM_JS_VISUALIZATIONS = parse_boolean(

Some files were not shown because too many files have changed in this diff.