Mirror of https://github.com/getredash/redash.git
Synced 2025-12-19 17:37:19 -05:00

Compare commits: 25.08.0-de ... release/10 (9 commits)
Commits:

- 2589bef1f2
- 1c5ceecd50
- 41f948201a
- 9c928bd1d3
- f312adf77b
- 92e5d78dde
- 0983e6926f
- dec88799ab
- 64a1d7a6cd
```diff
@@ -13,6 +13,7 @@ services:
       REDASH_LOG_LEVEL: "INFO"
       REDASH_REDIS_URL: "redis://redis:6379/0"
       REDASH_DATABASE_URL: "postgresql://postgres@postgres/postgres"
+      REDASH_COOKIE_SECRET: "2H9gNG9obnAQ9qnR9BDTQUph6CbXKCzF"
   redis:
     image: redis:3.0-alpine
     restart: unless-stopped
```
```diff
@@ -12,6 +12,7 @@ x-redash-environment: &redash-environment
   REDASH_DATABASE_URL: "postgresql://postgres@postgres/postgres"
   REDASH_RATELIMIT_ENABLED: "false"
   REDASH_ENFORCE_CSRF: "true"
+  REDASH_COOKIE_SECRET: "2H9gNG9obnAQ9qnR9BDTQUph6CbXKCzF"
 services:
   server:
     <<: *redash-service
```
CHANGELOG.md (+147 lines)
@@ -1,5 +1,152 @@

# Change Log

## V10.1.0 - 2021-11-23

This release includes patches for three security vulnerabilities:

- Insecure default configuration affects installations where REDASH_COOKIE_SECRET is not set explicitly (CVE-2021-41192)
- SSRF vulnerability affects installations that enabled URL-loading data sources (CVE-2021-43780)
- Incorrect usage of state parameter in OAuth client code affects installations where Google Login is enabled (CVE-2021-43777)

And a couple of features that didn't merge in time for 10.0.0:

- Big Query: Speed up schema loading (#5632)
- Add support for Firebolt data source (#5606)
- Fix: Loading schema for Sqlite DB with "Order" column name fails (#5623)

## v10.0.0 - 2021-10-01

A few changes were merged during the V10 beta period.

- New Data Source: CSV/Excel Files
- Fix: Edit Source button disappeared for users without CanEdit permissions
- We pinned our Docker base image to python:3.7-slim-buster to avoid build issues
- Fix: dashboard list pagination didn't work

## v10.0.0-beta - 2021-06-16

Just over a year since our last release, the V10 beta is ready. Since we never made a non-beta release of V9, we expect many users will upgrade directly from V8 -> V10. This will bring a lot of exciting features. Please check out the V9 beta release notes below to learn more.

This V10 beta incorporates fixes for the feedback we received on the V9 beta along with a few long-requested features (horizontal bar charts!) and other changes to improve UX and reliability.

This release was made possible by contributions from 35+ people (the GitHub API didn't let us pull handles this time around): Alex Kovar, Alexander Rusanov, Arik Fraimovich, Ben Amor, Christopher Grant, Đặng Minh Dũng, Daniel Lang, deecay, Elad Ossadon, Gabriel Dutra, iwakiriK, Jannis Leidel, Jerry, Jesse Whitehouse, Jiajie Zhong, Jim Sparkman, Jonathan Hult, Josh Bohde, Justin Talbot, koooge, Lei Ni, Levko Kravets, Lingkai Kong, max-voronov, Mike Nason, Nolan Nichols, Omer Lachish, Patrick Yang, peterlee, Rafael Wendel, Sebastian Tramp, simonschneider-db, Tim Gates, Tobias Macey, Vipul Mathur, and Vladislav Denisov

Our special thanks to [Sohail Ahmed](https://pk.linkedin.com/in/sohail-ahmed-755776184) for reporting a vulnerability in our "forgot password" page (#5425)

### Upgrading

(This section is duplicated from the previous release, since many users will upgrade directly from V8 -> V10.)

Typically, if you are running your own instance of Redash and wish to upgrade, you would simply modify the Docker tag in your `docker-compose.yml` file. Since RQ has replaced Celery in this version, there are a couple of extra modifications that need to be made in your `docker-compose.yml`:

1. Under `services/scheduler/environment`, omit `QUEUES` and `WORKERS_COUNT` (and omit `environment` altogether if it is empty).
2. Under `services`, add a new service for general RQ jobs:

```yaml
worker:
  <<: *redash-service
  command: worker
  environment:
    QUEUES: "periodic emails default"
    WORKERS_COUNT: 1
```

Following that, force a recreation of your containers with `docker-compose up --force-recreate --build` and you should be good to go.

### UX

- Redash now uses a vertical navbar
- Dashboard list now includes "My Dashboards" filter
- Dashboard parameters can now be re-ordered
- Queries can now be executed with Shift + Enter on all platforms
- Added New Dashboard/Query/Alert buttons to corresponding list pages
- Dashboard text widgets now prompt to confirm before closing the text editor
- A plus sign is now shown between tags used for search
- On the queries list view "My Queries" has moved above "Archived"
- Improved behavior for filtering by tags in list views
- When a user's session expires for inactivity, they are prompted to log in with a pop-up so they don't lose their place in the app
- Numerous accessibility changes towards the a11y standard
- Hide the "Create" menu button if the current user doesn't have permission to any data sources

### Visualizations

- Feature: Added support for horizontal box plots
- Feature: Added support for horizontal bar charts
- Feature: Added "Reverse" option for Chart visualization legend
- Feature: Added option to align Chart Y-axes at zero
- Feature: The table visualization header is now fixed when scrolling
- Feature: Added USA map to choropleth visualization
- Fix: Selected filters were reset when switching visualizations
- Fix: Stacked bar chart showed the wrong Y-axis range in some cases
- Fix: Bar chart with second y axis overlapped data series
- Fix: Y-axis autoscale failed when min or max was set
- Fix: Custom JS visualization was broken because of a typo
- Fix: Too large visualization caused filters block to collapse
- Fix: Sankey visualization looked inconsistent if the data source returned VARCHAR instead of numeric types

### Structural Updates

- Redash now prevents CSRF attacks
- Migration to TypeScript
- Upgrade to Antd version 4

### Data Sources

- New Data Sources: SPARQL Endpoint, Eccenca Corporate Memory, TrinoDB
- Databricks:
  - Custom Schema Browser that allows switching between databases
  - Option added to truncate large results
  - Support for multiple-statement queries
  - Schema browser can now use eventlet instead of RQ
- MongoDB:
  - Moved Username and Password out of the connection string so that the password can be stored secretly
- Oracle:
  - Fix: Annotated queries always failed. Annotation is now disabled
- Postgres/CockroachDB:
  - SSL certfile/keyfile fields are now handled as secret
- Python:
  - Feature: Custom built-ins are now supported
  - Fix: Query runner was not compatible with Python 3
- Snowflake:
  - Data source now accepts a custom host address (for use with proxies)
- TreasureData:
  - API key field is now handled as secret
- Yandex:
  - OAuth token field is now handled as secret

### Alerts

- Feature: Added ability to mute alerts without deleting them
- Change: Non-email alert destination details are now obfuscated to avoid leaking sensitive information (webhook URLs, tokens etc.)
- Fix: numerical comparisons failed if the value from the query was a string

### Parameters

- Added "Last 12 months" option for dynamic date ranges

### Bug Fixes

- Fix: Private addresses were not allowed even when enforcing was disabled
- Fix: Python query runner wasn't updated for Python 3
- Fix: Sorting queries by schedule returned the wrong order
- Fix: Counter visualization was enormous in some cases
- Fix: Dashboard URL will now change when the dashboard title changes
- Fix: URL parameters were removed when forking a query
- Fix: Create link on data sources page was broken
- Fix: Queries could be reassigned to read-only data sources
- Fix: Multi-select dropdown was very slow if there were 1k+ options
- Fix: Search Input couldn't be focused or updated while editing a dashboard
- Fix: The CLI command for "status" did not work
- Fix: The dashboard list screen displayed too few items under certain pagination configurations

### Other

- Added an environment variable to disable public sharing links for queries and dashboards
- Alert destinations are now encrypted in the database
- The base query runner now has stubs to implement result truncating for other data sources
- Static SAML configuration and assertion encryption are now supported
- Added a new component for adding extra actions to the query and dashboard pages
- Non-admins with at least view_only permission on a dashboard can now make GET requests to the data source resource
- Added a BLOCKED_DOMAINS setting to prevent sign-ups from emails at specific domains
- Added a rate limit to the "forgot password" page
- RQ workers will now shut down gracefully for known error codes
- Scheduled execution failure counter now resets following a successful ad hoc execution
- Redash now deletes locks for cancelled queries
- Upgraded Ace Editor from v6 to v9
- Added a periodic job to remove ghost locks
- Removed content width limit on all pages
- Introduced a <Link> React component

## v9.0.0-beta - 2020-06-11

This release was a long time in the making and has several major changes:
```diff
@@ -22,7 +22,8 @@ RUN if [ "x$skip_frontend_build" = "x" ] ; then npm ci --unsafe-perm; fi
 COPY --chown=redash client /frontend/client
 COPY --chown=redash webpack.config.js /frontend/
 RUN if [ "x$skip_frontend_build" = "x" ] ; then npm run build; else mkdir -p /frontend/client/dist && touch /frontend/client/dist/multi_org.html && touch /frontend/client/dist/index.html; fi
-FROM python:3.7-slim
+FROM python:3.7-slim-buster
+
 EXPOSE 5000
```
```diff
@@ -43,6 +43,7 @@ Redash supports more than 35 SQL and NoSQL [data sources](https://redash.io/help
 - DB2 by IBM
 - Druid
 - Elasticsearch
+- Firebolt
 - Google Analytics
 - Google BigQuery
 - Google Spreadsheets
```
New binary files (not shown):

- client/app/assets/images/db-logos/excel.png (3.6 KiB)
- client/app/assets/images/db-logos/firebolt.png (12 KiB)
```diff
@@ -179,7 +179,7 @@ export default function QueryPageHeader({
       {!queryFlags.isNew && queryFlags.canViewSource && (
         <span>
-          {!sourceMode && queryFlags.canEdit && (
+          {!sourceMode && (
             <Link.Button className="m-r-5" href={query.getUrl(true, selectedVisualization)}>
               <i className="fa fa-pencil-square-o" aria-hidden="true" />
               <span className="m-l-5">Edit Source</span>
```
```diff
@@ -8,6 +8,8 @@ x-redash-service: &redash-service
       skip_frontend_build: "true"
   volumes:
     - .:/app
+  env_file:
+    - .env
 x-redash-environment: &redash-environment
   REDASH_LOG_LEVEL: "INFO"
   REDASH_REDIS_URL: "redis://redis:6379/0"
@@ -16,6 +18,7 @@ x-redash-environment: &redash-environment
   REDASH_MAIL_DEFAULT_SENDER: "redash@example.com"
   REDASH_MAIL_SERVER: "email"
   REDASH_ENFORCE_CSRF: "true"
+  # Set secret keys in the .env file
 services:
   server:
     <<: *redash-service
```
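The development compose file now sources secrets from a local `.env` file instead of hardcoding them. As a hedged illustration, the `python-dotenv` package pinned in the requirements diff further down can load that same file for processes running outside Docker; the `.env` contents here are assumed, not from this comparison:

```python
# Sketch, not Redash code: load the .env file the compose service points at.
# Assumes a local .env containing e.g. REDASH_COOKIE_SECRET=<value>.
import os

from dotenv import load_dotenv  # python-dotenv, added to requirements.txt below

load_dotenv()  # reads .env from the current directory into os.environ
print("cookie secret set:", "REDASH_COOKIE_SECRET" in os.environ)
```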
package-lock.json (generated, 2 lines changed)

```diff
@@ -1,6 +1,6 @@
 {
   "name": "redash-client",
-  "version": "9.0.0-beta",
+  "version": "10.1.0",
   "lockfileVersion": 1,
   "requires": true,
   "dependencies": {
```

```diff
@@ -1,6 +1,6 @@
 {
   "name": "redash-client",
-  "version": "9.0.0-beta",
+  "version": "10.1.0",
   "description": "The frontend part of Redash.",
   "main": "index.js",
   "scripts": {
```
```diff
@@ -15,7 +15,7 @@ from .app import create_app  # noqa
 from .query_runner import import_query_runners
 from .destinations import import_destinations
 
-__version__ = "9.0.0-beta"
+__version__ = "10.1.0"
 
 
 if os.environ.get("REMOTE_DEBUG"):
```
```diff
@@ -243,12 +243,13 @@ def logout_and_redirect_to_index():
 
 def init_app(app):
     from redash.authentication import (
-        google_oauth,
         saml_auth,
         remote_user_auth,
         ldap_auth,
     )
 
+    from redash.authentication.google_oauth import create_google_oauth_blueprint
+
     login_manager.init_app(app)
     login_manager.anonymous_user = models.AnonymousUser
     login_manager.REMEMBER_COOKIE_DURATION = settings.REMEMBER_COOKIE_DURATION
@@ -259,8 +260,9 @@ def init_app(app):
     app.permanent_session_lifetime = timedelta(seconds=settings.SESSION_EXPIRY_TIME)
 
     from redash.security import csrf
-    for auth in [google_oauth, saml_auth, remote_user_auth, ldap_auth]:
-        blueprint = auth.blueprint
+    # Authlib's flask oauth client requires a Flask app to initialize
+    for blueprint in [create_google_oauth_blueprint(app), saml_auth.blueprint, remote_user_auth.blueprint, ldap_auth.blueprint, ]:
         csrf.exempt(blueprint)
         app.register_blueprint(blueprint)
```
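The Google login blueprint becomes a factory here because Authlib's Flask integration binds its OAuth registry to a concrete app object, unlike the module-level client flask-oauthlib allowed. A minimal standalone sketch of that pattern, with placeholder credentials (Authlib resolves `GOOGLE_CLIENT_ID`/`GOOGLE_CLIENT_SECRET` from the Flask config for a client named "google"):

```python
# Minimal sketch of the Authlib setup used in google_oauth.py below.
# The credential values are placeholders, not Redash configuration.
from authlib.integrations.flask_client import OAuth
from flask import Flask

app = Flask(__name__)
app.config["GOOGLE_CLIENT_ID"] = "placeholder"
app.config["GOOGLE_CLIENT_SECRET"] = "placeholder"

oauth = OAuth(app)  # the registry needs the app, hence the factory function
oauth.register(
    name="google",
    server_metadata_url="https://accounts.google.com/.well-known/openid-configuration",
    client_kwargs={"scope": "openid email profile"},
)
```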
```diff
@@ -1,7 +1,7 @@
 import logging
 import requests
 from flask import redirect, url_for, Blueprint, flash, request, session
-from flask_oauthlib.client import OAuth
 
 from redash import models, settings
 from redash.authentication import (
@@ -11,42 +11,7 @@ from redash.authentication import (
 )
 from redash.authentication.org_resolving import current_org
 
-logger = logging.getLogger("google_oauth")
-
-oauth = OAuth()
-blueprint = Blueprint("google_oauth", __name__)
-
-
-def google_remote_app():
-    if "google" not in oauth.remote_apps:
-        oauth.remote_app(
-            "google",
-            base_url="https://www.google.com/accounts/",
-            authorize_url="https://accounts.google.com/o/oauth2/auth?prompt=select_account+consent",
-            request_token_url=None,
-            request_token_params={
-                "scope": "https://www.googleapis.com/auth/userinfo.email https://www.googleapis.com/auth/userinfo.profile"
-            },
-            access_token_url="https://accounts.google.com/o/oauth2/token",
-            access_token_method="POST",
-            consumer_key=settings.GOOGLE_CLIENT_ID,
-            consumer_secret=settings.GOOGLE_CLIENT_SECRET,
-        )
-
-    return oauth.google
-
-
-def get_user_profile(access_token):
-    headers = {"Authorization": "OAuth {}".format(access_token)}
-    response = requests.get(
-        "https://www.googleapis.com/oauth2/v1/userinfo", headers=headers
-    )
-
-    if response.status_code == 401:
-        logger.warning("Failed getting user profile (response code 401).")
-        return None
-
-    return response.json()
-
+from authlib.integrations.flask_client import OAuth
 
 def verify_profile(org, profile):
@@ -65,26 +30,62 @@ def verify_profile(org, profile):
     return False
 
 
-@blueprint.route("/<org_slug>/oauth/google", endpoint="authorize_org")
-def org_login(org_slug):
+def create_google_oauth_blueprint(app):
+    oauth = OAuth(app)
+
+    logger = logging.getLogger("google_oauth")
+    blueprint = Blueprint("google_oauth", __name__)
+
+    CONF_URL = "https://accounts.google.com/.well-known/openid-configuration"
+    oauth = OAuth(app)
+    oauth.register(
+        name="google",
+        server_metadata_url=CONF_URL,
+        client_kwargs={"scope": "openid email profile"},
+    )
+
+    def get_user_profile(access_token):
+        headers = {"Authorization": "OAuth {}".format(access_token)}
+        response = requests.get(
+            "https://www.googleapis.com/oauth2/v1/userinfo", headers=headers
+        )
+
+        if response.status_code == 401:
+            logger.warning("Failed getting user profile (response code 401).")
+            return None
+
+        return response.json()
+
+    @blueprint.route("/<org_slug>/oauth/google", endpoint="authorize_org")
+    def org_login(org_slug):
         session["org_slug"] = current_org.slug
         return redirect(url_for(".authorize", next=request.args.get("next", None)))
 
-@blueprint.route("/oauth/google", endpoint="authorize")
-def login():
-    callback = url_for(".callback", _external=True)
+    @blueprint.route("/oauth/google", endpoint="authorize")
+    def login():
+        redirect_uri = url_for(".callback", _external=True)
         next_path = request.args.get(
             "next", url_for("redash.index", org_slug=session.get("org_slug"))
         )
-    logger.debug("Callback url: %s", callback)
+        logger.debug("Callback url: %s", redirect_uri)
         logger.debug("Next is: %s", next_path)
-    return google_remote_app().authorize(callback=callback, state=next_path)
+
+        session["next_url"] = next_path
+
+        return oauth.google.authorize_redirect(redirect_uri)
 
-@blueprint.route("/oauth/google_callback", endpoint="callback")
-def authorized():
-    resp = google_remote_app().authorized_response()
+    @blueprint.route("/oauth/google_callback", endpoint="callback")
+    def authorized():
+        logger.debug("Authorized user inbound")
+
+        resp = oauth.google.authorize_access_token()
+        user = resp.get("userinfo")
+        if user:
+            session["user"] = user
+
         access_token = resp["access_token"]
 
         if access_token is None:
@@ -108,17 +109,23 @@ def authorized():
             profile["email"],
             org,
         )
-        flash("Your Google Apps account ({}) isn't allowed.".format(profile["email"]))
+        flash(
+            "Your Google Apps account ({}) isn't allowed.".format(profile["email"])
+        )
         return redirect(url_for("redash.login", org_slug=org.slug))
 
     picture_url = "%s?sz=40" % profile["picture"]
-    user = create_and_login_user(org, profile["name"], profile["email"], picture_url)
+    user = create_and_login_user(
+        org, profile["name"], profile["email"], picture_url
+    )
     if user is None:
         return logout_and_redirect_to_index()
 
-    unsafe_next_path = request.args.get("state") or url_for(
+    unsafe_next_path = session.get("next_url") or url_for(
         "redash.index", org_slug=org.slug
     )
     next_path = get_next_path(unsafe_next_path)
 
     return redirect(next_path)
+
+    return blueprint
```
```diff
@@ -1120,7 +1120,7 @@ class Dashboard(ChangeTrackingMixin, TimestampMixin, BelongsToOrgMixin, db.Model
                 joinedload(Dashboard.user).load_only(
                     "id", "name", "_profile_image_url", "email"
                 )
-            )
+            ).distinct(Dashboard.created_at, Dashboard.slug)
            .outerjoin(Widget)
            .outerjoin(Visualization)
            .outerjoin(Query)
```
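This is the dashboard-list pagination fix (issue #5466, covered by the new test near the end of this comparison): the three outer joins fan each dashboard out into one row per widget, so a LIMIT/OFFSET page of joined rows can contain fewer distinct dashboards than the page size. A plain-Python illustration of the effect, with hypothetical data rather than Redash code:

```python
# Hypothetical joined result: dashboard "dash-a" owns two widgets,
# so it appears twice before any de-duplication.
joined_rows = [("dash-a", "widget-1"), ("dash-a", "widget-2"), ("dash-b", "widget-3")]

page = joined_rows[:2]                       # LIMIT 2 over the raw join
distinct_on_page = {dash for dash, _ in page}
print(len(page), len(distinct_on_page))      # 2 rows, but only 1 dashboard
```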
```diff
@@ -13,7 +13,8 @@ from redash import settings, utils
 from redash.utils import json_loads, query_is_select_no_limit, add_limit_to_query
 from rq.timeouts import JobTimeoutException
 
-from redash.utils.requests_session import requests, requests_session
+from redash.utils.requests_session import requests_or_advocate, requests_session, UnacceptableAddressException
+
 
 logger = logging.getLogger(__name__)
 
@@ -236,12 +237,6 @@ class BaseSQLQueryRunner(BaseQueryRunner):
         return query_text
 
 
-def is_private_address(url):
-    hostname = urlparse(url).hostname
-    ip_address = socket.gethostbyname(hostname)
-    return ipaddress.ip_address(text_type(ip_address)).is_private
-
-
 class BaseHTTPQueryRunner(BaseQueryRunner):
     should_annotate_query = False
     response_error = "Endpoint returned unexpected status code"
@@ -285,8 +280,6 @@ class BaseHTTPQueryRunner(BaseQueryRunner):
         return None
 
     def get_response(self, url, auth=None, http_method="get", **kwargs):
-        if is_private_address(url) and settings.ENFORCE_PRIVATE_ADDRESS_BLOCK:
-            raise Exception("Can't query private addresses.")
-
         # Get authentication values if not given
         if auth is None:
@@ -307,12 +300,15 @@ class BaseHTTPQueryRunner(BaseQueryRunner):
             if response.status_code != 200:
                 error = "{} ({}).".format(self.response_error, response.status_code)
 
-        except requests.HTTPError as exc:
+        except requests_or_advocate.HTTPError as exc:
             logger.exception(exc)
             error = "Failed to execute query. " "Return Code: {} Reason: {}".format(
                 response.status_code, response.text
             )
-        except requests.RequestException as exc:
+        except UnacceptableAddressException as exc:
+            logger.exception(exc)
+            error = "Can't query private addresses."
+        except requests_or_advocate.RequestException as exc:
             # Catch all other requests exceptions and return the error.
             logger.exception(exc)
             error = str(exc)
```
```diff
@@ -268,41 +268,33 @@ class BigQuery(BaseQueryRunner):
         service = self._get_bigquery_service()
         project_id = self._get_project_id()
         datasets = service.datasets().list(projectId=project_id).execute()
-        schema = []
+
+        query_base = """
+        SELECT table_schema, table_name, column_name
+        FROM `{dataset_id}`.INFORMATION_SCHEMA.COLUMNS
+        WHERE table_schema NOT IN ('information_schema')
+        """
+
+        schema = {}
+        queries = []
         for dataset in datasets.get("datasets", []):
             dataset_id = dataset["datasetReference"]["datasetId"]
-            tables = (
-                service.tables()
-                .list(projectId=project_id, datasetId=dataset_id)
-                .execute()
-            )
-            while True:
-                for table in tables.get("tables", []):
-                    table_data = (
-                        service.tables()
-                        .get(
-                            projectId=project_id,
-                            datasetId=dataset_id,
-                            tableId=table["tableReference"]["tableId"],
-                        )
-                        .execute()
-                    )
-                    table_schema = self._get_columns_schema(table_data)
-                    schema.append(table_schema)
-
-                next_token = tables.get("nextPageToken", None)
-                if next_token is None:
-                    break
-
-                tables = (
-                    service.tables()
-                    .list(
-                        projectId=project_id, datasetId=dataset_id, pageToken=next_token
-                    )
-                    .execute()
-                )
+            query = query_base.format(dataset_id=dataset_id)
+            queries.append(query)
 
-        return schema
+        query = '\nUNION ALL\n'.join(queries)
+        results, error = self.run_query(query, None)
+        if error is not None:
+            raise Exception("Failed getting schema.")
+
+        results = json_loads(results)
+        for row in results["rows"]:
+            table_name = "{0}.{1}".format(row["table_schema"], row["table_name"])
+            if table_name not in schema:
+                schema[table_name] = {"name": table_name, "columns": []}
+            schema[table_name]["columns"].append(row["column_name"])
+
+        return list(schema.values())
 
     def run_query(self, query, user):
         logger.debug("BigQuery got query: %s", query)
```
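The speedup (#5632) comes from replacing one `tables().get()` API round-trip per table with a single SQL statement over each dataset's `INFORMATION_SCHEMA.COLUMNS` view, stitched together with `UNION ALL`. A standalone sketch of the statement this builds, using two hypothetical dataset IDs:

```python
# Reproduces the UNION ALL statement the new get_schema sends to BigQuery;
# "sales" and "ops" are hypothetical dataset IDs, not from this release.
query_base = """
SELECT table_schema, table_name, column_name
FROM `{dataset_id}`.INFORMATION_SCHEMA.COLUMNS
WHERE table_schema NOT IN ('information_schema')
"""

print("\nUNION ALL\n".join(query_base.format(dataset_id=d) for d in ("sales", "ops")))
```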
redash/query_runner/csv.py (new file, 100 lines)

```python
import logging
import yaml
import io

from redash.utils.requests_session import requests_or_advocate, UnacceptableAddressException

from redash.query_runner import *
from redash.utils import json_dumps

logger = logging.getLogger(__name__)

try:
    import pandas as pd
    import numpy as np
    enabled = True
except ImportError:
    enabled = False


class CSV(BaseQueryRunner):
    should_annotate_query = False

    @classmethod
    def name(cls):
        return "CSV"

    @classmethod
    def enabled(cls):
        return enabled

    @classmethod
    def configuration_schema(cls):
        return {
            'type': 'object',
            'properties': {},
        }

    def __init__(self, configuration):
        super(CSV, self).__init__(configuration)
        self.syntax = "yaml"

    def test_connection(self):
        pass

    def run_query(self, query, user):
        path = ""
        ua = ""
        args = {}
        try:
            args = yaml.safe_load(query)
            path = args['url']
            args.pop('url', None)
            ua = args['user-agent']
            args.pop('user-agent', None)
        except:
            pass

        try:
            response = requests_or_advocate.get(url=path, headers={"User-agent": ua})
            workbook = pd.read_csv(io.BytesIO(response.content), sep=",", **args)

            df = workbook.copy()
            data = {'columns': [], 'rows': []}
            conversions = [
                {'pandas_type': np.integer, 'redash_type': 'integer',},
                {'pandas_type': np.inexact, 'redash_type': 'float',},
                {'pandas_type': np.datetime64, 'redash_type': 'datetime', 'to_redash': lambda x: x.strftime('%Y-%m-%d %H:%M:%S')},
                {'pandas_type': np.bool_, 'redash_type': 'boolean'},
                {'pandas_type': np.object, 'redash_type': 'string'}
            ]
            labels = []
            for dtype, label in zip(df.dtypes, df.columns):
                for conversion in conversions:
                    if issubclass(dtype.type, conversion['pandas_type']):
                        data['columns'].append({'name': label, 'friendly_name': label, 'type': conversion['redash_type']})
                        labels.append(label)
                        func = conversion.get('to_redash')
                        if func:
                            df[label] = df[label].apply(func)
                        break
            data['rows'] = df[labels].replace({np.nan: None}).to_dict(orient='records')

            json_data = json_dumps(data)
            error = None
        except KeyboardInterrupt:
            error = "Query cancelled by user."
            json_data = None
        except UnacceptableAddressException:
            error = "Can't query private addresses."
            json_data = None
        except Exception as e:
            error = "Error reading {0}. {1}".format(path, str(e))
            json_data = None

        return json_data, error

    def get_schema(self):
        raise NotSupported()


register(CSV)
```
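Both this CSV runner and the Excel runner below treat the query text as YAML: `url` and `user-agent` are popped out, and every remaining key is passed straight through to the corresponding pandas reader as a keyword argument. A minimal sketch of that contract; the URL and the extra `sep` option are illustrative values, not defaults:

```python
import yaml

# What a user would type into the Redash query editor for these runners.
query = """
url: "https://example.com/data.csv"
user-agent: "redash"
sep: ","
"""

args = yaml.safe_load(query)
path = args.pop("url")
ua = args.pop("user-agent")
print(path, ua, args)  # leftover keys (sep) become pandas.read_csv(**args) kwargs
```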
redash/query_runner/excel.py (new file, 97 lines)

```python
import logging
import yaml

from redash.utils.requests_session import requests_or_advocate, UnacceptableAddressException

from redash.query_runner import *
from redash.utils import json_dumps

logger = logging.getLogger(__name__)

try:
    import pandas as pd
    import xlrd
    import openpyxl
    import numpy as np
    enabled = True
except ImportError:
    enabled = False


class Excel(BaseQueryRunner):
    should_annotate_query = False

    @classmethod
    def enabled(cls):
        return enabled

    @classmethod
    def configuration_schema(cls):
        return {
            'type': 'object',
            'properties': {},
        }

    def __init__(self, configuration):
        super(Excel, self).__init__(configuration)
        self.syntax = "yaml"

    def test_connection(self):
        pass

    def run_query(self, query, user):
        path = ""
        ua = ""
        args = {}
        try:
            args = yaml.safe_load(query)
            path = args['url']
            args.pop('url', None)
            ua = args['user-agent']
            args.pop('user-agent', None)
        except:
            pass

        try:
            response = requests_or_advocate.get(url=path, headers={"User-agent": ua})
            workbook = pd.read_excel(response.content, **args)

            df = workbook.copy()
            data = {'columns': [], 'rows': []}
            conversions = [
                {'pandas_type': np.integer, 'redash_type': 'integer',},
                {'pandas_type': np.inexact, 'redash_type': 'float',},
                {'pandas_type': np.datetime64, 'redash_type': 'datetime', 'to_redash': lambda x: x.strftime('%Y-%m-%d %H:%M:%S')},
                {'pandas_type': np.bool_, 'redash_type': 'boolean'},
                {'pandas_type': np.object, 'redash_type': 'string'}
            ]
            labels = []
            for dtype, label in zip(df.dtypes, df.columns):
                for conversion in conversions:
                    if issubclass(dtype.type, conversion['pandas_type']):
                        data['columns'].append({'name': label, 'friendly_name': label, 'type': conversion['redash_type']})
                        labels.append(label)
                        func = conversion.get('to_redash')
                        if func:
                            df[label] = df[label].apply(func)
                        break
            data['rows'] = df[labels].replace({np.nan: None}).to_dict(orient='records')

            json_data = json_dumps(data)
            error = None
        except KeyboardInterrupt:
            error = "Query cancelled by user."
            json_data = None
        except UnacceptableAddressException:
            error = "Can't query private addresses."
            json_data = None
        except Exception as e:
            error = "Error reading {0}. {1}".format(path, str(e))
            json_data = None

        return json_data, error

    def get_schema(self):
        raise NotSupported()


register(Excel)
```
redash/query_runner/firebolt.py (new file, 94 lines)

```python
try:
    from firebolt_db.firebolt_connector import Connection
    enabled = True
except ImportError:
    enabled = False

from redash.query_runner import BaseQueryRunner, register
from redash.query_runner import TYPE_STRING, TYPE_INTEGER, TYPE_BOOLEAN
from redash.utils import json_dumps, json_loads

TYPES_MAP = {1: TYPE_STRING, 2: TYPE_INTEGER, 3: TYPE_BOOLEAN}


class Firebolt(BaseQueryRunner):
    noop_query = "SELECT 1"

    @classmethod
    def configuration_schema(cls):
        return {
            "type": "object",
            "properties": {
                "host": {"type": "string", "default": "localhost"},
                "port": {"type": "string", "default": 8123},
                "DB": {"type": "string"},
                "user": {"type": "string"},
                "password": {"type": "string"},
            },
            "order": ["host", "port", "user", "password", "DB"],
            "required": ["user", "password"],
            "secret": ["password"],
        }

    @classmethod
    def enabled(cls):
        return enabled

    def run_query(self, query, user):
        connection = Connection(
            host=self.configuration["host"],
            port=self.configuration["port"],
            username=(self.configuration.get("user") or None),
            password=(self.configuration.get("password") or None),
            db_name=(self.configuration.get("DB") or None),
        )

        cursor = connection.cursor()

        try:
            cursor.execute(query)
            columns = self.fetch_columns(
                [(i[0], TYPES_MAP.get(i[1], None)) for i in cursor.description]
            )
            rows = [
                dict(zip((column["name"] for column in columns), row)) for row in cursor
            ]

            data = {"columns": columns, "rows": rows}
            error = None
            json_data = json_dumps(data)
        finally:
            connection.close()

        return json_data, error

    def get_schema(self, get_stats=False):
        query = """
        SELECT TABLE_SCHEMA,
               TABLE_NAME,
               COLUMN_NAME
        FROM INFORMATION_SCHEMA.COLUMNS
        WHERE TABLE_SCHEMA <> 'INFORMATION_SCHEMA'
        """

        results, error = self.run_query(query, None)

        if error is not None:
            raise Exception("Failed getting schema.")

        schema = {}
        results = json_loads(results)

        for row in results["rows"]:
            table_name = "{}.{}".format(row["table_schema"], row["table_name"])

            if table_name not in schema:
                schema[table_name] = {"name": table_name, "columns": []}

            schema[table_name]["columns"].append(row["column_name"])

        return list(schema.values())


register(Firebolt)
```
```diff
@@ -2,7 +2,9 @@ import logging
 import yaml
 import datetime
 from funcy import compact, project
-from redash import settings
+
+from redash.utils.requests_session import requests_or_advocate, UnacceptableAddressException
+
 from redash.utils import json_dumps
 from redash.query_runner import (
     BaseHTTPQueryRunner,
@@ -12,7 +14,6 @@ from redash.query_runner import (
     TYPE_FLOAT,
     TYPE_INTEGER,
     TYPE_STRING,
-    is_private_address,
 )
@@ -163,8 +164,6 @@ class JSON(BaseHTTPQueryRunner):
         if "url" not in query:
             raise QueryParseError("Query must include 'url' option.")
 
-        if is_private_address(query["url"]) and settings.ENFORCE_PRIVATE_ADDRESS_BLOCK:
-            raise Exception("Can't query private addresses.")
-
         method = query.get("method", "get")
         request_options = project(query, ("params", "headers", "data", "auth", "json"))
```
```diff
@@ -29,7 +29,7 @@ class Sqlite(BaseSQLQueryRunner):
 
     def _get_tables(self, schema):
         query_table = "select tbl_name from sqlite_master where type='table'"
-        query_columns = "PRAGMA table_info(%s)"
+        query_columns = "PRAGMA table_info(\"%s\")"
 
         results, error = self.run_query(query_table, None)
```
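This is the fix for #5623: without quoting, a table named after a SQLite keyword such as `Order` turns the PRAGMA into a syntax error. A quick standalone demonstration of the difference:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE "Order" (id INTEGER)')

print(conn.execute('PRAGMA table_info("Order")').fetchall())  # quoted: works
try:
    conn.execute("PRAGMA table_info(Order)")                  # unquoted keyword
except sqlite3.OperationalError as exc:
    print("unquoted fails:", exc)
```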
```diff
@@ -64,7 +64,11 @@ INVITATION_TOKEN_MAX_AGE = int(
 )
 
 # The secret key to use in the Flask app for various cryptographic features
-SECRET_KEY = os.environ.get("REDASH_COOKIE_SECRET", "c292a0a3aa32397cdb050e233733900f")
+SECRET_KEY = os.environ.get("REDASH_COOKIE_SECRET")
+
+if SECRET_KEY is None:
+    raise Exception("You must set the REDASH_COOKIE_SECRET environment variable. Visit http://redash.io/help/open-source/admin-guide/secrets for more information.")
+
 # The secret key to use when encrypting data source options
 DATASOURCE_SECRET_KEY = os.environ.get("REDASH_SECRET_KEY", SECRET_KEY)
@@ -380,7 +384,10 @@ default_query_runners = [
     "redash.query_runner.cloudwatch",
     "redash.query_runner.cloudwatch_insights",
     "redash.query_runner.corporate_memory",
-    "redash.query_runner.sparql_endpoint"
+    "redash.query_runner.sparql_endpoint",
+    "redash.query_runner.excel",
+    "redash.query_runner.csv",
+    "redash.query_runner.firebolt"
 ]
 
 enabled_query_runners = array_from_string(
```
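This is the CVE-2021-41192 fix: the shared fallback secret is gone, and startup now fails unless `REDASH_COOKIE_SECRET` is set explicitly. One way (a suggestion, not from the release notes) to generate a suitable value with nothing but the standard library:

```python
# Generate a random value for REDASH_COOKIE_SECRET, e.g. to paste into .env.
import secrets

print("REDASH_COOKIE_SECRET=" + secrets.token_hex(32))
```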
```diff
@@ -1,8 +1,14 @@
-import requests
 from redash import settings
 
+from advocate.exceptions import UnacceptableAddressException
+
+if settings.ENFORCE_PRIVATE_ADDRESS_BLOCK:
+    import advocate as requests_or_advocate
+else:
+    import requests as requests_or_advocate
+
-class ConfiguredSession(requests.Session):
+
+class ConfiguredSession(requests_or_advocate.Session):
     def request(self, *args, **kwargs):
         if not settings.REQUESTS_ALLOW_REDIRECTS:
             kwargs.update({"allow_redirects": False})
```
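This alias is the heart of the SSRF fix (CVE-2021-43780): instead of the old DNS-lookup-based `is_private_address` check, advocate validates every address the request actually resolves to, while mirroring the requests API. A small standalone sketch of the behavior it enforces; the link-local metadata IP is just an example target:

```python
# Sketch: advocate refuses requests that resolve to private or link-local
# addresses, which plain requests would happily fetch.
import advocate
from advocate.exceptions import UnacceptableAddressException

try:
    advocate.get("http://169.254.169.254/latest/meta-data/")
except UnacceptableAddressException:
    print("blocked: target resolves to a non-public address")
```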
```diff
@@ -8,9 +8,6 @@ httplib2==0.14.0
 wtforms==2.2.1
 Flask-RESTful==0.3.7
 Flask-Login==0.4.1
-Flask-OAuthLib==0.9.5
-# pin this until https://github.com/lepture/flask-oauthlib/pull/388 is released
-requests-oauthlib>=0.6.2,<1.2.0
 Flask-SQLAlchemy==2.4.1
 Flask-Migrate==2.5.2
 flask-mail==0.9.1
@@ -42,6 +39,7 @@ jsonschema==3.1.1
 RestrictedPython==5.0
 pysaml2==6.1.0
 pycrypto==2.6.1
+python-dotenv==0.19.2
 funcy==1.13
 sentry-sdk>=0.14.3,<0.15.0
 semver==2.8.1
@@ -67,3 +65,5 @@ werkzeug==0.16.1
 # Uncomment the requirement for ldap3 if using ldap.
 # It is not included by default because of the GPL license conflict.
 # ldap3==2.2.4
+Authlib==0.15.5
+advocate==1.0.0
```
```diff
@@ -1,4 +1,5 @@
 google-api-python-client==1.7.11
+protobuf==3.17.3
 gspread==3.1.0
 impyla==0.16.0
 influxdb==5.2.3
@@ -37,3 +38,6 @@ python-rapidjson==0.8.0
 pyodbc==4.0.28
 trino~=0.305
 cmem-cmempy==21.2.3
+xlrd==2.0.1
+openpyxl==3.0.7
+firebolt-sqlalchemy
```
```diff
@@ -50,3 +50,32 @@ class TestDashboardsByUser(BaseTestCase):
         # not using self.assertIn/NotIn because otherwise this fails :O
         self.assertTrue(d in dashboards)
         self.assertFalse(d2 in dashboards)
+
+    def test_returns_correct_number_of_dashboards(self):
+        # Solving https://github.com/getredash/redash/issues/5466
+
+        usr = self.factory.create_user()
+
+        ds1 = self.factory.create_data_source()
+        ds2 = self.factory.create_data_source()
+
+        qry1 = self.factory.create_query(data_source=ds1, user=usr)
+        qry2 = self.factory.create_query(data_source=ds2, user=usr)
+
+        viz1 = self.factory.create_visualization(query_rel=qry1, )
+        viz2 = self.factory.create_visualization(query_rel=qry2, )
+
+        def create_dashboard():
+            dash = self.factory.create_dashboard(name="boy howdy", user=usr)
+            self.factory.create_widget(dashboard=dash, visualization=viz1)
+            self.factory.create_widget(dashboard=dash, visualization=viz2)
+
+            return dash
+
+        d1 = create_dashboard()
+        d2 = create_dashboard()
+
+        results = Dashboard.all(self.factory.org, usr.group_ids, usr.id)
+
+        self.assertEqual(2, results.count(), "The incorrect number of dashboards were returned")
```
```diff
@@ -1,7 +1,7 @@
 import mock
 from unittest import TestCase
 
-from redash.utils.requests_session import requests, ConfiguredSession
+from redash.utils.requests_session import requests_or_advocate, ConfiguredSession
 from redash.query_runner import BaseHTTPQueryRunner
@@ -84,7 +84,7 @@ class TestBaseHTTPQueryRunner(TestCase):
         mock_response = mock.Mock()
         mock_response.status_code = 500
         mock_response.text = "Server Error"
-        http_error = requests.HTTPError()
+        http_error = requests_or_advocate.HTTPError()
         mock_response.raise_for_status.side_effect = http_error
         mock_get.return_value = mock_response
@@ -101,7 +101,7 @@ class TestBaseHTTPQueryRunner(TestCase):
         mock_response.status_code = 500
         mock_response.text = "Server Error"
         exception_message = "Some requests exception"
-        requests_exception = requests.RequestException(exception_message)
+        requests_exception = requests_or_advocate.RequestException(exception_message)
         mock_response.raise_for_status.side_effect = requests_exception
         mock_get.return_value = mock_response
```