Compare commits


124 Commits

Author SHA1 Message Date
Arik Fraimovich
bac4025eff Merge pull request #882 from tsibley/new-data-source-via-cli
CLI: Restore ability to pass JSON options string for a new data source
2016-03-06 12:01:23 +02:00
Thomas Sibley
d07bf7e0aa CLI: Restore ability to pass JSON options string for a new data source
Commit "Encapsulate data source/query runner configuration in an
object." (ed99b84) accidentally removed that functionality by not
inflating the passed in JSON into a ConfigurationContainer object.

If you passed -o, this led to errors of the form:

    Traceback (most recent call last):
      ...
      File "/opt/redash/redash-git/redash/models.py", line 321, in db_value
	return value.to_json()
    AttributeError: 'unicode' object has no attribute 'to_json'
2016-03-04 09:22:42 -08:00
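For context, a minimal sketch of the kind of fix this commit describes: the JSON string passed with -o has to be inflated into a configuration object before it reaches the model, otherwise the plain string has no to_json(). The ConfigurationContainer API shown here is assumed and simplified; the real redash class differs in details.

    import json

    class ConfigurationContainer(object):
        """Simplified stand-in for redash's configuration wrapper (assumed API)."""
        def __init__(self, config):
            self._config = config

        @classmethod
        def from_json(cls, options_json):
            # Inflate the raw JSON string passed via -o into a container object.
            return cls(json.loads(options_json))

        def to_json(self):
            return json.dumps(self._config)

    def new_data_source(name, options_json):
        # Without this inflation step the model receives a plain unicode string,
        # and value.to_json() raises the AttributeError shown in the traceback above.
        options = ConfigurationContainer.from_json(options_json)
        return {"name": name, "options": options.to_json()}

    print(new_data_source("pg", '{"dbname": "postgres"}'))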
Arik Fraimovich
8f937b7a35 Merge pull request #850 from toyama0919/master
Kibana: add support for limiting # of results returned
2016-03-01 23:37:24 +02:00
Arik Fraimovich
8747d0e724 Merge pull request #872 from erans/master
MongoDB: support for count queries
2016-03-01 23:36:24 +02:00
Arik Fraimovich
4a280eea29 Merge pull request #877 from ink-adavison/ink-adavison-ubuntubasepathfix
Correct Ubuntu Bootstrap.sh to a working Base URL
2016-03-01 13:36:25 +02:00
ink-adavison
194e9f4d7e Correct Ubuntu Bootstrap.sh to a working Base URL
The redash/docker/setup/... path is returning 404, but redash/master/setup/... works
2016-03-01 11:17:11 +00:00
Arik Fraimovich
4c5d2f14bf Merge pull request #866 from jeffwidman/fix_end_of_file_spacing
Fix whitespace issues
2016-02-29 22:39:15 +02:00
Jeff Widman
b86cf6ea4d Check should be 'x not in y' rather than 'not x in y' 2016-02-29 12:34:50 -08:00
Jeff Widman
dd72faaa77 Fix docstring spacing per PEP 257 2016-02-29 12:34:50 -08:00
Jeff Widman
c1b33939d0 A few misc whitespaces fixes 2016-02-29 12:34:50 -08:00
Jeff Widman
1aad95986b Add spaces around arithmetic operators per PEP 8 2016-02-29 12:34:50 -08:00
Jeff Widman
80209defc9 Remove extraneous spaces at end of lines 2016-02-29 12:34:46 -08:00
Jeff Widman
c198d22691 Format files to end in a single newline per PEP 8 2016-02-29 12:00:03 -08:00
Arik Fraimovich
43ac5600e5 Merge pull request #873 from getredash/feature/print_layout
Feature: print layout for dashboards
2016-02-29 21:11:41 +02:00
Arik Fraimovich
863365a412 Feature: print layout for dashboards 2016-02-29 21:07:07 +02:00
Eran Sandler
50c6bca421 added support for count queries 2016-02-29 21:05:50 +02:00
Arik Fraimovich
30b97e37f0 Merge pull request #854 from erans/master
Minor fixes for MongoDB and Python query runners
2016-02-29 21:04:14 +02:00
Eran Sandler
7f96de8b22 updated reverted change 2016-02-29 21:02:05 +02:00
Arik Fraimovich
dec30549f6 Merge pull request #867 from jeffwidman/fix_weird_indenting
Fix non-standard indentation to conventional four spaces
2016-02-29 15:25:11 +02:00
Arik Fraimovich
1a9059f1cc Merge pull request #865 from jeffwidman/fix_flask_ext
Replace deprecated flask.ext.* with flask_*
2016-02-29 15:22:13 +02:00
Jeff Widman
5208abd072 Make lines indented by four spaces instead of three 2016-02-29 01:58:19 -08:00
Jeff Widman
0ccbb24b3f Fix non-standard indentation to conventional four spaces
Not sure what happened in this section of code, but it was incorrectly indented by two spaces rather than four in several places.
2016-02-29 01:09:43 -08:00
Jeff Widman
09ccec59f5 Replace deprecated flask.ext.* with flask_*
Importing flask extensions using flask.ext.* is deprecated in favor of flask_*
For background, see: https://github.com/mitsuhiko/flask/issues/1135
2016-02-29 00:39:50 -08:00
Arik Fraimovich
8688b1c432 Merge pull request #864 from getredash/fix/datasource_show
Fix: data source loaded without properties
2016-02-28 18:56:10 +02:00
Arik Fraimovich
d06d1ada28 Fix: data source loaded without properties 2016-02-28 18:54:20 +02:00
Arik Fraimovich
3328de3462 Merge pull request #863 from getredash/fix/filters_datetime
Fix: properly show date/time in filters
2016-02-28 11:06:02 +02:00
Arik Fraimovich
84f71d1837 Fix: properly show date/time in filters 2016-02-28 11:04:08 +02:00
Arik Fraimovich
f219d20299 Merge pull request #857 from jeffwidman/patch-1
Fix typo: completly => completely
2016-02-27 21:55:56 +02:00
Eran Sandler
bdd2e0c418 fixed the query runner actually running code 2016-02-26 09:59:49 +02:00
Jeff Widman
d0cdf53b33 Fix typo: completly => completely 2016-02-25 18:00:15 -08:00
Arik Fraimovich
27faf8f88a Merge pull request #849 from sortable/presto-column-names
Fix: Presto - deduplicate column names
2016-02-25 21:39:29 +02:00
Arik Fraimovich
caf0734bac Merge pull request #855 from sreynen/patch-2
Docs: document settings.py
2016-02-25 21:37:57 +02:00
Eran Sandler
5f501b9df6 added JSON serialization support for internal bson timestamp which sometimes gets returned in newer pymongo versions 2016-02-25 09:25:26 +02:00
Scott Reynen
caaf180d13 Create settings.rst
This is mostly just a template so far, as I'm only able to describe the settings I already understand, which isn't many.
2016-02-24 16:28:24 -07:00
Eran Sandler
68220a0d67 Added 'additionalModulesPaths' to the config, allowing modules to be imported from an external, verified path. You'd still need to whitelist the module name in 'allowedImportModules' 2016-02-25 00:35:00 +02:00
Colin Dellow
0ebb53994b presto: use the disambiguated column name 2016-02-24 15:40:22 -05:00
Colin Dellow
177b62ea40 presto: use existing disambiguation path 2016-02-24 15:23:50 -05:00
Arik Fraimovich
a26da3aed3 Merge pull request #846 from sreynen/patch-1
Add link to dev environment documentation.
2016-02-24 22:07:35 +02:00
Arik Fraimovich
3a27955d24 Merge pull request #853 from getredash/arikfr-patch-1
Fix #851: embed doesn't load due to missing module.
2016-02-24 21:59:29 +02:00
Arik Fraimovich
86f2a0172f Fix #851: embed doesn't load due to missing module. 2016-02-24 21:53:12 +02:00
Arik Fraimovich
db59b34bda Merge pull request #852 from hudl/ConfigurationFixes
Small fixes for new ConfigurationContainer use
2016-02-24 21:49:57 +02:00
Alex DeBrie
51e92e0c71 Small fixes for new ConfigurationContainer use 2016-02-24 17:52:49 +00:00
toyama0919
246ce10a7f fix bug: Kibana limit not working; all results were selected. 2016-02-24 13:02:28 +09:00
Colin Dellow
cde54cec8b presto: don't merge columns with the same name
Fixes #847
2016-02-23 18:40:49 -05:00
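For illustration, one common way to disambiguate duplicate column names is to append a numeric suffix to repeats. This sketch is not necessarily the scheme the PR uses (the follow-up commits reuse Presto's existing disambiguation path):

    def deduplicate_column_names(names):
        # Append a numeric suffix to repeated names so no column is merged away.
        seen = {}
        result = []
        for name in names:
            if name in seen:
                seen[name] += 1
                result.append("{}_{}".format(name, seen[name]))
            else:
                seen[name] = 0
                result.append(name)
        return result

    print(deduplicate_column_names(["id", "name", "id"]))  # ['id', 'name', 'id_1']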
Scott Reynen
21dc36b506 Add link to dev environment documentation. 2016-02-23 14:16:47 -07:00
Arik Fraimovich
d74442184e Merge pull request #844 from getredash/kms
Encapsulate data sources configuration logic in an object
2016-02-23 15:06:29 +02:00
Arik Fraimovich
db3e689e68 Update query runners for new config class 2016-02-23 15:02:49 +02:00
Arik Fraimovich
491e2e10d1 Fix test 2016-02-23 15:02:49 +02:00
Arik Fraimovich
ed99b8452c Encapsulate data source/query runner configuration in an object.
This is a step towards adding more complex logic in configuration
handling, like encryption of secrets.
2016-02-23 15:02:49 +02:00
Arik Fraimovich
f1e90fde31 Merge pull request #843 from getredash/design/download_links
Fix: dashboard query results links broken
2016-02-23 11:35:06 +02:00
Arik Fraimovich
954e63a41f Fix: dashboard query results links broken 2016-02-23 11:33:13 +02:00
Arik Fraimovich
6ec4c4c19c Merge pull request #840 from getredash/design/download_links
Improve layout of download links
2016-02-22 11:53:43 +02:00
Arik Fraimovich
553c6ac8d7 Improve layout of download links 2016-02-22 11:50:42 +02:00
Arik Fraimovich
b462869be7 Merge pull request #833 from toyama0919/feature/download-excel-from-dashboard
Feature: download Excel file link from dashboard.
2016-02-22 10:55:20 +02:00
Arik Fraimovich
3a5d59cf69 Merge pull request #839 from getredash/feature/api_params
Feature: add API to trigger query refresh and support for parameters.
2016-02-22 10:43:42 +02:00
Arik Fraimovich
c12b059d10 Add API to trigger query refresh and support for parameters. 2016-02-22 10:40:46 +02:00
Arik Fraimovich
e705ede3b7 Merge pull request #838 from erans/master
Python query runner -- added access to sorted and reversed functions
2016-02-21 13:03:05 +02:00
Eran Sandler
3b5aafa8e1 Added access to sorted and reversed functions 2016-02-21 11:13:46 +02:00
Arik Fraimovich
2440a83e46 Merge pull request #835 from benmanns/ubuntu-bootstrap-update-reorder
Ubuntu bootstrap script - move update before upgrade
2016-02-18 10:56:00 +02:00
Benjamin Manns
2b5a36cb3f Move update before upgrade
Running update before upgrade will fetch the latest sources, so we
can be sure that the upgrades will bring the box to the latest
versions of everything. Otherwise, this is often a no-op because
the box's sources will be cached at time of generation, meaning
there is nothing to upgrade.
2016-02-17 16:11:16 -05:00
toyama0919
78511fd0ce add feature, Excel download from dashboard. 2016-02-17 14:48:23 +09:00
Arik Fraimovich
a50ae19236 Merge pull request #823 from mobiledefense/add-widescreen-toggle
Feature: Button toggle to display dashboard at full screen width
2016-02-16 10:23:16 +02:00
Arik Fraimovich
65f81c4d93 Merge pull request #831 from ninneko/801-download-excel
Feature: download results in Excel (XLSX) format (closes #801)
2016-02-16 09:19:53 +02:00
yohei.naruse
0afca7321a #801 fix test case.
schedule = "{:02d}:00".format(now.hour - 3) may be a negative value when now.hour < 3.
I've fixed it.
2016-02-16 13:16:31 +09:00
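A minimal illustration of the problem, and one way to make the value wrap around midnight (the actual fix in the PR may differ):

    from datetime import datetime

    now = datetime(2016, 2, 16, 1, 0)                # now.hour == 1
    broken = "{:02d}:00".format(now.hour - 3)        # "-2:00" -- not a valid HH:MM schedule
    fixed = "{:02d}:00".format((now.hour - 3) % 24)  # "22:00" -- wraps around midnight
    print(broken)
    print(fixed)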
yohei.naruse
32824f7575 apply reviews 2016-02-16 10:56:06 +09:00
yohei.naruse
2f16c8ae5f #801 Download DataSheets as Excel file 2016-02-15 23:47:31 +09:00
Arik Fraimovich
868263315b Merge pull request #829 from getredash/fix/embed
Fix: Plot.ly was given wrong timestamp
2016-02-14 20:40:43 +02:00
Arik Fraimovich
1ceddc9e91 Fix: we were sending wrong timestamp to Plotly 2016-02-14 20:38:32 +02:00
Arik Fraimovich
a96d135a4f Merge pull request #828 from getredash/fix/embed
Fix #797: user redirected to homepage when changing permission type
2016-02-14 20:11:51 +02:00
Arik Fraimovich
cec4e71d99 Fix #797: user redirected to homepage when changing permission type 2016-02-14 16:01:36 +02:00
Arik Fraimovich
0730ed8ed4 Merge pull request #827 from getredash/fix/embed
Feature: pivot tables are now regular visualizations that can be *saved*
2016-02-14 15:21:45 +02:00
Arik Fraimovich
e3420acd4b Feature: pivots are now regular visualizations that can be *saved*. 2016-02-14 15:17:52 +02:00
Arik Fraimovich
d21e2a79cc Close #772: upgrade to latest PivotTable.js lib 2016-02-14 14:40:01 +02:00
Arik Fraimovich
d1cf376ab3 Merge pull request #826 from getredash/fix/embed
Fix #802: switching to/from query source view resets chart colors
2016-02-14 14:13:24 +02:00
Arik Fraimovich
0ea0ba3fbe Fix #802: switching to/from query source view resets chart colors 2016-02-14 13:52:53 +02:00
Arik Fraimovich
0c93fe12ba Amend jshint settings 2016-02-14 13:51:32 +02:00
Arik Fraimovich
dad7b22cba Merge pull request #825 from getredash/fix/embed
Fix: sorting X values in charts had no effect
2016-02-14 13:26:32 +02:00
Arik Fraimovich
19766cf4ce Fix: sorting X values had no effect. 2016-02-14 13:24:14 +02:00
Arik Fraimovich
5e2727cfdf Use unminified d3.js in development. 2016-02-14 12:25:16 +02:00
Arik Fraimovich
3da326009b Bump Plot.ly version to a more recent one. 2016-02-14 12:24:56 +02:00
Matt Sochor
240739a445 Add dashboard toggle to display dashboard at full screen width 2016-02-11 16:39:40 -05:00
Arik Fraimovich
253c4fd0a6 Merge pull request #821 from getredash/fix/embed
Fix embed URL & move logic into a directive
2016-02-10 15:41:05 +02:00
Arik Fraimovich
cda1068ff1 Show logo in embeds 2016-02-10 15:39:02 +02:00
Arik Fraimovich
eb324a4067 Limit the amount of information we return for embeds 2016-02-10 15:34:48 +02:00
Arik Fraimovich
8cf7314dc0 Fix embed URL & move logic into a directive 2016-02-10 15:34:31 +02:00
Arik Fraimovich
32b928d247 BSD 2-Clause
Updated copyright holder & removed the last paragraph, which doesn't belong in the BSD 2-Clause license.
2016-02-10 11:32:00 +02:00
Arik Fraimovich
a5168ecc80 Update bootstrap.sh to more recent release 2016-02-09 15:18:58 +02:00
Arik Fraimovich
262ebb3bf1 Merge pull request #820 from getredash/gulp
Switch to Gulp from Grunt for faster builds
2016-02-09 15:15:38 +02:00
Arik Fraimovich
3e58d8798a Copy additional files on gulp build 2016-02-09 15:09:31 +02:00
Arik Fraimovich
bab536aaea Support for embeds in multi-org 2016-02-09 14:59:38 +02:00
Arik Fraimovich
bab4080430 Switch to Gulp from Grunt 2016-02-09 14:59:19 +02:00
Arik Fraimovich
a894f035dd Merge pull request #815 from getredash/fix/cli
Fix CLI issues with recent version
2016-02-09 00:56:16 +02:00
Arik Fraimovich
d4a83e29d4 Fix: delete data source CLI failing when data source has references 2016-02-09 00:03:35 +02:00
Arik Fraimovich
ded4761c8a If start_time is not found, skip metrics collection (probably not running in a real request context) 2016-02-09 00:01:44 +02:00
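A rough sketch of the guard this commit describes, assuming the start time is stashed on Flask's request-scoped `g` object (the attribute and helper names here are hypothetical):

    import time
    from flask import g

    def record_request_duration(metric_name):
        # Hypothetical helper: skip metrics collection when there is no
        # start_time, i.e. we are not inside a real request context.
        start_time = getattr(g, 'start_time', None)
        if start_time is None:
            return
        duration_ms = (time.time() - start_time) * 1000
        print("%s took %.2fms" % (metric_name, duration_ms))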
Arik Fraimovich
3fa143cfb1 Merge pull request #813 from ojarva/shellcheck-fixes
Fix shellcheck issues in bootstrap.sh scripts
2016-02-07 08:49:06 +02:00
Olli Jarva
de01184bbd Small shellcheck updates
Fix shellcheck complaints. These changes are not particularly important,
but spotting new/real issues is easier when checker output is empty by
default.
2016-02-06 16:42:24 +02:00
Arik Fraimovich
635bcc3e9f Pull latest docker image before building 2016-02-03 11:06:09 +02:00
Arik Fraimovich
b6b8daced6 Update Circle: build deps on master branch 2016-02-03 10:20:07 +02:00
Arik Fraimovich
b222f85d88 Add freetds-dev to Dockerfile 2016-02-03 09:06:01 +02:00
Arik Fraimovich
27c3fee345 Merge pull request #808 from joeharris76/master
Feature: Microsoft SQL Server query runner
2016-02-03 08:48:05 +02:00
Joe Harris
8c48ec5508 Cleanup of issues with the SQL Server feature PR 2016-02-02 16:18:21 -05:00
Joe Harris
cc176f5cba Add error handling to the pymssql import 2016-02-02 09:24:58 -05:00
Joe Harris
3a970a00c4 Add Microsoft SQL Server as a data source
Uses `pymssql` which in turn uses `FreeTDS`. Note that the data type
support is somewhat limited (see “datasources” page in docs).
2016-02-01 16:53:52 -05:00
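As a rough usage sketch of the driver behind the new data source (connection details below are placeholders, not values from the PR): `pymssql` connects through FreeTDS, and types with known conversion issues can be coerced to simpler types in the query itself with `CAST()`.

    import pymssql  # requires the FreeTDS C library to be installed first

    # Placeholder connection details, for illustration only.
    conn = pymssql.connect(server="sqlserver.example.com",
                           user="redash_reader", password="secret",
                           database="analytics")
    cursor = conn.cursor()

    # DATE / TIME / DATETIMEOFFSET have known conversion issues, so coerce
    # them in the query, e.g. with CAST().
    cursor.execute("SELECT name, CAST(created_at AS DATETIME) AS created_at FROM users")
    for row in cursor.fetchall():
        print(row)
    conn.close()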
Joe Harris
3b395a05b8 Merge pull request #1 from getredash/master
Pull from origin
2016-01-29 15:55:50 -05:00
Arik Fraimovich
9fa249a519 Update screenshots. 2016-01-27 12:55:52 +02:00
Arik Fraimovich
4e9b60ac82 Merge pull request #794 from getredash/hotfixes
Fixes for #792, #785, #733 and additional logging for execute_query
2016-01-24 12:03:34 +02:00
Arik Fraimovich
7a7e5be166 Fix #733: update migrations to work with new code 2016-01-24 11:57:05 +02:00
Arik Fraimovich
a1eec8490a Add more logging to execute_query 2016-01-24 11:32:44 +02:00
Arik Fraimovich
197bbde788 Fix #785: remove admin check box and direct users to use the groups
admin.
2016-01-24 11:08:02 +02:00
Arik Fraimovich
fed9d80fdb Fix #792: can't grant admin with CLI 2016-01-24 10:58:05 +02:00
Arik Fraimovich
78ba6f2739 Merge pull request #781 from woei66/master
Amazon Linux bootstrap script: check nginx default directory
2016-01-23 16:52:39 +02:00
Arik Fraimovich
cbb84ae3d3 Merge pull request #786 from JohnConnell/patch-1
Docs: instructions for compressed backup.
2016-01-23 16:51:21 +02:00
Arik Fraimovich
8120158119 Merge pull request #782 from shyamgopal/bug-768
Fix: Empty cells in google sheets displayed as datetime values #768
2016-01-23 16:50:44 +02:00
Arik Fraimovich
bd7b60d859 Merge pull request #784 from bobrik/fix-isoformat
Fix json serialization for datetime.timedelta, closes #783
2016-01-23 16:49:46 +02:00
Arik Fraimovich
80c03a5900 Merge pull request #787 from JohnConnell/patch-2
Docs: Updated links to Google's documentation about creating a service account
2016-01-23 16:46:04 +02:00
Arik Fraimovich
77e2d5db9b Merge pull request #790 from tknzk/fix_typo_on_doc
Docs: fix a typo in backup instructions
2016-01-23 16:44:35 +02:00
tknzk
7174dd856e fix a typo. 2016-01-22 17:45:40 +09:00
John Connell
6b5efc9e16 Update datasources.rst
Updated links to Google's documentation about creating a service account.
2016-01-21 13:05:02 -07:00
John Connell
4f95205795 Update maintenance.rst
Added: How to create a compressed backup.
2016-01-21 12:51:13 -07:00
Ivan Babrou
e26ea40c9b Fix json serialization for datetime.timedelta, closes #783 2016-01-21 14:37:36 +00:00
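The branch name (`fix-isoformat`) suggests the serializer treated `timedelta` like a datetime. A minimal sketch of handling it in a custom JSON encoder (not necessarily redash's exact encoder):

    import json
    from datetime import timedelta

    class CustomJSONEncoder(json.JSONEncoder):
        def default(self, o):
            if isinstance(o, timedelta):
                # timedelta has no isoformat(); serialize it as a string instead.
                return str(o)
            return super(CustomJSONEncoder, self).default(o)

    print(json.dumps({"runtime": timedelta(minutes=90)}, cls=CustomJSONEncoder))
    # -> {"runtime": "1:30:00"}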
Arik Fraimovich
24137e87fd Update cloud images references 2016-01-21 14:47:51 +02:00
Shyamgopal Kundapurkar
221ec3a2a1 Fix of #768 2016-01-21 16:35:03 +05:30
David Lin
7081e25fa3 add -y to expect package, check nginx default directory and install to the right directory 2016-01-21 06:37:12 +00:00
Arik Fraimovich
8d126331cf Fix #778: update docs with correct CLI command. 2016-01-20 22:19:43 +02:00
136 changed files with 1738 additions and 1233 deletions

View File

@@ -1,5 +1,4 @@
FROM ubuntu:trusty
MAINTAINER Di Wu <diwu@yelp.com>
# Ubuntu packages
RUN apt-get update && \
@@ -7,7 +6,7 @@ RUN apt-get update && \
# Postgres client
libpq-dev \
# Additional packages required for data sources:
libssl-dev libmysqlclient-dev && \
libssl-dev libmysqlclient-dev freetds-dev && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

LICENSE (19 changed lines)
View File

@@ -1,4 +1,5 @@
Copyright 2013 DoAT. All rights reserved.
Copyright (c) 2013-2016, Arik Fraimovich.
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:
@@ -10,17 +11,13 @@ are permitted provided that the following conditions are met:
this list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED “AS IS” WITHOUT ANY WARRANTIES WHATSOEVER.
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF NON INFRINGEMENT, MERCHANTABILITY AND FITNESS FOR A
PARTICULAR PURPOSE ARE HEREBY DISCLAIMED. IN NO EVENT SHALL DoAT OR CONTRIBUTORS
BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS
BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
The views and conclusions contained in the software and documentation are those of
the authors and should not be interpreted as representing official policies,
either expressed or implied, of DoAT.

View File

@@ -26,7 +26,7 @@ Presto, Google Spreadsheets, Cloudera Impala, Hive and custom scripts.
## Demo
![Screenshots](https://raw.github.com/getredash/redash/screenshots/screenshots.gif)
<img src="https://cloud.githubusercontent.com/assets/71468/12611424/1faf4d6a-c4f5-11e5-89b5-31efc1155d2c.gif" width="60%"/>
You can try out the demo instance: http://demo.redash.io/ (login with any Google account).
@@ -45,7 +45,7 @@ You can try out the demo instance: http://demo.redash.io/ (login with any Google
## Reporting Bugs and Contributing Code
* Want to report a bug or request a feature? Please open [an issue](https://github.com/getredash/redash/issues/new).
* Want to help us build **_re:dash_**? Fork the project and make a pull request. We need all the help we can get!
* Want to help us build **_re:dash_**? Fork the project, edit in a [dev environment](http://docs.redash.io/en/latest/dev/vagrant.html), and make a pull request. We need all the help we can get!
## License

View File

@@ -63,7 +63,7 @@ def upload_asset(release, filepath):
headers = {'Content-Type': 'application/gzip'}
response = requests.post(upload_url, file_content, params={'name': filename}, headers=headers, auth=auth, verify=False)
if response.status_code != 201: # not 200/201/...
if response.status_code != 201: # not 200/201/...
raise exception_from_error('Failed uploading asset', response)
return response

View File

@@ -11,6 +11,7 @@ dependencies:
pre:
- pip install -r requirements_dev.txt
- pip install -r requirements.txt
- if [ "$CIRCLE_BRANCH" = "master" ]; then make deps; fi
cache_directories:
- rd_ui/node_modules/
- rd_ui/app/bower_components/
@@ -21,10 +22,10 @@ deployment:
github_and_docker:
branch: master
commands:
- make deps
- make pack
- make upload
- echo "rd_ui/app" >> .dockerignore
- docker pull redash/redash:latest
- docker build -t redash/redash:$(./manage.py version | sed -e "s/\+/./") .
- docker login -e $DOCKER_EMAIL -u $DOCKER_USER -p $DOCKER_PASS
- docker push redash/redash:$(./manage.py version | sed -e "s/\+/./")

View File

@@ -108,4 +108,3 @@ texinfo_documents = [
author, 'redash', 'One line description of project.',
'Miscellaneous'),
]

View File

@@ -48,7 +48,7 @@ Google BigQuery
- **Options**:
- Project ID (mandatory)
- JSON key file, generated when creating a service account (see `instructions <https://developers.google.com/console/help/new/#serviceaccounts>`__).
- JSON key file, generated when creating a service account (see `instructions <https://developers.google.com/identity/protocols/OAuth2ServiceAccount#creatinganaccount>`__).
- **Additional requirements**:
@@ -145,7 +145,7 @@ Google Spreadsheets
- **Options**:
- JSON key file, generated when creating a service account (see `instructions <https://developers.google.com/console/help/new/#serviceaccounts>`__).
- JSON key file, generated when creating a service account (see `instructions <https://developers.google.com/identity/protocols/OAuth2ServiceAccount#creatinganaccount>`__).
- **Additional requirements**:
@@ -231,3 +231,31 @@ Treasure Data
- Must have account on https://console.treasuredata.com
Documentation: https://docs.treasuredata.com/articles/redash
Microsoft SQL Server
-----
- **Options**:
- Database (mandatory)
- User #TODO: DB users only? What about domain users?
- Password
- Server
- Port
- **Notes**:
- Data type support is currently quite limited.
- Complex and new types are converted to strings in ``re:dash``
- Coerce into simpler types if needed using ``CAST()``
- Known conversion issues for:
- DATE
- TIME
- DATETIMEOFFSET
- **Additional requirements**:
- ``freetds-dev`` C library
- ``pymssql`` python package, requires FreeTDS to be installed first

View File

@@ -76,7 +76,7 @@ query as a template and merge it with parameters taken from query string
or UI (or both).
When the caching facility isn't required (with queries that return in a
reasonable time frame) the implementation can be completly client side
reasonable time frame) the implementation can be completely client side
and the backend can be "blind" to the parameters - it just receives the
final query to execute and returns result.
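The paragraph above describes merging a query template with parameters before the final query is sent to the backend. A naive sketch of such a merge, assuming mustache-style `{{ }}` placeholders (a real implementation must also escape values for the target database):

    import re

    def merge_parameters(query_template, parameters):
        # Replace {{ name }} placeholders with supplied values; leave unknown
        # placeholders untouched. No escaping -- illustration only.
        def substitute(match):
            name = match.group(1)
            return str(parameters.get(name, match.group(0)))
        return re.sub(r"\{\{\s*([^}\s]+)\s*\}\}", substitute, query_template)

    query = "SELECT count(*) FROM events WHERE created_at >= '{{ start_date }}'"
    print(merge_parameters(query, {"start_date": "2016-01-01"}))
    # -> SELECT count(*) FROM events WHERE created_at >= '2016-01-01'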

View File

@@ -48,7 +48,7 @@ How To: Backup your re:dash database and restore it on a different server
.. code::
dbdrop redash
dropdb redash
createdb -T template0 redash
gunzip -c redash_backup.gz | psql redash

docs/settings.rst (new file, 62 lines)
View File

@@ -0,0 +1,62 @@
Settings
########
Much of the functionality of re:dash can be changed with settings. Settings are read by `/redash/settings.py` from environment variables, which (for most installs) can be set in `/opt/redash/current/.env`.
The following is a list of settings and what they control (a sketch of how such values are typically read follows the list):
- **REDASH_NAME**: name of the site, used in page titles, *default "re:dash"*
- **REDASH_REDIS_URL**: *default "redis://localhost:6379/0"*
- **REDASH_PROXIES_COUNT**: *default "1"*
- **REDASH_STATSD_HOST**: *default "127.0.0.1"*
- **REDASH_STATSD_PORT**: *default "8125"*
- **REDASH_STATSD_PREFIX**: *default "redash"*
- **REDASH_DATABASE_URL**: *default "postgresql://postgres"*
- **REDASH_CELERY_BROKER**: *default REDIS_URL*
- **REDASH_CELERY_BACKEND**: *default CELERY_BROKER*
- **REDASH_QUERY_RESULTS_CLEANUP_ENABLED**: *default "true"*
- **REDASH_QUERY_RESULTS_CLEANUP_COUNT**: *default "100"*
- **REDASH_QUERY_RESULTS_CLEANUP_MAX_AGE**: *default "7"*
- **REDASH_AUTH_TYPE**: *default "api_key"*
- **REDASH_PASSWORD_LOGIN_ENABLED**: *default "true"*
- **REDASH_ENFORCE_HTTPS**: *default "false"*
- **REDASH_MULTI_ORG**: *default "false"*
- **REDASH_GOOGLE_APPS_DOMAIN**: *default ""*
- **REDASH_GOOGLE_CLIENT_ID**: *default ""*
- **REDASH_GOOGLE_CLIENT_SECRET**: *default ""*
- **REDASH_SAML_METADATA_URL**: *default ""*
- **REDASH_SAML_CALLBACK_SERVER_NAME**: *default ""*
- **REDASH_STATIC_ASSETS_PATH**: *default "../rd_ui/app/"*
- **REDASH_JOB_EXPIRY_TIME**: *default 3600 * 6*
- **REDASH_COOKIE_SECRET**: *default "c292a0a3aa32397cdb050e233733900f"*
- **REDASH_LOG_LEVEL**: *default "INFO"*
- **REDASH_ANALYTICS**: *default ""*
- **REDASH_MAIL_SERVER**: *default "localhost"*
- **REDASH_MAIL_PORT**: *default 25*
- **REDASH_MAIL_USE_TLS**: *default "false"*
- **REDASH_MAIL_USE_SSL**: *default "false"*
- **REDASH_MAIL_USERNAME**: *default None*
- **REDASH_MAIL_PASSWORD**: *default None*
- **REDASH_MAIL_DEFAULT_SENDER**: *default None*
- **REDASH_MAIL_MAX_EMAILS**: *default None*
- **REDASH_MAIL_ASCII_ATTACHMENTS**: *default "false"*
- **REDASH_HOST**: *default ""*
- **REDASH_HIPCHAT_API_TOKEN**: *default None*
- **REDASH_HIPCHAT_API_URL**: *default None*
- **REDASH_HIPCHAT_ROOM_ID**: *default None*
- **REDASH_WEBHOOK_ENDPOINT**: *default None*
- **REDASH_WEBHOOK_USERNAME**: *default None*
- **REDASH_CORS_ACCESS_CONTROL_ALLOW_ORIGIN**: *default ""*
- **REDASH_CORS_ACCESS_CONTROL_ALLOW_CREDENTIALS**: *default "false"*
- **REDASH_CORS_ACCESS_CONTROL_REQUEST_METHOD**: *default "GET, POST, PUT"*
- **REDASH_CORS_ACCESS_CONTROL_ALLOW_HEADERS**: *default "Content-Type"*
- **REDASH_ENABLED_QUERY_RUNNERS**: *default ",".join(default_query_runners)*
- **REDASH_ADDITIONAL_QUERY_RUNNERS**: *default ""*
- **REDASH_SENTRY_DSN**: *default ""*
- **REDASH_ALLOW_SCRIPTS_IN_USER_INPUT**: disable sanitization of text input, allowing full HTML, *default "true"*
- **REDASH_DATE_FORMAT**: *default "DD/MM/YY"*
- **REDASH_FEATURE_ALLOW_ALL_TO_EDIT**: *default "true"*
- **REDASH_FEATURE_TABLES_PERMISSIONS**: *default "false"*
- **REDASH_VERSION_CEHCK**: *default "true"*
- **REDASH_BIGQUERY_HTTP_TIMEOUT**: *default "600"*
- **REDASH_SCHEMA_RUN_TABLE_SIZE_CALCULATIONS**: *default "false"*
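A short sketch of how environment-driven settings like these are typically read in a settings.py (helper and variable names here are illustrative; redash's actual settings.py may differ):

    import os

    def parse_boolean(value):
        # Interpret common truthy strings coming from environment variables.
        return str(value).lower() in ("true", "yes", "1")

    REDIS_URL = os.environ.get("REDASH_REDIS_URL", "redis://localhost:6379/0")
    QUERY_RESULTS_CLEANUP_ENABLED = parse_boolean(
        os.environ.get("REDASH_QUERY_RESULTS_CLEANUP_ENABLED", "true"))

    print(REDIS_URL, QUERY_RESULTS_CLEANUP_ENABLED)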

View File

@@ -18,17 +18,19 @@ AWS
Launch the instance with from the pre-baked AMI (for small deployments
t2.micro should be enough):
- us-east-1: `ami-752c7f10 <https://console.aws.amazon.com/ec2/home?region=us-east-1#LaunchInstanceWizard:ami=ami-752c7f10>`__
- us-west-1: `ami-b36babf7 <https://console.aws.amazon.com/ec2/home?region=us-west-1#LaunchInstanceWizard:ami=ami-b36babf7>`__
- us-west-2: `ami-a0a04393 <https://console.aws.amazon.com/ec2/home?region=us-west-2#LaunchInstanceWizard:ami=ami-a0a04393>`__
- eu-west-1: `ami-198cb16e <https://console.aws.amazon.com/ec2/home?region=eu-west-1#LaunchInstanceWizard:ami=ami-198cb16e>`__
- eu-central-1: `ami-a81418b5 <https://console.aws.amazon.com/ec2/home?region=eu-central-1#LaunchInstanceWizard:ami=ami-a81418b5>`__
- sa-east-1: `ami-2b52c336 <https://console.aws.amazon.com/ec2/home?region=sa-east-1#LaunchInstanceWizard:ami=ami-2b52c336>`__
- ap-northeast-1: `ami-4898fb48 <https://console.aws.amazon.com/ec2/home?region=ap-northeast-1#LaunchInstanceWizard:ami=ami-4898fb48>`__
- ap-southeast-2: `ami-7559134f <https://console.aws.amazon.com/ec2/home?region=ap-southeast-2#LaunchInstanceWizard:ami=ami-7559134f>`__
- ap-southeast-1: `ami-a0786bf2 <https://console.aws.amazon.com/ec2/home?region=ap-southeast-1#LaunchInstanceWizard:ami=ami-a0786bf2>`__
- us-east-1: `ami-a7ddfbcd <https://console.aws.amazon.com/ec2/home?region=us-east-1#LaunchInstanceWizard:ami=ami-a7ddfbcd>`__
- us-west-1: `ami-269feb46 <https://console.aws.amazon.com/ec2/home?region=us-west-1#LaunchInstanceWizard:ami=ami-269feb46>`__
- us-west-2: `ami-435fba23 <https://console.aws.amazon.com/ec2/home?region=us-west-2#LaunchInstanceWizard:ami=ami-435fba23>`__
- eu-west-1: `ami-b4c277c7 <https://console.aws.amazon.com/ec2/home?region=eu-west-1#LaunchInstanceWizard:ami=ami-b4c277c7>`__
- eu-central-1: `ami-07ced76b <https://console.aws.amazon.com/ec2/home?region=eu-central-1#LaunchInstanceWizard:ami=ami-07ced76b>`__
- sa-east-1: `ami-6e2eaf02 <https://console.aws.amazon.com/ec2/home?region=sa-east-1#LaunchInstanceWizard:ami=ami-6e2eaf02>`__
- ap-northeast-1: `ami-aa5a64c4 <https://console.aws.amazon.com/ec2/home?region=ap-northeast-1#LaunchInstanceWizard:ami=ami-aa5a64c4>`__
- ap-southeast-1: `ami-1c45897f <https://console.aws.amazon.com/ec2/home?region=ap-southeast-1#LaunchInstanceWizard:ami=ami-1c45897f>`__
- ap-southeast-2: `ami-42b79221 <https://console.aws.amazon.com/ec2/home?region=ap-southeast-2#LaunchInstanceWizard:ami=ami-42b79221>`__
When launching the instance make sure to use a security grop, that only allows incoming traffic on: port 22 (SSH), 80 (HTTP) and 443 (HTTPS).
(the above AMIs are of version: 0.9.1)
When launching the instance make sure to use a security group, that **only** allows incoming traffic on: port 22 (SSH), 80 (HTTP) and 443 (HTTPS).
Now proceed to `"Setup" <#setup>`__.
@@ -39,7 +41,7 @@ First, you need to add the images to your account:
.. code:: bash
$ gcloud compute images create "redash-081-b1110" --source-uri gs://redash-images/redash.0.8.1.b1110.tar.gz
$ gcloud compute images create "redash-091-b1377" --source-uri gs://redash-images/redash.0.9.1.b1377.tar.gz
Next you need to launch an instance using this image (n1-standard-1
instance type is recommended). If you plan using re:dash with BigQuery,
@@ -48,13 +50,13 @@ you can use a dedicated image which comes with BigQuery preconfigured
.. code:: bash
$ gcloud compute images create "redash-081-b1110-bq" --source-uri gs://redash-images/redash.0.8.1.b1110-bq.tar.gz
$ gcloud compute images create "redash-091-b1377-bq" --source-uri gs://redash-images/redash.0.9.1.b1377-bq.tar.gz
Note that you need to launch this instance with BigQuery access:
.. code:: bash
$ gcloud compute instances create <your_instance_name> --image redash-081-b1110-bq --scopes storage-ro,bigquery
$ gcloud compute instances create <your_instance_name> --image redash-091-b1377-bq --scopes storage-ro,bigquery
(the same can be done from the web interface, just make sure to enable
BigQuery access)
@@ -113,7 +115,7 @@ file.
.. code::
cd /opt/redash/current
sudo -u redash bin/run ./manage.py set_google_apps_domains {{domains}}
sudo -u redash bin/run ./manage.py org set_google_apps_domains {{domains}}
If you're passing multiple domains, separate them with commas.
@@ -124,8 +126,7 @@ If you're passing multiple domains, separate them with commas.
6. Once you have Google OAuth enabled, you can login using your Google
Apps account. If you want to grant admin permissions to some users,
you can do this by editing the user profile and enabling admin
permission for it.
you can do this by adding them to the admin group (from ``/groups`` page).
7. If you don't use Google OAuth or just need username/password logins,
you can create additional users by opening the ``/users/new`` page.

View File

@@ -60,7 +60,9 @@ DB
Backup re:dash's DB:
--------------------
``sudo -u redash pg_dump > backup_filename.sql``
Uncompressed backup: ``sudo -u redash pg_dump > backup_filename.sql``
Compressed backup: ``sudo -u redash pg_dump redash | gzip > backup_filename.gz``
Version
=======

View File

@@ -4,7 +4,7 @@ CLI to manage redash.
"""
import json
from flask.ext.script import Manager
from flask_script import Manager
from redash import settings, models, __version__
from redash.wsgi import app

View File

@@ -18,4 +18,3 @@ if __name__ == '__main__':
db.database.execute_sql("ALTER TABLE {} ALTER COLUMN {} TYPE timestamp with time zone;".format(*column))
db.close_db(None)

View File

@@ -1,13 +1,31 @@
import json
import jsonschema
from jsonschema import ValidationError
from redash import query_runner
from redash.models import DataSource
def validate_configuration(query_runner_type, configuration_json):
query_runner_class = query_runner.query_runners.get(query_runner_type, None)
if query_runner_class is None:
return False
try:
if isinstance(configuration_json, basestring):
configuration = json.loads(configuration_json)
else:
configuration = configuration_json
jsonschema.validate(configuration, query_runner_class.configuration_schema())
except (ValidationError, ValueError):
return False
return True
def update(data_source):
print "[%s] Old options: %s" % (data_source.name, data_source.options)
if query_runner.validate_configuration(data_source.type, data_source.options):
if validate_configuration(data_source.type, data_source.options):
print "[%s] configuration already valid. skipping." % data_source.name
return
@@ -65,9 +83,9 @@ def update(data_source):
print "[%s] No need to convert type of: %s" % (data_source.name, data_source.type)
print "[%s] New options: %s" % (data_source.name, data_source.options)
data_source.save()
data_source.save(only=data_source.dirty_fields)
if __name__ == '__main__':
for data_source in DataSource.select():
update(data_source)
for data_source in DataSource.select(DataSource.id, DataSource.name, DataSource.type, DataSource.options):
update(data_source)

View File

@@ -23,4 +23,3 @@ if __name__ == '__main__':
db.database.execute_sql("UPDATE widgets SET updated_at = created_at;")
db.close_db(None)

View File

@@ -15,5 +15,3 @@ if __name__ == '__main__':
db.database.execute_sql("UPDATE queries SET last_modified_by_id = user_id;")
db.close_db(None)

View File

@@ -19,5 +19,3 @@ if __name__ == '__main__':
)
db.close_db(None)

View File

@@ -14,14 +14,11 @@ if __name__ == '__main__':
migrator.add_column('users', 'api_key', models.User.api_key),
)
for user in models.User.select():
user.save()
for user in models.User.select(models.User.id, models.User.api_key):
user.save(only=user.dirty_fields)
migrate(
migrator.add_not_null('users', 'api_key')
)
db.close_db(None)

View File

@@ -12,7 +12,3 @@ if __name__ == '__main__':
)
db.close_db(None)

View File

@@ -2,7 +2,7 @@ from redash.models import db, Alert, AlertSubscription
if __name__ == '__main__':
with db.database.transaction():
Alert.create_table()
AlertSubscription.create_table()
Alert.create_table()
AlertSubscription.create_table()
db.close_db(None)

View File

@@ -12,7 +12,7 @@ def convert_p12_to_pem(p12file):
if __name__ == '__main__':
for ds in DataSource.select():
for ds in DataSource.select(DataSource.id, DataSource.type, DataSource.options):
if ds.type == 'bigquery':
options = json.loads(ds.options)
@@ -29,7 +29,7 @@ if __name__ == '__main__':
}
ds.options = json.dumps(new_options)
ds.save()
ds.save(only=ds.dirty_fields)
elif ds.type == 'google_spreadsheets':
options = json.loads(ds.options)
if 'jsonKeyFile' in options:
@@ -41,4 +41,4 @@ if __name__ == '__main__':
}
ds.options = json.dumps(new_options)
ds.save()
ds.save(only=ds.dirty_fields)

View File

@@ -1,6 +1,7 @@
from redash import models
if __name__ == '__main__':
default_group = models.Group.get(models.Group.name=='default')
default_group = models.Group.select(models.Group.id, models.Group.permissions).where(models.Group.name=='default').first()
default_group.permissions.append('list_users')
default_group.save()
default_group.save(only=[models.Group.permissions])

View File

@@ -21,4 +21,3 @@ if __name__ == '__main__':
print "After: ", options
vis.options = json.dumps(options)
vis.save()

View File

@@ -1,3 +1,4 @@
import peewee
from playhouse.migrate import PostgresqlMigrator, migrate
from redash.models import db
@@ -7,8 +8,14 @@ if __name__ == '__main__':
db.connect_db()
migrator = PostgresqlMigrator(db.database)
cursor = db.database.execute_sql("SELECT column_name FROM information_schema.columns WHERE table_name='alerts' and column_name='rearm';")
if cursor.rowcount > 0:
print "Column exists. Skipping."
exit()
with db.database.transaction():
migrate(
migrator.add_column('alerts', 'rearm', models.Alert.rearm),
)
db.close_db(None)

View File

@@ -4,7 +4,7 @@ from redash.models import DataSource
if __name__ == '__main__':
for ds in DataSource.select():
for ds in DataSource.select(DataSource.id, DataSource.type):
if ds.type == 'elasticsearch':
ds.type = 'kibana'
ds.save()
ds.save(only=ds.dirty_fields)

View File

@@ -1,6 +1,6 @@
from redash import models
if __name__ == '__main__':
default_group = models.Group.get(models.Group.name=='default')
default_group = models.Group.select(models.Group.id, models.Group.permissions).where(models.Group.name=='default').first()
default_group.permissions.append('schedule_query')
default_group.save()
default_group.save(only=[models.Group.permissions])

View File

@@ -7,4 +7,3 @@ if __name__ == '__main__':
AlertSubscription.create_table()
db.close_db(None)

View File

@@ -12,7 +12,3 @@ if __name__ == '__main__':
)
db.close_db(None)

View File

@@ -32,4 +32,3 @@ if __name__ == '__main__':
)
db.close_db(None)

View File

@@ -42,4 +42,3 @@ if __name__ == '__main__':
migrate(migrator.drop_column('users', 'old_groups'))
db.close_db(None)

View File

@@ -11,15 +11,17 @@
"latedef": true,
"newcap": true,
"noarg": true,
"quotmark": "single",
"quotmark": false,
"regexp": true,
"undef": true,
"unused": true,
"strict": true,
"strict": false,
"trailing": true,
"smarttabs": true,
"globals": {
"angular": false,
"_": false
"_": false,
"$": false,
"currentUser": false
}
}

View File

@@ -1,409 +0,0 @@
// Generated on 2014-07-30 using generator-angular 0.9.2
'use strict';
// # Globbing
// for performance reasons we're only matching one level down:
// 'test/spec/{,*/}*.js'
// use this if you want to recursively match all subfolders:
// 'test/spec/**/*.js'
module.exports = function (grunt) {
// Load grunt tasks automatically
require('load-grunt-tasks')(grunt);
// Time how long tasks take. Can help when optimizing build times
require('time-grunt')(grunt);
// Configurable paths for the application
var appConfig = {
app: require('./bower.json').appPath || 'app',
dist: 'dist'
};
// Define the configuration for all the tasks
grunt.initConfig({
// Project settings
yeoman: appConfig,
// Watches files for changes and runs tasks based on the changed files
watch: {
bower: {
files: ['bower.json'],
tasks: ['wiredep']
},
js: {
files: ['<%= yeoman.app %>/scripts/{,*/}*.js'],
tasks: ['newer:jshint:all'],
options: {
livereload: '<%= connect.options.livereload %>'
}
},
jsTest: {
files: ['test/spec/{,*/}*.js'],
tasks: ['newer:jshint:test', 'karma']
},
styles: {
files: ['<%= yeoman.app %>/styles/{,*/}*.css'],
tasks: ['newer:copy:styles', 'autoprefixer']
},
gruntfile: {
files: ['Gruntfile.js']
},
livereload: {
options: {
livereload: '<%= connect.options.livereload %>'
},
files: [
'<%= yeoman.app %>/{,*/}*.html',
'.tmp/styles/{,*/}*.css',
'<%= yeoman.app %>/images/{,*/}*.{png,jpg,jpeg,gif,webp,svg}'
]
}
},
// The actual grunt server settings
connect: {
options: {
port: 9000,
// Change this to '0.0.0.0' to access the server from outside.
hostname: 'localhost',
livereload: 35729
},
livereload: {
options: {
open: true,
middleware: function (connect) {
return [
connect.static('.tmp'),
connect().use(
'/bower_components',
connect.static('./bower_components')
),
connect.static(appConfig.app)
];
}
}
},
test: {
options: {
port: 9001,
middleware: function (connect) {
return [
connect.static('.tmp'),
connect.static('test'),
connect().use(
'/bower_components',
connect.static('./bower_components')
),
connect.static(appConfig.app)
];
}
}
},
dist: {
options: {
open: true,
base: '<%= yeoman.dist %>'
}
}
},
// Make sure code styles are up to par and there are no obvious mistakes
jshint: {
options: {
jshintrc: '.jshintrc',
reporter: require('jshint-stylish')
},
all: {
src: [
'Gruntfile.js',
'<%= yeoman.app %>/scripts/{,*/}*.js'
]
},
test: {
options: {
jshintrc: 'test/.jshintrc'
},
src: ['test/spec/{,*/}*.js']
}
},
// Empties folders to start fresh
clean: {
dist: {
files: [{
dot: true,
src: [
'.tmp',
'<%= yeoman.dist %>/{,*/}*',
'!<%= yeoman.dist %>/.git*'
]
}]
},
server: '.tmp'
},
// Add vendor prefixed styles
autoprefixer: {
options: {
browsers: ['last 1 version']
},
dist: {
files: [{
expand: true,
cwd: '.tmp/styles/',
src: '{,*/}*.css',
dest: '.tmp/styles/'
}]
}
},
// Automatically inject Bower components into the app
wiredep: {
options: {
},
app: {
src: ['<%= yeoman.app %>/index.html'],
ignorePath: /\.\.\//
}
},
// Renames files for browser caching purposes
filerev: {
dist: {
src: [
'<%= yeoman.dist %>/scripts/{,*/}*.js',
'<%= yeoman.dist %>/styles/{,*/}*.css',
'<%= yeoman.dist %>/images/{,*/}*.{png,jpg,jpeg,gif,webp,svg}',
'<%= yeoman.dist %>/styles/fonts/*'
]
}
},
// Reads HTML for usemin blocks to enable smart builds that automatically
// concat, minify and revision files. Creates configurations in memory so
// additional tasks can operate on them
useminPrepare: {
html: ['<%= yeoman.app %>/index.html', '<%= yeoman.app %>/login.html', '<%= yeoman.app %>/embed.html'],
options: {
dest: '<%= yeoman.dist %>',
flow: {
html: {
steps: {
js: ['concat', 'uglifyjs'],
css: ['cssmin']
},
post: {}
}
}
}
},
// Performs rewrites based on filerev and the useminPrepare configuration
usemin: {
html: ['<%= yeoman.dist %>/{,*/}*.html'],
css: ['<%= yeoman.dist %>/styles/{,*/}*.css'],
options: {
assetsDirs: ['<%= yeoman.dist %>','<%= yeoman.dist %>/images']
}
},
// The following *-min tasks will produce minified files in the dist folder
// By default, your `index.html`'s <!-- Usemin block --> will take care of
// minification. These next options are pre-configured if you do not wish
// to use the Usemin blocks.
// cssmin: {
// dist: {
// files: {
// '<%= yeoman.dist %>/styles/main.css': [
// '.tmp/styles/{,*/}*.css'
// ]
// }
// }
// },
// uglify: {
// dist: {
// files: {
// '<%= yeoman.dist %>/scripts/scripts.js': [
// '<%= yeoman.dist %>/scripts/scripts.js'
// ]
// }
// }
// },
// concat: {
// dist: {}
// },
svgmin: {
dist: {
files: [{
expand: true,
cwd: '<%= yeoman.app %>/images',
src: '{,*/}*.svg',
dest: '<%= yeoman.dist %>/images'
}]
}
},
htmlmin: {
dist: {
options: {
collapseWhitespace: true,
conservativeCollapse: true,
collapseBooleanAttributes: true,
removeCommentsFromCDATA: true,
removeOptionalTags: true
},
files: [{
expand: true,
cwd: '<%= yeoman.dist %>',
src: ['*.html', 'views/{,*/}*.html'],
dest: '<%= yeoman.dist %>'
}]
}
},
// ngmin tries to make the code safe for minification automatically by
// using the Angular long form for dependency injection. It doesn't work on
// things like resolve or inject so those have to be done manually.
ngmin: {
dist: {
files: [{
expand: true,
cwd: '.tmp/concat/scripts',
src: '*.js',
dest: '.tmp/concat/scripts'
}]
}
},
// Replace Google CDN references
cdnify: {
dist: {
html: ['<%= yeoman.dist %>/*.html']
}
},
// Copies remaining files to places other tasks can use
copy: {
dist: {
files: [{
expand: true,
dot: true,
cwd: '<%= yeoman.app %>',
dest: '<%= yeoman.dist %>',
src: [
'*.{ico,png,txt}',
'.htaccess',
'*.html',
'views/{,*/}*.html',
'images/{,*/}*.{webp}',
'fonts/*'
]
}, {
expand: true,
cwd: '<%= yeoman.app %>/images',
dest: '<%= yeoman.dist %>/images',
src: ['*']
}, {
expand: true,
cwd: '.tmp/images',
dest: '<%= yeoman.dist %>/images',
src: ['generated/*']
}, {
expand: true,
cwd: '<%= yeoman.app %>/bower_components/bootstrap/dist',
src: 'fonts/*',
dest: '<%= yeoman.dist %>'
}, {
expand: true,
cwd: '<%= yeoman.app %>/bower_components/font-awesome',
src: 'fonts/*',
dest: '<%= yeoman.dist %>'
}]
},
styles: {
expand: true,
cwd: '<%= yeoman.app %>/styles',
dest: '.tmp/styles/',
src: '{,*/}*.css'
}
},
// Run some tasks in parallel to speed up the build process
concurrent: {
server: [
'copy:styles'
],
test: [
'copy:styles'
],
dist: [
'copy:styles',
'svgmin'
]
},
// Test settings
karma: {
unit: {
configFile: 'test/karma.conf.js',
singleRun: true
}
}
});
grunt.registerTask('serve', 'Compile then start a connect web server', function (target) {
if (target === 'dist') {
return grunt.task.run(['build', 'connect:dist:keepalive']);
}
grunt.task.run([
'clean:server',
'wiredep',
'concurrent:server',
'autoprefixer',
'connect:livereload',
'watch'
]);
});
grunt.registerTask('server', 'DEPRECATED TASK. Use the "serve" task instead', function (target) {
grunt.log.warn('The `server` task has been deprecated. Use `grunt serve` to start a server.');
grunt.task.run(['serve:' + target]);
});
grunt.registerTask('test', [
'clean:server',
'concurrent:test',
'autoprefixer',
'connect:test',
'karma'
]);
grunt.registerTask('build', [
'clean:dist',
'wiredep',
'useminPrepare',
'concurrent:dist',
'autoprefixer',
'concat',
'ngmin',
'copy:dist',
'cdnify',
'cssmin',
'uglify',
'filerev',
'usemin',
'htmlmin'
]);
grunt.registerTask('default', [
'newer:jshint',
'test',
'build'
]);
};

View File

@@ -4,6 +4,7 @@
<!--[if IE 8]> <html class="no-js lt-ie9" ng-app="redash" ng-controller='MainCtrl'> <![endif]-->
<!--[if gt IE 8]><!--> <html class="no-js" ng-app="redash" ng-controller='EmbedCtrl'> <!--<![endif]-->
<head>
<base href="{{base_href}}">
<title ng-bind="'{{name}} | ' + pageTitle"></title>
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
@@ -36,53 +37,7 @@
<div growl></div>
<div ng-view></div>
<script src="/bower_components/jquery/jquery.js"></script>
<!-- build:js /scripts/embed-plugins.js -->
<script src="/bower_components/angular/angular.js"></script>
<script src="/bower_components/angular-sanitize/angular-sanitize.js"></script>
<script src="/bower_components/jquery-ui/ui/jquery-ui.js"></script>
<script src="/bower_components/bootstrap/js/collapse.js"></script>
<script src="/bower_components/bootstrap/js/modal.js"></script>
<script src="/bower_components/angular-resource/angular-resource.js"></script>
<script src="/bower_components/angular-route/angular-route.js"></script>
<script src="/bower_components/underscore/underscore.js"></script>
<script src="/bower_components/moment/moment.js"></script>
<script src="/bower_components/angular-moment/angular-moment.js"></script>
<script src="/bower_components/codemirror/lib/codemirror.js"></script>
<script src="/bower_components/codemirror/addon/edit/matchbrackets.js"></script>
<script src="/bower_components/codemirror/addon/edit/closebrackets.js"></script>
<script src="/bower_components/codemirror/addon/hint/show-hint.js"></script>
<script src="/bower_components/codemirror/addon/hint/anyword-hint.js"></script>
<script src="/bower_components/codemirror/mode/sql/sql.js"></script>
<script src="/bower_components/codemirror/mode/python/python.js"></script>
<script src="/bower_components/codemirror/mode/javascript/javascript.js"></script>
<script src="/bower_components/gridster/dist/jquery.gridster.js"></script>
<script src="/bower_components/angular-growl/build/angular-growl.js"></script>
<script src="/bower_components/pivottable/dist/pivot.js"></script>
<script src="/bower_components/cornelius/src/cornelius.js"></script>
<script src="/bower_components/mousetrap/mousetrap.js"></script>
<script src="/bower_components/mousetrap/plugins/global-bind/mousetrap-global-bind.js"></script>
<script src="/bower_components/angular-ui-select/dist/select.js"></script>
<script src="/bower_components/underscore.string/lib/underscore.string.js"></script>
<script src="/bower_components/marked/lib/marked.js"></script>
<script src="/bower_components/angular-base64-upload/dist/angular-base64-upload.js"></script>
<script src="/bower_components/plotly/plotly.js"></script>
<script src="/bower_components/angular-plotly/src/angular-plotly.js"></script>
<script src="/scripts/directives/plotly.js"></script>
<script src="/scripts/ng_smart_table.js"></script>
<script src="/bower_components/angular-ui-bootstrap-bower/ui-bootstrap-tpls.js"></script>
<script src="/bower_components/bucky/bucky.js"></script>
<script src="/bower_components/pace/pace.js"></script>
<script src="/bower_components/mustache/mustache.js"></script>
<script src="/bower_components/canvg/rgbcolor.js"></script>
<script src="/bower_components/canvg/StackBlur.js"></script>
<script src="/bower_components/canvg/canvg.js"></script>
<script src="/bower_components/leaflet/dist/leaflet.js"></script>
<script src="/bower_components/angular-bootstrap-show-errors/src/showErrors.js"></script>
<script src="/bower_components/d3/d3.min.js"></script>
<script src="/bower_components/angular-ui-sortable/sortable.js"></script>
<!-- endbuild -->
{% include 'vendor_scripts.html' %}
<!-- build:js({.tmp,app}) /scripts/embed-scripts.js -->
<script src="/scripts/embed.js"></script>

View File

@@ -132,7 +132,11 @@
</div>
{% raw %}
<div class="container-fluid footer">
<div class="visible-print">
<hr>
Source: {{location}}
</div>
<div class="container-fluid footer hidden-print">
<hr/>
<div class="container">
<div class="row">
@@ -147,52 +151,7 @@
</div>
{% endraw %}
<script src="/bower_components/jquery/jquery.js"></script>
<!-- build:js /scripts/plugins.js -->
<script src="/bower_components/angular/angular.js"></script>
<script src="/bower_components/angular-sanitize/angular-sanitize.js"></script>
<script src="/bower_components/jquery-ui/ui/jquery-ui.js"></script>
<script src="/bower_components/bootstrap/js/collapse.js"></script>
<script src="/bower_components/bootstrap/js/modal.js"></script>
<script src="/bower_components/angular-resource/angular-resource.js"></script>
<script src="/bower_components/angular-route/angular-route.js"></script>
<script src="/bower_components/underscore/underscore.js"></script>
<script src="/bower_components/moment/moment.js"></script>
<script src="/bower_components/angular-moment/angular-moment.js"></script>
<script src="/bower_components/codemirror/lib/codemirror.js"></script>
<script src="/bower_components/codemirror/addon/edit/matchbrackets.js"></script>
<script src="/bower_components/codemirror/addon/edit/closebrackets.js"></script>
<script src="/bower_components/codemirror/addon/hint/show-hint.js"></script>
<script src="/bower_components/codemirror/addon/hint/anyword-hint.js"></script>
<script src="/bower_components/codemirror/mode/sql/sql.js"></script>
<script src="/bower_components/codemirror/mode/python/python.js"></script>
<script src="/bower_components/codemirror/mode/javascript/javascript.js"></script>
<script src="/bower_components/gridster/dist/jquery.gridster.js"></script>
<script src="/bower_components/angular-growl/build/angular-growl.js"></script>
<script src="/bower_components/pivottable/dist/pivot.js"></script>
<script src="/bower_components/cornelius/src/cornelius.js"></script>
<script src="/bower_components/mousetrap/mousetrap.js"></script>
<script src="/bower_components/mousetrap/plugins/global-bind/mousetrap-global-bind.js"></script>
<script src="/bower_components/angular-ui-select/dist/select.js"></script>
<script src="/bower_components/underscore.string/lib/underscore.string.js"></script>
<script src="/bower_components/marked/lib/marked.js"></script>
<script src="/bower_components/angular-base64-upload/dist/angular-base64-upload.js"></script>
<script src="/bower_components/plotly/plotly.js"></script>
<script src="/bower_components/angular-plotly/src/angular-plotly.js"></script>
<script src="/scripts/directives/plotly.js"></script>
<script src="/scripts/ng_smart_table.js"></script>
<script src="/bower_components/angular-ui-bootstrap-bower/ui-bootstrap-tpls.js"></script>
<script src="/bower_components/pace/pace.js"></script>
<script src="/bower_components/mustache/mustache.js"></script>
<script src="/bower_components/canvg/rgbcolor.js"></script>
<script src="/bower_components/canvg/StackBlur.js"></script>
<script src="/bower_components/canvg/canvg.js"></script>
<script src="/bower_components/leaflet/dist/leaflet.js"></script>
<script src="/bower_components/angular-bootstrap-show-errors/src/showErrors.js"></script>
<script src="/bower_components/d3/d3.min.js"></script>
<script src="/bower_components/angular-ui-sortable/sortable.js"></script>
<!-- endbuild -->
{% include 'vendor_scripts.html' %}
<!-- build:js({.tmp,app}) /scripts/scripts.js -->
<script src="/scripts/app.js"></script>
@@ -228,6 +187,7 @@
<script>
// TODO: move currentUser & features to be an Angular service
var clientConfig = {{ client_config|safe }};
var basePath = "{{base_href}}";
var currentUser = {{ user|safe }};
var currentOrgSlug = "{{ org_slug }}";

View File

@@ -4,7 +4,6 @@ angular.module('redash', [
'redash.controllers',
'redash.filters',
'redash.services',
'redash.renderers',
'redash.visualization',
'plotly',
'plotly-chart',

View File

@@ -163,6 +163,7 @@
}
});
$scope.location = String(document.location);
$scope.version = clientConfig.version;
$scope.newVersionAvailable = clientConfig.newVersionAvailable && currentUser.hasPermission("admin");

View File

@@ -1,6 +1,7 @@
(function() {
var DashboardCtrl = function($scope, Events, Widget, $routeParams, $location, $http, $timeout, $q, Dashboard) {
$scope.refreshEnabled = false;
$scope.isFullscreen = false;
$scope.refreshRate = 60;
var renderDashboard = function (dashboard) {
@@ -103,6 +104,10 @@
}
}
$scope.toggleFullscreen = function() {
$scope.isFullscreen = !$scope.isFullscreen;
};
$scope.triggerRefresh = function() {
$scope.refreshEnabled = !$scope.refreshEnabled;

View File

@@ -306,12 +306,6 @@
email: $scope.user.email
};
if ($scope.user.admin === true && $scope.user.groups.indexOf("admin") === -1) {
data.groups = $scope.user.groups.concat("admin");
} else if ($scope.user.admin === false && $scope.user.groups.indexOf("admin") !== -1) {
data.groups = _.without($scope.user.groups, "admin");
}
User.save(data, function(user) {
growl.addSuccessMessage("Saved.")
$scope.user = user;

View File

@@ -4,7 +4,7 @@
var directives = angular.module('redash.directives');
// Angular strips data- from the directive, so data-source-form becomes sourceForm...
directives.directive('sourceForm', ['$http', 'growl', function ($http, growl) {
directives.directive('sourceForm', ['$http', 'growl', '$q', function ($http, growl, $q) {
return {
restrict: 'E',
replace: true,
@@ -34,7 +34,10 @@
});
});
$http.get('api/data_sources/types').success(function (types) {
var typesPromise = $http.get('api/data_sources/types');
$q.all([typesPromise, $scope.dataSource.$promise]).then(function(responses) {
var types = responses[0].data;
setType(types);
$scope.dataSourceTypes = types;

View File

@@ -40,6 +40,19 @@
}
}]);
directives.directive('hashLink', ['$location', function($location) {
return {
restrict: 'A',
scope: {
'hash': '@'
},
link: function (scope, element) {
var basePath = $location.path().substring(1);
element[0].href = basePath + "#" + scope.hash;
}
};
}]);
directives.directive('rdTab', ['$location', function ($location) {
return {
restrict: 'E',

View File

@@ -17,21 +17,22 @@
'Pink': '#FFC0CB',
'Dark Blue': '#00008b'
};
var ColorPaletteArray = _.values(ColorPalette)
var ColorPaletteArray = _.values(ColorPalette);
var fillXValues = function(seriesList) {
var xValues = _.uniq(_.flatten(_.pluck(seriesList, 'x')));
xValues.sort();
var xValues = _.sortBy(_.union.apply(_, _.pluck(seriesList, 'x')), _.identity);
_.each(seriesList, function(series) {
series.x.sort();
series.x = _.sortBy(series.x, _.identity);
_.each(xValues, function(value, index) {
if (series.x[index] != value) {
if (series.x[index] !== value) {
series.x.splice(index, 0, value);
series.y.splice(index, 0, 0);
series.y.splice(index, 0, null);
}
});
});
}
};
var normalAreaStacking = function(seriesList) {
fillXValues(seriesList);
@@ -46,11 +47,13 @@
seriesList[i].y[j] += sum;
}
}
}
};
var percentAreaStacking = function(seriesList) {
if (seriesList.length == 0)
if (seriesList.length === 0) {
return;
}
fillXValues(seriesList);
_.each(seriesList, function(series) {
series.text = [];
@@ -61,20 +64,24 @@
for(var j = 0; j < seriesList.length; j++) {
sum += seriesList[j].y[i];
}
for(var j = 0; j < seriesList.length; j++) {
var value = seriesList[j].y[i] / sum * 100;
seriesList[j].text.push('Value: ' + seriesList[j].y[i] + '<br>Relative: ' + value.toFixed(2) + '%');
seriesList[j].y[i] = value;
if (j > 0)
if (j > 0) {
seriesList[j].y[i] += seriesList[j-1].y[i];
}
}
}
}
};
var percentBarStacking = function(seriesList) {
if (seriesList.length == 0)
if (seriesList.length === 0) {
return;
}
fillXValues(seriesList);
_.each(seriesList, function(series) {
series.text = [];
@@ -95,7 +102,7 @@
var normalizeValue = function(value) {
if (moment.isMoment(value)) {
return value.format("YYYY-MM-DD HH:MM:SS.ssssss");
return value.format("YYYY-MM-DD HH:mm:ss");
}
return value;
}
@@ -111,32 +118,34 @@
series: "=",
minHeight: "="
},
link: function (scope, element, attrs) {
link: function (scope) {
var getScaleType = function(scale) {
if (scale == 'datetime')
if (scale === 'datetime') {
return 'date';
if (scale == 'logarithmic')
}
if (scale === 'logarithmic') {
return 'log';
}
return scale;
}
};
var setType = function(series, type) {
if (type == 'column') {
series['type'] = 'bar';
} else if (type == 'line') {
series['mode'] = 'lines';
} else if (type == 'area') {
series['fill'] = scope.options.series.stacking == null ? 'tozeroy' : 'tonexty';
series['mode'] = 'lines';
} else if (type == 'scatter') {
series['type'] = 'scatter';
series['mode'] = 'markers';
if (type === 'column') {
series.type = 'bar';
} else if (type === 'line') {
series.mode = 'lines';
} else if (type === 'area') {
series.fill = scope.options.series.stacking === null ? 'tozeroy' : 'tonexty';
series.mode = 'lines';
} else if (type === 'scatter') {
series.type = 'scatter';
series.mode = 'markers';
}
}
};
var getColor = function(index) {
return ColorPaletteArray[index % ColorPaletteArray.length];
}
};
var bottomMargin = 50,
pixelsPerLegendRow = 21;
@@ -148,10 +157,10 @@
delete scope.layout.yaxis;
delete scope.layout.yaxis2;
if (scope.options.globalSeriesType == 'pie') {
if (scope.options.globalSeriesType === 'pie') {
var hasX = _.contains(_.values(scope.options.columnMapping), 'x');
var rows = scope.series.length > 2 ? 2 : 1;
var cellsInRow = Math.ceil(scope.series.length / rows)
var cellsInRow = Math.ceil(scope.series.length / rows);
var cellWidth = 1 / cellsInRow;
var cellHeight = 1 / rows;
var xPadding = 0.02;
@@ -176,29 +185,54 @@
scope.layout.margin.b = scope.layout.height - (scope.minHeight - bottomMargin);
return;
}
scope.layout.height = Math.max(scope.minHeight, pixelsPerLegendRow * scope.series.length);
scope.layout.margin.b = scope.layout.height - (scope.minHeight - bottomMargin);
var hasY2 = false;
var sortX = scope.options.sortX === true || scope.options.sortX === undefined;
var useUnifiedXaxis = sortX && scope.options.xAxis.type === 'category';
var unifiedX = null;
if (useUnifiedXaxis) {
unifiedX = _.sortBy(_.union.apply(_, _.map(scope.series, function(s) { return _.pluck(s.data, 'x'); })), _.identity);
}
_.each(scope.series, function(series, index) {
var seriesOptions = scope.options.seriesOptions[series.name] || {};
var plotlySeries = {x: [],
y: [],
name: seriesOptions.name || series.name,
marker: {color: seriesOptions.color ? seriesOptions.color : getColor(index)}};
if (seriesOptions.yAxis == 1 && (scope.options.series.stacking == null || seriesOptions.type == 'line')) {
if (seriesOptions.yAxis === 1 && (scope.options.series.stacking === null || seriesOptions.type === 'line')) {
hasY2 = true;
plotlySeries.yaxis = 'y2';
}
setType(plotlySeries, seriesOptions.type);
var data = series.data;
if (scope.options.sortX) {
if (sortX) {
data = _.sortBy(data, 'x');
}
_.each(data, function(row) {
plotlySeries.x.push(normalizeValue(row.x));
plotlySeries.y.push(normalizeValue(row.y));
});
scope.data.push(plotlySeries)
if (useUnifiedXaxis && index === 0) {
var values = {};
_.each(data, function(row) {
values[row.x] = row.y;
});
_.each(unifiedX, function(x) {
plotlySeries.x.push(normalizeValue(x));
plotlySeries.y.push(normalizeValue(values[x] || null));
});
} else {
_.each(data, function(row) {
plotlySeries.x.push(normalizeValue(row.x));
plotlySeries.y.push(normalizeValue(row.y));
});
}
scope.data.push(plotlySeries);
});
var getTitle = function(axis) {
@@ -206,7 +240,7 @@
return axis.title.text;
}
return null;
}
};
scope.layout.xaxis = {title: getTitle(scope.options.xAxis),
type: getScaleType(scope.options.xAxis.type)};
@@ -225,20 +259,21 @@
} else {
delete scope.layout.yaxis2;
}
if (scope.options.series.stacking == 'normal') {
if (scope.options.series.stacking === 'normal') {
scope.layout.barmode = 'stack';
if (scope.options.globalSeriesType == 'area') {
if (scope.options.globalSeriesType === 'area') {
normalAreaStacking(scope.data);
}
} else if (scope.options.series.stacking == 'percent') {
} else if (scope.options.series.stacking === 'percent') {
scope.layout.barmode = 'stack';
if (scope.options.globalSeriesType == 'area') {
if (scope.options.globalSeriesType === 'area') {
percentAreaStacking(scope.data);
} else if (scope.options.globalSeriesType == 'column') {
} else if (scope.options.globalSeriesType === 'column') {
percentBarStacking(scope.data);
}
}
}
};
scope.$watch('series', redraw);
scope.$watch('options', redraw, true);
@@ -246,6 +281,6 @@
scope.plotlyOptions = {showLink: false, displaylogo: false};
scope.data = [];
}
}
};
});
})();
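
The fillXValues and unified x-axis changes above build one sorted union of every x value and back-fill the gaps with null so that category series line up when stacked. The idea, re-expressed as a small Python sketch (illustrative only, not the shipped JavaScript):

    def unify_x(series_list):
        # Sorted union of every x value appearing in any series.
        unified_x = sorted(set(x for s in series_list for x in s['x']))
        for s in series_list:
            existing = dict(zip(s['x'], s['y']))
            s['x'] = list(unified_x)
            # Missing points become None (null in the directive), not 0.
            s['y'] = [existing.get(x) for x in unified_x]
        return series_list

    unify_x([{'x': ['a', 'c'], 'y': [1, 3]},
             {'x': ['b', 'c'], 'y': [2, 5]}])
    # both series now share x = ['a', 'b', 'c'], with None where a point was absent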

View File

@@ -8,7 +8,7 @@
'query': '=',
'visualization': '=?'
},
template: '<small><span class="glyphicon glyphicon-link"></span></small> <a ng-href="{{link}}" class="query-link">{{query.name}}</a>',
template: '<a ng-href="{{link}}" class="query-link">{{query.name}}</a>',
link: function(scope, element) {
scope.link = 'queries/' + scope.query.id;
if (scope.visualization) {
@@ -38,10 +38,12 @@
}
}
function queryResultCSVLink() {
function queryResultLink() {
return {
restrict: 'A',
link: function (scope, element) {
link: function (scope, element, attrs) {
var fileType = attrs.fileType ? attrs.fileType : "csv";
scope.$watch('queryResult && queryResult.getData()', function(data) {
if (!data) {
return;
@@ -50,8 +52,8 @@
if (scope.queryResult.getId() == null) {
element.attr('href', '');
} else {
element.attr('href', 'api/queries/' + scope.query.id + '/results/' + scope.queryResult.getId() + '.csv');
element.attr('download', scope.query.name.replace(" ", "_") + moment(scope.queryResult.getUpdatedAt()).format("_YYYY_MM_DD") + ".csv");
element.attr('href', 'api/queries/' + scope.query.id + '/results/' + scope.queryResult.getId() + '.' + fileType);
element.attr('download', scope.query.name.replace(" ", "_") + moment(scope.queryResult.getUpdatedAt()).format("_YYYY_MM_DD") + "." + fileType);
}
});
}
@@ -286,7 +288,7 @@
angular.module('redash.directives')
.directive('queryLink', queryLink)
.directive('querySourceLink', querySourceLink)
.directive('queryResultLink', queryResultCSVLink)
.directive('queryResultLink', queryResultLink)
.directive('queryEditor', queryEditor)
.directive('queryRefreshSelect', queryRefreshSelect)
.directive('queryTimePicker', queryTimePicker)

View File

@@ -4,7 +4,6 @@ angular.module('redash', [
'redash.controllers',
'redash.filters',
'redash.services',
'redash.renderers',
'redash.visualization',
'plotly',
'plotly-chart',

View File

@@ -69,6 +69,12 @@ angular.module('redash.filters', []).
}
})
.filter('dateTime', function() {
return function(value) {
return moment(value).format(clientConfig.dateTimeFormat);
}
})
.filter('linkify', function () {
return function (text) {
return text.replace(urlPattern, "$1<a href='$2' target='_blank'>$2</a>");

View File

@@ -327,13 +327,15 @@
QueryResult.prototype.prepareFilters = function () {
var filters = [];
var filterTypes = ['filter', 'multi-filter', 'multiFilter'];
_.each(this.getColumnNames(), function (col) {
var type = col.split('::')[1] || col.split('__')[1];
_.each(this.getColumns(), function (col) {
var name = col.name;
var type = name.split('::')[1] || name.split('__')[1];
if (_.contains(filterTypes, type)) {
// filter found
var filter = {
name: col,
friendlyName: this.getColumnFriendlyName(col),
name: name,
friendlyName: this.getColumnFriendlyName(name),
column: col,
values: [],
multiple: (type=='multiFilter') || (type=='multi-filter')
}

View File

@@ -6,7 +6,7 @@
defaultOptions: {},
skipTypes: false,
editorTemplate: null
}
};
this.registerVisualization = function (config) {
var visualization = _.extend({}, defaultConfig, config);
@@ -21,11 +21,10 @@
if (!config.skipTypes) {
this.visualizationTypes[config.name] = config.type;
}
;
};
this.getSwitchTemplate = function (property) {
var pattern = /(<[a-zA-Z0-9-]*?)( |>)/
var pattern = /(<[a-zA-Z0-9-]*?)( |>)/;
var mergedTemplates = _.reduce(this.visualizations, function (templates, visualization) {
if (visualization[property]) {
@@ -41,7 +40,7 @@
mergedTemplates = '<div ng-switch on="visualization.type">' + mergedTemplates + "</div>";
return mergedTemplates;
}
};
this.$get = ['$resource', function ($resource) {
var Visualization = $resource('api/visualizations/:id', {id: '@id'});
@@ -64,12 +63,12 @@
template: '<small>{{name}}</small>',
replace: false,
link: function (scope) {
if (Visualization.visualizations[scope.visualization.type].name != scope.visualization.name) {
if (Visualization.visualizations[scope.visualization.type].name !== scope.visualization.name) {
scope.name = scope.visualization.name;
}
}
}
}
};
};
var VisualizationRenderer = function ($location, Visualization) {
return {
@@ -90,7 +89,7 @@
}
});
}
}
};
};
var VisualizationOptionsEditor = function (Visualization) {
@@ -98,15 +97,36 @@
restrict: 'E',
template: Visualization.editorTemplate,
replace: false
}
};
};
var Filters = function () {
return {
restrict: 'E',
templateUrl: '/views/visualizations/filters.html'
}
}
};
};
var FilterValueFilter = function() {
return function(value, filter) {
if (_.isArray(value)) {
value = value[0];
}
// TODO: deduplicate code with table.js:
if (filter.column.type === 'date') {
if (value && moment.isMoment(value)) {
return value.format(clientConfig.dateFormat);
}
} else if (filter.column.type === 'datetime') {
if (value && moment.isMoment(value)) {
return value.format(clientConfig.dateTimeFormat);
}
}
return value;
};
};
var EditVisualizationForm = function (Events, Visualization, growl) {
return {
@@ -120,7 +140,7 @@
openEditor: '@',
onNewSuccess: '=?'
},
link: function (scope, element, attrs) {
link: function (scope) {
scope.editRawOptions = currentUser.hasPermission('edit_raw_chart');
scope.visTypes = Visualization.visualizationTypes;
@@ -131,7 +151,7 @@
'description': '',
'options': Visualization.defaultVisualization.defaultOptions
};
}
};
if (!scope.visualization) {
var unwatch = scope.$watch('query.id', function (queryId) {
@@ -145,14 +165,13 @@
scope.$watch('visualization.type', function (type, oldType) {
// if not edited by user, set name to match type
if (type && oldType != type && scope.visualization && !scope.visForm.name.$dirty) {
if (type && oldType !== type && scope.visualization && !scope.visForm.name.$dirty) {
scope.visualization.name = _.string.titleize(scope.visualization.type);
}
if (type && oldType != type && scope.visualization) {
if (type && oldType !== type && scope.visualization) {
scope.visualization.options = Visualization.visualizations[scope.visualization.type].defaultOptions;
}
});
scope.submit = function () {
@@ -183,15 +202,44 @@
});
};
}
}
};
};
var EmbedCode = function () {
return {
restrict: 'E',
scope: {
visualization: '=',
query: '='
},
template:
'<div class="col-lg-8 embed-code">' +
'<i class="fa fa-code" ng-click="showCode = showCode==true ? false : true;"></i>' +
'<div ng-show="showCode">' +
'<span class="text-muted">Embed code for this visualization: <small>(height should be adjusted)</small></span>' +
'<code>&lt;iframe src="{{ embedUrl }}"<br/>' +
'&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;' +
'width="720" height="391"&gt;&lt;/iframe&gt;</code>' +
'</div>' +
'</div>',
replace: true,
link: function (scope) {
scope.$watch('visualization', function(visualization) {
if (visualization) {
scope.embedUrl = basePath + 'embed/query/' + scope.query.id + '/visualization/' + scope.visualization.id + '?api_key=' + scope.query.api_key;
}
});
}
};
};
angular.module('redash.visualization', [])
.provider('Visualization', VisualizationProvider)
.directive('visualizationRenderer', ['$location', 'Visualization', VisualizationRenderer])
.directive('visualizationOptionsEditor', ['Visualization', VisualizationOptionsEditor])
.directive('visualizationName', ['Visualization', VisualizationName])
.directive('embedCode', EmbedCode)
.directive('filters', Filters)
.directive('editVisulatizationForm', ['Events', 'Visualization', 'growl', EditVisualizationForm])
.filter('filterValue', FilterValueFilter)
.directive('editVisulatizationForm', ['Events', 'Visualization', 'growl', EditVisualizationForm]);
})();

View File

@@ -40,21 +40,22 @@
var reloadChart = function() {
reloadData();
$scope.plotlyOptions = $scope.options;
}
};
var reloadData = function() {
if (angular.isDefined($scope.queryResult)) {
$scope.chartSeries = _.sortBy($scope.queryResult.getChartData($scope.options.columnMapping),
function(series) {
if ($scope.options.seriesOptions[series.name])
if ($scope.options.seriesOptions[series.name]) {
return $scope.options.seriesOptions[series.name].zIndex;
}
return 0;
});
}
}
};
$scope.$watch('options', reloadChart, true)
$scope.$watch('queryResult && queryResult.getData()', reloadData)
$scope.$watch('options', reloadChart, true);
$scope.$watch('queryResult && queryResult.getData()', reloadData);
}]
};
});
@@ -88,7 +89,7 @@
_.each(scope.options.seriesOptions, function(options) {
options.type = scope.options.globalSeriesType;
});
}
};
scope.xAxisScales = ['datetime', 'linear', 'logarithmic', 'category'];
scope.yAxisScales = ['linear', 'logarithmic'];
@@ -130,12 +131,17 @@
});
};
scope.$watch('options.columnMapping', refreshSeries, true);
scope.$watch('options.columnMapping', function() {
if (scope.queryResult.status === "done") {
refreshSeries();
}
}, true);
scope.$watch(function() {return [scope.queryResult.getId(), scope.queryResult.status]}, function(changed) {
if (!changed[0]) {
scope.$watch(function() {return [scope.queryResult.getId(), scope.queryResult.status];}, function(changed) {
if (!changed[0] || changed[1] !== "done") {
return;
}
refreshColumnsAndForm();
refreshSeries();
}, true);

View File

@@ -1,29 +1,64 @@
var renderers = angular.module('redash.renderers', []);
(function() {
var module = angular.module('redash.visualization');
renderers.directive('pivotTableRenderer', function () {
module.directive('pivotTableRenderer', function () {
return {
restrict: 'E',
scope: {
queryResult: '='
},
template: "",
replace: false,
link: function($scope, element, attrs) {
$scope.$watch('queryResult && queryResult.getData()', function (data) {
if (!data) {
return;
}
restrict: 'E',
scope: {
queryResult: '=',
visualization: '='
},
template: "",
replace: false,
link: function($scope, element) {
$scope.$watch('queryResult && queryResult.getData()', function (data) {
if (!data) {
return;
}
if ($scope.queryResult.getData() == null) {
} else {
// We need to give the pivot table its own copy of the data, because its change
// it which interferes with other visualizations.
var data = $.extend(true, [], $scope.queryResult.getData());
$(element).pivotUI(data, {
renderers: $.pivotUtilities.renderers
}, true);
if ($scope.queryResult.getData() === null) {
} else {
// We need to give the pivot table its own copy of the data, because it changes
// it which interferes with other visualizations.
data = $.extend(true, [], $scope.queryResult.getRawData());
var options = {
renderers: $.pivotUtilities.renderers,
onRefresh: function(config) {
var configCopy = $.extend(true, {}, config);
//delete some values which are functions
delete configCopy.aggregators;
delete configCopy.renderers;
//delete some bulky default values
delete configCopy.rendererOptions;
delete configCopy.localeStrings;
if ($scope.visualization) {
$scope.visualization.options = configCopy;
}
});
}
}
});
}
};
if ($scope.visualization) {
$.extend(options, $scope.visualization.options);
}
$(element).pivotUI(data, options, true);
}
});
}
};
});
module.config(['VisualizationProvider', function (VisualizationProvider) {
var editTemplate = '<div/>';
var defaultOptions = {
};
VisualizationProvider.registerVisualization({
type: 'PIVOT',
name: 'Pivot Table',
renderTemplate: '<pivot-table-renderer visualization="visualization" query-result="queryResult"></pivot-table-renderer>',
editorTemplate: editTemplate,
defaultOptions: defaultOptions
});
}]);
})();
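
The onRefresh handler above persists the current pivot configuration into visualization.options after dropping keys that hold functions or bulky defaults, so only serializable state is saved. The same idea as a short Python sketch (persist_options is a hypothetical stand-in for saving the visualization; the key names come from the diff):

    import copy

    # Function-valued or bulky keys that should not be serialized.
    NON_SERIALIZABLE = ('aggregators', 'renderers', 'rendererOptions', 'localeStrings')

    def pivot_config_for_storage(config):
        config_copy = copy.deepcopy(config)
        for key in NON_SERIALIZABLE:
            config_copy.pop(key, None)
        return config_copy

    # persist_options(visualization_id, pivot_config_for_storage(current_config))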

View File

@@ -2,6 +2,10 @@ body {
padding-top: 70px;
}
a[ng-click]{
cursor: pointer;
}
a.link {
cursor: pointer;
}

View File

@@ -0,0 +1,45 @@
<!-- build:js /scripts/plugins.js -->
<script src="/bower_components/jquery/jquery.js"></script>
<script src="/bower_components/angular/angular.js"></script>
<script src="/bower_components/angular-sanitize/angular-sanitize.js"></script>
<script src="/bower_components/jquery-ui/ui/jquery-ui.js"></script>
<script src="/bower_components/bootstrap/js/collapse.js"></script>
<script src="/bower_components/bootstrap/js/modal.js"></script>
<script src="/bower_components/angular-resource/angular-resource.js"></script>
<script src="/bower_components/angular-route/angular-route.js"></script>
<script src="/bower_components/underscore/underscore.js"></script>
<script src="/bower_components/moment/moment.js"></script>
<script src="/bower_components/angular-moment/angular-moment.js"></script>
<script src="/bower_components/codemirror/lib/codemirror.js"></script>
<script src="/bower_components/codemirror/addon/edit/matchbrackets.js"></script>
<script src="/bower_components/codemirror/addon/edit/closebrackets.js"></script>
<script src="/bower_components/codemirror/addon/hint/show-hint.js"></script>
<script src="/bower_components/codemirror/addon/hint/anyword-hint.js"></script>
<script src="/bower_components/codemirror/mode/sql/sql.js"></script>
<script src="/bower_components/codemirror/mode/python/python.js"></script>
<script src="/bower_components/codemirror/mode/javascript/javascript.js"></script>
<script src="/bower_components/gridster/dist/jquery.gridster.js"></script>
<script src="/bower_components/angular-growl/build/angular-growl.js"></script>
<script src="/bower_components/pivottable/dist/pivot.js"></script>
<script src="/bower_components/cornelius/src/cornelius.js"></script>
<script src="/bower_components/mousetrap/mousetrap.js"></script>
<script src="/bower_components/mousetrap/plugins/global-bind/mousetrap-global-bind.js"></script>
<script src="/bower_components/angular-ui-select/dist/select.js"></script>
<script src="/bower_components/underscore.string/lib/underscore.string.js"></script>
<script src="/bower_components/marked/lib/marked.js"></script>
<script src="/bower_components/angular-base64-upload/dist/angular-base64-upload.js"></script>
<script src="/bower_components/plotly/plotly.js"></script>
<script src="/bower_components/angular-plotly/src/angular-plotly.js"></script>
<script src="/scripts/directives/plotly.js"></script>
<script src="/scripts/ng_smart_table.js"></script>
<script src="/bower_components/angular-ui-bootstrap-bower/ui-bootstrap-tpls.js"></script>
<script src="/bower_components/pace/pace.js"></script>
<script src="/bower_components/mustache/mustache.js"></script>
<script src="/bower_components/canvg/rgbcolor.js"></script>
<script src="/bower_components/canvg/StackBlur.js"></script>
<script src="/bower_components/canvg/canvg.js"></script>
<script src="/bower_components/leaflet/dist/leaflet.js"></script>
<script src="/bower_components/angular-bootstrap-show-errors/src/showErrors.js"></script>
<script src="/bower_components/d3/d3.js"></script>
<script src="/bower_components/angular-ui-sortable/sortable.js"></script>
<!-- endbuild -->

View File

@@ -6,8 +6,9 @@
<h2 id="dashboard_title">
{{dashboard.name}}
<span ng-if="!dashboard.is_archived">
<span ng-if="!dashboard.is_archived" class="hidden-print">
<button type="button" class="btn btn-default btn-xs" ng-class="{active: refreshEnabled}" tooltip="Enable/Disable Auto Refresh" ng-click="triggerRefresh()"><span class="glyphicon glyphicon-refresh"></span></button>
<button type="button" class="btn btn-default btn-xs" ng-class="{active: isFullscreen}" tooltip="Enable/Disable Fullscreen display" ng-click="toggleFullscreen()"><span class="glyphicon glyphicon-picture"></span></button>
<div class="btn-group" role="group" ng-show="dashboard.canEdit()">
<button type="button" class="btn btn-default btn-xs" data-toggle="modal" href="#edit_dashboard_dialog" tooltip="Edit Dashboard (Name/Layout)"><span
class="glyphicon glyphicon-cog"></span></button>
@@ -20,8 +21,7 @@
</h2>
<filters ng-if="dashboard.dashboard_filters_enabled"></filters>
</div>
<div class="container" id="dashboard">
<div ng-class="isFullscreen ? 'container-fluid' : 'container'" id="dashboard">
<div ng-repeat="row in dashboard.widgets" class="row">
<div ng-repeat="widget in row" class="col-lg-{{widget.width | colWidth}}"
ng-controller='WidgetCtrl'>
@@ -29,11 +29,15 @@
<div class="panel panel-default" ng-if="type=='visualization'">
<div class="panel-heading">
<h3 class="panel-title">
<p>
<p class="hidden-print">
<span ng-hide="currentUser.hasPermission('view_query')">{{query.name}}</span>
<query-link query="query" visualization="widget.visualization" ng-show="currentUser.hasPermission('view_query')"></query-link>
<visualization-name visualization="widget.visualization"/>
</p>
<p class="visible-print">
{{query.name}}
<visualization-name visualization="widget.visualization"/>
</p>
<div class="text-muted" ng-bind-html="query.description | markdown"></div>
</h3>
</div>
@@ -41,19 +45,26 @@
<visualization-renderer visualization="widget.visualization" query-result="queryResult"></visualization-renderer>
<div class="panel-footer">
<span class="label label-default"
<span class="label label-default hidden-print"
tooltip="(query runtime: {{queryResult.getRuntime() | durationHumanize}})"
tooltip-placement="bottom">Updated: <span am-time-ago="queryResult.getUpdatedAt()"></span></span>
<span class="pull-right">
<a class="btn btn-default btn-xs" ng-href="queries/{{query.id}}#{{widget.visualization.id}}" ng-show="currentUser.hasPermission('view_query')"><span class="glyphicon glyphicon-link"></span></a>
<button type="button" class="btn btn-default btn-xs" ng-show="dashboard.canEdit()" ng-click="deleteWidget()" title="Remove Widget"><span class="glyphicon glyphicon-trash"></span></button>
<span class="visible-print">
Updated: {{queryResult.getUpdatedAt() | dateTime}}
</span>
<span class="pull-right">
<a class="btn btn-default btn-xs" ng-disabled="!queryResult.getData()" query-result-link target="_self">
<span class="glyphicon glyphicon-cloud-download"></span>
<span class="pull-right hidden-print">
<div class="btn-group">
<a class="btn btn-default btn-xs" ng-disabled="!queryResult.getData()" query-result-link target="_self" title="Download as CSV File">
<span class="fa fa-file-o"></span>
</a>
<a class="btn btn-default btn-xs" ng-disabled="!queryResult.getData()" file-type="xlsx" query-result-link target="_self" title="Download as Excel File">
<i class="fa fa-file-excel-o"></i>
</a>
</div>
<a class="btn btn-default btn-xs" ng-href="queries/{{query.id}}#{{widget.visualization.id}}" ng-show="currentUser.hasPermission('view_query')"><span class="glyphicon glyphicon-link"></span></a>
<button type="button" class="btn btn-default btn-xs" ng-show="dashboard.canEdit()" ng-click="deleteWidget()" title="Remove Widget"><span class="glyphicon glyphicon-trash"></span></button>
</span>
</div>
</div>

View File

@@ -4,7 +4,7 @@
<div class="row voffset1">
<div class="col-md-12">
<p ng-if="currentUser.hasPermission('admin')">
<a href="#" ng-click="newGroup()" class="btn btn-default"><i class="fa fa-plus"></i> New Group</a>
<a ng-click="newGroup()" class="btn btn-default"><i class="fa fa-plus"></i> New Group</a>
</p>
<smart-table rows="groups" columns="gridColumns"
@@ -12,4 +12,4 @@
class="table table-condensed table-hover"></smart-table>
</div>
</div>
</div>
</div>

View File

@@ -42,8 +42,8 @@
</button>
<ul class="dropdown-menu" role="menu">
<li><a href="#" ng-click="changePermission(dataSource, false)"><small ng-if="!dataSource.view_only"><span class="glyphicon glyphicon-ok"/></small> Full Access<br/></a></li>
<li><a href="#" ng-click="changePermission(dataSource, true)"><small ng-if="dataSource.view_only"><span class="glyphicon glyphicon-ok"/></small> View Only</a></li>
<li><a ng-click="changePermission(dataSource, false)"><small ng-if="!dataSource.view_only"><span class="glyphicon glyphicon-ok"/></small> Full Access<br/></a></li>
<li><a ng-click="changePermission(dataSource, true)"><small ng-if="dataSource.view_only"><span class="glyphicon glyphicon-ok"/></small> View Only</a></li>
</ul>
</div>
@@ -54,4 +54,4 @@
</table>
</div>
</div>
</div>
</div>

View File

@@ -146,7 +146,7 @@
<p>
<span class="glyphicon glyphicon-refresh"></span>
<span class="text-muted">Refresh Schedule</span>
<a href="" ng-click="openScheduleForm()">{{query.schedule | scheduleHumanize}}</a>
<a ng-click="openScheduleForm()">{{query.schedule | scheduleHumanize}}</a>
</p>
<p>
@@ -158,10 +158,25 @@
<hr>
<p>
<a class="btn btn-primary btn-sm" ng-disabled="queryExecuting || !queryResult.getData()" query-result-link target="_self">
<span class="glyphicon glyphicon-cloud-download"></span>
<span>Download Dataset</span>
</a>
<p>
<div class="btn-group" dropdown>
<button type="button" class="btn btn-primary btn-sm dropdown-toggle" ng-disabled="queryExecuting || !queryResult.getData()" aria-haspopup="true" dropdown-toggle
aria-expanded="false">
Download Dataset <span class="caret"></span>
</button>
<ul class="dropdown-menu" dropdown-menu>
<li>
<a query-result-link target="_self">
<span class="fa fa-file-o"></span> Download as CSV File
</a>
</li>
<li>
<a query-result-link file-type="xlsx" target="_self">
<span class="fa fa-file-excel-o"></span> Download as Excel File
</a>
</li>
</ul>
</div>
<a class="btn btn-warning btn-sm" ng-disabled="queryExecuting" data-toggle="modal" data-target="#archive-confirmation-modal"
ng-show="!query.is_archived && query.id != undefined && (isQueryOwner || currentUser.hasPermission('admin'))">
@@ -171,6 +186,7 @@
<button class="btn btn-default btn-sm" ng-show="query.id != undefined" ng-click="showApiKey()">
<i class="fa fa-key" title="Show API Key"></i>
</button>
</p>
<div class="modal fade" id="archive-confirmation-modal" tabindex="-1" role="dialog" aria-labelledby="archiveConfirmationModal" aria-hidden="true">
<div class="modal-dialog">
@@ -236,33 +252,21 @@
<grid-renderer query-result="queryResult" items-per-page="50"></grid-renderer>
<div class="row" ng-if="vis.type=='TABLE'" ng-repeat="vis in query.visualizations">
<div class="col-lg-8 embed-code">
<i class="fa fa-code" ng-click="show_code = show_code==true ? false : true;"></i>
<div ng-show="show_code">
<span class="text-muted">Embed code for this table: <small>(height should be adjusted)</small></span>
<code>&lt;iframe src="{{ base_url }}/embed/query/{{query.id}}/visualization/{{ vis.id }}?api_key={{query.api_key}}"<br/>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
width="720" height="1650"&gt;&lt;/iframe&gt;</code>
</div>
</div>
<embed-code visualization="vis" query="query"/>
</div>
</div>
<pivot-table-renderer ng-show="selectedTab == 'pivot'" query-result="queryResult"></pivot-table-renderer>
<div ng-show="selectedTab == 'pivot'">
<h3>
Pivot tables are now regular visualization, which you can create from the <a hash="add" hash-link>"New Visualization" tab</a> and <strong>save</strong>.
</h3>
</div>
<div ng-show="selectedTab == vis.id" ng-repeat="vis in query.visualizations">
<visualization-renderer visualization="vis" query-result="queryResult"></visualization-renderer>
<div class="row">
<div class="col-lg-8 embed-code">
<i class="fa fa-code" ng-click="show_code = show_code==true ? false : true;"></i>
<div ng-show="show_code">
<span class="text-muted">Embed code for this chart: <small>(height should be adjusted)</small></span>
<code>&lt;iframe src="{{ base_url }}/embed/query/{{query.id}}/visualization/{{ vis.id }}?api_key={{query.api_key}}"<br/>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
width="720" height="391"&gt;&lt;/iframe&gt;</code>
</div>
</div>
<embed-code visualization="vis" query="query"/>
</div>
<edit-visulatization-form visualization="vis" query="query" query-result="queryResult" ng-show="canEdit"></edit-visulatization-form>

View File

@@ -21,25 +21,20 @@
API Key:
<input type="text" value="{{user.api_key}}" size="44" readonly/>
</tab>
<tab heading="Settings" ng-if="showSettings || currentUser.hasPermission('admin')" active="tabs['settings']" select="setTab('settings')">
<tab heading="Settings" ng-if="showSettings" active="tabs['settings']" select="setTab('settings')">
<div class="col-md-6">
<form class="form" name="userSettingsForm" ng-submit="updateUser(userSettingsForm)" novalidate>
<div class="form-group required" ng-if="showSettings" show-errors>
<div class="form-group required" show-errors>
<label class="control-label">Name</label>
<input name="name" type="text" class="form-control" ng-model="user.name" required/>
<input-errors errors="userSettingsForm.name.$error"/>
</div>
<div class="form-group required" ng-if="showSettings" show-errors>
<div class="form-group required" show-errors>
<label class="control-label">Email</label>
<input name="email" type="email" class="form-control" ng-model="user.email" required/>
<input-errors errors="userSettingsForm.email.$error"/>
</div>
<div class="checkbox" ng-if="currentUser.hasPermission('admin')">
<label>
<input type="checkbox" ng-model="user.admin"> Admin
</label>
</div>
<div class="form-gruup">
<div class="form-group">
<button class="btn btn-primary">Save</button>
</div>
</form>

View File

@@ -6,6 +6,7 @@
<div class="panel-heading">
<h3 class="panel-title">
<p>
<img src="/images/redash_icon_small.png" style="height: 24px;"/>
<visualization-name visualization="visualization"/>
</p>
<div class="text-muted" ng-bind-html="query.description | markdown"></div>
@@ -18,10 +19,6 @@
<div class="panel-footer">
<span class="label label-default">Updated: <span am-time-ago="queryResult.getUpdatedAt()"></span></span>
<span class="pull-right">
<a class="btn btn-default btn-xs" ng-href="queries/{{query.id}}#{{widget.visualization.id}}" target="_blank"><span class="glyphicon glyphicon-link"></span></a>
</span>
<span class="pull-right">
<a class="btn btn-default btn-xs" ng-disabled="!queryResult.getData()" query-result-link target="_self">
<span class="glyphicon glyphicon-cloud-download"></span>

View File

@@ -1,16 +1,16 @@
<div class="well well-sm filters-container" ng-show="filters">
<div class="filter" ng-repeat="filter in filters">
<ui-select ng-model="filter.current" ng-if="!filter.multiple">
<ui-select-match placeholder="Select value for {{filter.friendlyName}}...">{{filter.friendlyName}}: {{$select.selected}}</ui-select-match>
<ui-select-match placeholder="Select value for {{filter.friendlyName}}...">{{filter.friendlyName}}: {{$select.selected | filterValue:filter}}</ui-select-match>
<ui-select-choices repeat="value in filter.values | filter: $select.search">
{{value}}
{{value | filterValue:filter }}
</ui-select-choices>
</ui-select>
<ui-select ng-model="filter.current" multiple ng-if="filter.multiple">
<ui-select-match placeholder="Select value for {{filter.friendlyName}}...">{{filter.friendlyName}}: {{$item}}</ui-select-match>
<ui-select-match placeholder="Select value for {{filter.friendlyName}}...">{{filter.friendlyName}}: {{$item | filterValue:filter}}</ui-select-match>
<ui-select-choices repeat="value in filter.values | filter: $select.search">
{{value}}
{{value | filterValue:filter }}
</ui-select-choices>
</ui-select>
</div>

View File

@@ -14,7 +14,7 @@
"moment": "~2.8.0",
"codemirror": "4.8.0",
"underscore": "1.5.1",
"pivottable": "~1.1.1",
"pivottable": "2.0.2",
"cornelius": "https://github.com/restorando/cornelius.git",
"gridster": "0.2.0",
"mousetrap": "~1.4.6",
@@ -34,7 +34,7 @@
"d3": "3.5.6",
"angular-ui-sortable": "~0.13.4",
"angular-plotly": "~0.1.2",
"plotly": "~0.0.2"
"plotly": "~1.4.1"
},
"devDependencies": {
"angular-mocks": "1.2.18",

rd_ui/gulpfile.js (new file, 120 lines)
View File

@@ -0,0 +1,120 @@
// Generated on 2016-02-09 using generator-angular 0.15.1
'use strict';
var gulp = require('gulp');
var $ = require('gulp-load-plugins')();
var lazypipe = require('lazypipe');
var rimraf = require('rimraf');
var wiredep = require('wiredep').stream;
var runSequence = require('run-sequence');
var yeoman = {
app: 'app',
dist: 'dist'
};
var paths = {
scripts: [yeoman.app + '/scripts/**/*.js'],
styles: [yeoman.app + '/styles/**/*.css'],
views: {
main: [yeoman.app + '/index.html', 'app/vendor_scripts.html', 'app/login.html', 'app/embed.html'],
files: [yeoman.app + '/views/**/*.html']
}
};
////////////////////////
// Reusable pipelines //
////////////////////////
var lintScripts = lazypipe()
.pipe($.jshint, '.jshintrc')
.pipe($.jshint.reporter, 'jshint-stylish');
var styles = lazypipe()
.pipe($.autoprefixer, 'last 1 version')
.pipe(gulp.dest, '.tmp/styles');
///////////
// Tasks //
///////////
gulp.task('styles', function () {
return gulp.src(paths.styles)
.pipe(styles());
});
gulp.task('lint:scripts', function () {
return gulp.src(paths.scripts)
.pipe(lintScripts());
});
gulp.task('clean:tmp', function (cb) {
rimraf('./.tmp', cb);
});
// inject bower components
gulp.task('bower', function () {
return gulp.src(paths.views.main)
.pipe(wiredep({
directory: yeoman.app + '/bower_components',
ignorePath: '..'
}))
.pipe(gulp.dest(yeoman.app + '/views'));
});
///////////
// Build //
///////////
gulp.task('clean:dist', function (cb) {
rimraf('./dist', cb);
});
gulp.task('client:build', ['html', 'styles'], function () {
var jsFilter = $.filter('**/*.js');
var cssFilter = $.filter('**/*.css');
return gulp.src(paths.views.main)
.pipe($.useref({searchPath: [yeoman.app, '.tmp']}))
.pipe(jsFilter)
.pipe($.ngAnnotate())
.pipe($.uglify())
.pipe(jsFilter.restore())
.pipe($.print())
.pipe(cssFilter)
.pipe($.minifyCss({cache: true}))
.pipe(cssFilter.restore())
.pipe(new $.revAll({dontRenameFile: ['.html'], dontUpdateReference: ['vendor_scripts.html']}).revision())
.pipe(gulp.dest(yeoman.dist));
});
gulp.task('html', function () {
return gulp.src(yeoman.app + '/views/**/*')
.pipe(gulp.dest(yeoman.dist + '/views'));
});
gulp.task('images', function () {
return gulp.src(yeoman.app + '/images/**/*')
.pipe($.cache($.imagemin({
optimizationLevel: 5,
progressive: true,
interlaced: true
})))
.pipe(gulp.dest(yeoman.dist + '/images'));
});
gulp.task('copy:extras', function () {
return gulp.src([yeoman.app + '/*/.*', 'app/google_login.png', 'favicon.ico', 'robots.txt'], { dot: true })
.pipe(gulp.dest(yeoman.dist));
});
gulp.task('copy:fonts', function () {
return gulp.src([yeoman.app + '/fonts/**/*', 'app/bower_components/bootstrap/dist/fonts/**/*', 'app/bower_components/font-awesome/fonts/*'])
.pipe(gulp.dest(yeoman.dist + '/fonts'));
});
gulp.task('build', ['clean:dist'], function () {
runSequence(['images', 'copy:extras', 'copy:fonts', 'client:build']);
});
gulp.task('default', ['build']);

View File

@@ -1,44 +1,41 @@
{
"name": "rdui",
"version": "0.0.0",
"dependencies": {},
"name": "redash",
"devDependencies": {
"grunt": "^0.4.1",
"grunt-autoprefixer": "^0.7.3",
"grunt-concurrent": "^0.5.0",
"grunt-contrib-clean": "^0.5.0",
"grunt-contrib-concat": "^0.4.0",
"grunt-contrib-connect": "^0.7.1",
"grunt-contrib-copy": "^0.5.0",
"grunt-contrib-cssmin": "^0.9.0",
"grunt-contrib-htmlmin": "^0.3.0",
"grunt-contrib-jshint": "^0.10.0",
"grunt-contrib-uglify": "^0.4.0",
"grunt-contrib-watch": "^0.6.1",
"grunt-filerev": "^0.2.1",
"grunt-google-cdn": "^0.4.0",
"grunt-newer": "^0.7.0",
"grunt-ngmin": "^0.0.3",
"grunt-svgmin": "^0.4.0",
"grunt-usemin": "^2.1.1",
"grunt-wiredep": "^1.7.0",
"jshint-stylish": "^0.2.0",
"load-grunt-tasks": "^0.4.0",
"time-grunt": "^0.3.1",
"karma-jasmine": "~0.1.5",
"grunt-karma": "~0.8.3",
"karma-phantomjs-launcher": "~0.1.4",
"karma": "~0.12.19",
"karma-ng-html2js-preprocessor": "~0.1.0",
"gulp": "^3.9.0",
"gulp-connect": "^2.2.0",
"gulp-autoprefixer": "2.3.1",
"gulp-cache": "^0.2.10",
"rimraf": "^2.4.0",
"gulp-filter": "^2.0.2",
"gulp-imagemin": "^2.3.0",
"gulp-jshint": "^1.11.1",
"gulp-karma": "0.0.4",
"gulp-load-plugins": "^0.10.0",
"gulp-plumber": "^1.0.1",
"gulp-minify-css": "^1.2.0",
"gulp-uglify": "^1.2.0",
"gulp-useref": "^3.0.0",
"gulp-util": "^3.0.6",
"gulp-watch": "^4.2.4",
"run-sequence": "^1.1.1",
"wiredep": "^2.2.2",
"lazypipe": "^0.2.4",
"gulp-ng-annotate": "^1.0.0",
"open": "0.0.5",
"jshint-stylish": "^1.0.0",
"gulp-print": "^2.0.1",
"gulp-rev-all": "^0.8.22",
"bower": "~1.7.1",
"grunt-cli": "~0.1.13"
"gulp-cli": "~1.2.0"
},
"engines": {
"node": ">=0.10.0"
},
"scripts": {
"test": "grunt test",
"build": "grunt build",
"test": "echo 'No tests.'",
"build": "gulp build",
"bower": "bower"
},
"dependencies": {
}
}

View File

@@ -1,7 +1,7 @@
import json
from flask_admin import Admin
from flask_admin.base import MenuLink
from flask_admin.contrib.peewee import ModelView
from flask.ext.admin import Admin
from flask.ext.admin.base import MenuLink
from flask_admin.contrib.peewee.form import CustomModelConverter
from flask_admin.form.widgets import DateTimePickerWidget
from playhouse.postgres_ext import ArrayField, DateTimeTZField
@@ -92,4 +92,4 @@ def init_admin(app):
for m in (models.Visualization, models.Widget, models.Event, models.Organization):
admin.add_view(BaseModelView(m))
admin.add_link(logout_link)
admin.add_link(logout_link)

View File

@@ -1,11 +1,10 @@
from flask_login import LoginManager, user_logged_in
import hashlib
import hmac
import time
import logging
from flask import redirect, request, jsonify
from flask.ext.login import LoginManager
from flask.ext.login import user_logged_in
from redash import models, settings
from redash.authentication import google_oauth, saml_auth
@@ -136,5 +135,3 @@ def setup_authentication(app):
else:
logger.warning("Unknown authentication type ({}). Using default (HMAC).".format(settings.AUTH_TYPE))
login_manager.request_loader(hmac_load_user_from_request)

View File

@@ -1,7 +1,7 @@
import logging
from flask.ext.login import login_user
import requests
from flask import redirect, url_for, Blueprint, flash, request, session
from flask_login import login_user
from flask_oauthlib.client import OAuth
from redash import models, settings
from redash.authentication.org_resolving import current_org

View File

@@ -1,6 +1,8 @@
"""
This module implements different strategies to resolve the current Organization we are using. By default we use the simple
single_org strategy, which assumes you have a single Organization in your installation.
This module implements different strategies to resolve the current Organization we are using.
By default we use the simple single_org strategy, which assumes you have a
single Organization in your installation.
"""
import logging
@@ -18,5 +20,3 @@ def _get_current_org():
# TODO: move to authentication
current_org = LocalProxy(_get_current_org)

View File

@@ -14,11 +14,11 @@ blueprint = Blueprint('saml_auth', __name__)
def get_saml_client():
'''
Return saml configuation.
The configuration is a hash for use by saml2.config.Config
'''
"""
Return SAML configuration.
The configuration is a hash for use by saml2.config.Config
"""
if settings.SAML_CALLBACK_SERVER_NAME:
acs_url = settings.SAML_CALLBACK_SERVER_NAME + url_for("saml_auth.idp_initiated")
else:

View File

@@ -1,18 +1,19 @@
import json
import click
from flask.ext.script import Manager
from flask_script import Manager
from redash import models
from redash.query_runner import query_runners, validate_configuration
from redash.query_runner import query_runners, get_configuration_schema_for_type
from redash.utils.configuration import ConfigurationContainer
manager = Manager(help="Data sources management commands.")
@manager.command
def list():
"""List currently configured data sources"""
"""List currently configured data sources."""
for i, ds in enumerate(models.DataSource.select()):
if i > 0:
print "-"*20
print "-" * 20
print "Id: {}\nName: {}\nType: {}\nOptions: {}".format(ds.id, ds.name, ds.type, ds.options)
@@ -23,35 +24,29 @@ def validate_data_source_type(type):
exit()
def validate_data_source_options(type, options):
if not validate_configuration(type, options):
print "Error: invalid configuration."
exit()
@manager.command
def new(name=None, type=None, options=None):
"""Create new data source"""
"""Create new data source."""
if name is None:
name = click.prompt("Name")
if type is None:
print "Select type:"
for i, query_runner_name in enumerate(query_runners.keys()):
print "{}. {}".format(i+1, query_runner_name)
print "{}. {}".format(i + 1, query_runner_name)
idx = 0
while idx < 1 or idx > len(query_runners.keys()):
idx = click.prompt("[{}-{}]".format(1, len(query_runners.keys())), type=int)
type = query_runners.keys()[idx-1]
type = query_runners.keys()[idx - 1]
else:
validate_data_source_type(type)
if options is None:
query_runner = query_runners[type]
schema = query_runner.configuration_schema()
query_runner = query_runners[type]
schema = query_runner.configuration_schema()
if options is None:
types = {
'string': unicode,
'number': int,
@@ -76,11 +71,15 @@ def new(name=None, type=None, options=None):
if value != default_value:
options_obj[k] = value
options = json.dumps(options_obj)
options = ConfigurationContainer(options_obj, schema)
else:
options = ConfigurationContainer(json.loads(options), schema)
validate_data_source_options(type, options)
if not options.is_valid():
print "Error: invalid configuration."
exit()
print "Creating {} data source ({}) with options:\n{}".format(type, name, options)
print "Creating {} data source ({}) with options:\n{}".format(type, name, options.to_json())
data_source = models.DataSource.create(name=name,
type=type,
@@ -91,11 +90,11 @@ def new(name=None, type=None, options=None):
@manager.command
def delete(name):
"""Deletes data source by name"""
"""Delete data source by name."""
try:
data_source = models.DataSource.get(models.DataSource.name==name)
print "Deleting data source: {} (id={})".format(name, data_source.id)
data_source.delete_instance()
data_source.delete_instance(recursive=True)
except models.DataSource.DoesNotExist:
print "Couldn't find data source named: {}".format(name)
@@ -112,7 +111,7 @@ def update_attr(obj, attr, new_value):
@manager.option('--options', dest='options', default=None, help="updated options for the data source")
@manager.option('--type', dest='type', default=None, help="new type for the data source")
def edit(name, new_name=None, options=None, type=None):
"""Edit data source settings (name, options, type)"""
"""Edit data source settings (name, options, type)."""
try:
if type is not None:
validate_data_source_type(type)
@@ -120,7 +119,10 @@ def edit(name, new_name=None, options=None, type=None):
data_source = models.DataSource.get(models.DataSource.name==name)
if options is not None:
validate_data_source_options(data_source.type, options)
schema = get_configuration_schema_for_type(data_source.type)
options = json.loads(options)
data_source.options.set_schema(schema)
data_source.options.update(options)
update_attr(data_source, "name", new_name)
update_attr(data_source, "type", type)
@@ -129,4 +131,3 @@ def edit(name, new_name=None, options=None, type=None):
except models.DataSource.DoesNotExist:
print "Couldn't find data source named: {}".format(name)

View File

@@ -1,10 +1,10 @@
from flask.ext.script import Manager
from flask_script import Manager
manager = Manager(help="Manages the database (create/drop tables).")
manager = Manager(help="Manage the database (create/drop tables).")
@manager.command
def create_tables():
"""Creates the database tables."""
"""Create the database tables."""
from redash.models import create_db, init_db
create_db(True, False)
@@ -16,4 +16,3 @@ def drop_tables():
from redash.models import create_db
create_db(False, True)

View File

@@ -1,4 +1,4 @@
from flask.ext.script import Manager
from flask_script import Manager
from redash import models
manager = Manager(help="Organization management commands.")

View File

@@ -1,4 +1,4 @@
from flask.ext.script import Manager, prompt_pass
from flask_script import Manager, prompt_pass
from redash import models
manager = Manager(help="Users management commands. This commands assume single organization operation.")
@@ -7,12 +7,17 @@ manager = Manager(help="Users management commands. This commands assume single o
@manager.option('email', help="email address of the user to grant admin to")
def grant_admin(email):
try:
user = models.User.get_by_email_and_org(email, models.Organization.get_by_slug('default'))
org = models.Organization.get_by_slug('default')
admin_group = org.admin_group
user = models.User.get_by_email_and_org(email, org)
user.groups.append('admin')
user.save()
if admin_group.id in user.groups:
print "User is already an admin."
else:
user.groups.append(org.admin_group.id)
user.save()
print "User updated."
print "User updated."
except models.User.DoesNotExist:
print "User [%s] not found." % email
@@ -76,6 +81,6 @@ def list():
"""List all users"""
for i, user in enumerate(models.User.select()):
if i > 0:
print "-"*20
print "-" * 20
print "Id: {}\nName: {}\nEmail: {}".format(user.id, user.name.encode('utf-8'), user.email)

View File

@@ -1,4 +1,4 @@
from flask.ext.restful import Resource, abort
from flask_restful import Resource, abort
from flask_login import current_user, login_required
from peewee import DoesNotExist

View File

@@ -65,4 +65,3 @@ class DashboardAPI(BaseResource):
api.add_org_resource(DashboardListAPI, '/api/dashboards', endpoint='dashboards')
api.add_org_resource(DashboardRecentAPI, '/api/dashboards/recent', endpoint='recent_dashboards')
api.add_org_resource(DashboardAPI, '/api/dashboards/<dashboard_slug>', endpoint='dashboard')

View File

@@ -1,13 +1,12 @@
import json
from flask import make_response, request
from flask.ext.restful import abort
from flask_restful import abort
from funcy import project
from redash import models
from redash.wsgi import api
from redash.utils.configuration import ConfigurationContainer, ValidationError
from redash.permissions import require_admin
from redash.query_runner import query_runners, validate_configuration
from redash.query_runner import query_runners, get_configuration_schema_for_type
from redash.handlers.base import BaseResource, get_object_or_404
@@ -30,14 +29,18 @@ class DataSourceAPI(BaseResource):
data_source = models.DataSource.get_by_id_and_org(data_source_id, self.current_org)
req = request.get_json(True)
data_source.replace_secret_placeholders(req['options'])
if not validate_configuration(req['type'], req['options']):
schema = get_configuration_schema_for_type(req['type'])
if schema is None:
abort(400)
data_source.name = req['name']
data_source.options = json.dumps(req['options'])
try:
data_source.options.set_schema(schema)
data_source.options.update(req['options'])
except ValidationError:
abort(400)
data_source.type = req['type']
data_source.name = req['name']
data_source.save()
return data_source.to_dict(all=True)
@@ -76,12 +79,18 @@ class DataSourceListAPI(BaseResource):
if f not in req:
abort(400)
if not validate_configuration(req['type'], req['options']):
schema = get_configuration_schema_for_type(req['type'])
if schema is None:
abort(400)
config = ConfigurationContainer(req['options'], schema)
if not config.is_valid():
abort(400)
datasource = models.DataSource.create_with_group(org=self.current_org,
name=req['name'],
type=req['type'], options=json.dumps(req['options']))
type=req['type'],
options=config)
return datasource.to_dict(all=True)

View File

@@ -1,7 +1,7 @@
from flask import render_template
from flask.ext.restful import abort
from funcy import project
from flask import render_template, url_for
from flask_login import login_required
from flask_restful import abort
from redash import models, settings
from redash.wsgi import app
@@ -31,8 +31,18 @@ def embed(query_id, visualization_id, org_slug=None):
client_config = {}
client_config.update(settings.COMMON_CLIENT_CONFIG)
qr = project(qr, ('data', 'id', 'retrieved_at'))
vis = project(vis, ('description', 'name', 'id', 'options', 'query', 'type', 'updated_at'))
vis['query'] = project(vis, ('created_at', 'description', 'name', 'id', 'latest_query_data_id', 'name', 'updated_at'))
if settings.MULTI_ORG:
base_href = url_for('index', _external=True, org_slug=current_org.slug)
else:
base_href = url_for('index', _external=True)
return render_template("embed.html",
name=settings.NAME,
base_href=base_href,
client_config=json_dumps(client_config),
visualization=json_dumps(vis),
query_result=json_dumps(qr),

View File

@@ -1,6 +1,6 @@
import time
from flask import request
from flask.ext.restful import abort
from flask_restful import abort
from redash import models
from redash.wsgi import api
from redash.permissions import require_admin, require_permission

View File

@@ -1,15 +1,17 @@
from flask import request
from flask.ext.restful import abort
from flask_restful import abort
from flask_login import login_required
import sqlparse
from funcy import distinct, take
from itertools import chain
from redash.handlers.query_results import run_query
from redash import models
from redash.wsgi import app, api
from redash.permissions import require_permission, require_access, require_admin_or_owner, not_view_only, view_only
from redash.handlers.base import BaseResource, get_object_or_404
from redash.utils import collect_parameters_from_request
@app.route('/api/queries/format', methods=['POST'])
@@ -105,7 +107,18 @@ class QueryAPI(BaseResource):
query.archive()
class QueryRefreshResource(BaseResource):
def post(self, query_id):
query = get_object_or_404(models.Query.get_by_id_and_org, query_id, self.current_org)
require_access(query.groups, self.current_user, not_view_only)
parameter_values = collect_parameters_from_request(request.args)
return run_query(query.data_source, parameter_values, query.query, query.id)
api.add_org_resource(QuerySearchAPI, '/api/queries/search', endpoint='queries_search')
api.add_org_resource(QueryRecentAPI, '/api/queries/recent', endpoint='recent_queries')
api.add_org_resource(QueryListAPI, '/api/queries', endpoint='queries')
api.add_org_resource(QueryRefreshResource, '/api/queries/<query_id>/refresh', endpoint='query_refresh')
api.add_org_resource(QueryAPI, '/api/queries/<query_id>', endpoint='query')
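
QueryRefreshResource above adds POST /api/queries/<query_id>/refresh and pulls parameter values out of the query string with collect_parameters_from_request before handing off to run_query. A hypothetical client call (the host, the api_key query argument and the p_ parameter prefix are assumptions for illustration, not something this diff shows):

    import requests

    # Hypothetical values throughout.
    response = requests.post(
        'https://redash.example.com/api/queries/42/refresh',
        params={'p_country': 'IL', 'api_key': 'secret'},
    )
    job = response.json().get('job')  # run_query returns a job descriptor while the query executes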

View File

@@ -3,20 +3,52 @@ import json
import cStringIO
import time
import pystache
from flask import make_response, request
from flask.ext.restful import abort
from flask_login import current_user
from flask_restful import abort
import xlsxwriter
from redash import models, settings, utils
from redash.wsgi import api
from redash.tasks import QueryTask, record_event
from redash.permissions import require_permission, not_view_only, has_access
from redash.handlers.base import BaseResource, get_object_or_404
from redash.utils import collect_query_parameters, collect_parameters_from_request
def run_query(data_source, parameter_values, query_text, query_id, max_age=0):
query_parameters = set(collect_query_parameters(query_text))
missing_params = set(query_parameters) - set(parameter_values.keys())
if missing_params:
return {'job': {'status': 4,
'error': 'Missing parameter value for: {}'.format(", ".join(missing_params))}}, 400
if query_parameters:
query_text = pystache.render(query_text, parameter_values)
if max_age == 0:
query_result = None
else:
query_result = models.QueryResult.get_latest(data_source, query_text, max_age)
if query_result:
return {'query_result': query_result.to_dict()}
else:
job = QueryTask.add_task(query_text, data_source,
metadata={"Username": current_user.name, "Query ID": query_id})
return {'job': job.to_dict()}
class QueryResultListAPI(BaseResource):
@require_permission('execute_query')
def post(self):
params = request.get_json(force=True)
parameter_values = collect_parameters_from_request(request.args)
query = params['query']
max_age = int(params.get('max_age', -1))
query_id = params.get('query_id', 'adhoc')
data_source = models.DataSource.get_by_id_and_org(params.get('data_source_id'), self.current_org)
if not has_access(data_source.groups, self.current_user, not_view_only):
@@ -27,23 +59,10 @@ class QueryResultListAPI(BaseResource):
'timestamp': int(time.time()),
'object_id': data_source.id,
'object_type': 'data_source',
'query': params['query']
'query': query
})
max_age = int(params.get('max_age', -1))
if max_age == 0:
query_result = None
else:
query_result = models.QueryResult.get_latest(data_source, params['query'], max_age)
if query_result:
return {'query_result': query_result.to_dict()}
else:
query_id = params.get('query_id', 'adhoc')
job = QueryTask.add_task(params['query'], data_source,
metadata={"Username": self.current_user.name, "Query ID": query_id})
return {'job': job.to_dict()}
return run_query(data_source, parameter_values, query, query_id, max_age)
ONE_YEAR = 60 * 60 * 24 * 365.25
@@ -74,6 +93,10 @@ class QueryResultAPI(BaseResource):
@require_permission('view_query')
def get(self, query_id=None, query_result_id=None, filetype='json'):
# TODO:
# This method handles two cases: retrieving result by id & retrieving result by query id.
# They need to be split, as they have different logic (for example, retrieving by query id
# should check for query parameters and shouldn't cache the result).
should_cache = query_result_id is not None
if query_result_id is None and query_id is not None:
query = get_object_or_404(models.Query.get_by_id_and_org, query_id, self.current_org)
@@ -105,6 +128,8 @@ class QueryResultAPI(BaseResource):
if filetype == 'json':
response = self.make_json_response(query_result)
elif filetype == 'xlsx':
response = self.make_excel_response(query_result)
else:
response = self.make_csv_response(query_result)
@@ -137,6 +162,28 @@ class QueryResultAPI(BaseResource):
headers = {'Content-Type': "text/csv; charset=UTF-8"}
return make_response(s.getvalue(), 200, headers)
@staticmethod
def make_excel_response(query_result):
s = cStringIO.StringIO()
query_data = json.loads(query_result.data)
book = xlsxwriter.Workbook(s)
sheet = book.add_worksheet("result")
column_names = []
for (c, col) in enumerate(query_data['columns']):
sheet.write(0, c, col['name'])
column_names.append(col['name'])
for (r, row) in enumerate(query_data['rows']):
for (c, name) in enumerate(column_names):
sheet.write(r + 1, c, row[name])
book.close()
headers = {'Content-Type': "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"}
return make_response(s.getvalue(), 200, headers)
api.add_org_resource(QueryResultListAPI, '/api/query_results', endpoint='query_results')
api.add_org_resource(QueryResultAPI,
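
make_excel_response above renders the query result into an in-memory workbook. Stripped of the Flask plumbing, the xlsxwriter usage it relies on looks roughly like this (Python 2, matching the handler; the sample data is made up):

    import cStringIO
    import xlsxwriter

    query_data = {
        'columns': [{'name': 'country'}, {'name': 'count'}],
        'rows': [{'country': 'IL', 'count': 12}, {'country': 'US', 'count': 7}],
    }

    buf = cStringIO.StringIO()
    book = xlsxwriter.Workbook(buf)      # the workbook is written into the buffer
    sheet = book.add_worksheet('result')

    names = [col['name'] for col in query_data['columns']]
    for c, name in enumerate(names):
        sheet.write(0, c, name)          # header row
    for r, row in enumerate(query_data['rows']):
        for c, name in enumerate(names):
            sheet.write(r + 1, c, row[name])
    book.close()

    xlsx_bytes = buf.getvalue()          # what the handler wraps in make_response()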

View File

@@ -89,5 +89,3 @@ rules = ['/admin/<anything>/<whatever>',
'/personal']
register_static_routes(rules)

View File

@@ -1,6 +1,6 @@
import time
from flask import request
from flask.ext.restful import abort
from flask_restful import abort
from funcy import project
from peewee import IntegrityError
@@ -48,7 +48,7 @@ class UserResource(BaseResource):
def get(self, user_id):
require_permission_or_owner('list_users', user_id)
user = get_object_or_404(models.User.get_by_id_and_org, user_id, self.current_org)
return user.to_dict(with_api_key=is_admin_or_owner(user_id))
def post(self, user_id):
@@ -95,5 +95,3 @@ class UserResource(BaseResource):
api.add_org_resource(UserListResource, '/api/users', endpoint='users')
api.add_org_resource(UserResource, '/api/users/<user_id>', endpoint='user')

View File

@@ -1 +0,0 @@

View File

@@ -50,7 +50,7 @@ def patch_query_execute():
result = real_execute(self, *args, **kwargs)
return result
finally:
duration = (time.time() - start_time)*1000
duration = (time.time() - start_time) * 1000
statsd_client.timing('db.{}.{}'.format(name, action), duration)
metrics_logger.debug("model=%s query=%s duration=%.2f", name, action, duration)
@@ -100,5 +100,3 @@ class MeteredModel(Model):
result = getattr(super(MeteredModel, cls), action)(*args, **kwargs)
setattr(result, 'model_action', action)
return result

View File

@@ -14,6 +14,9 @@ def record_requets_start_time():
def calculate_metrics(response):
if 'start_time' not in g:
return response
request_duration = (time.time() - g.start_time) * 1000
metrics_logger.info("method=%s path=%s endpoint=%s status=%d content_type=%s content_length=%d duration=%.2f query_count=%d query_duration=%.2f",
@@ -43,4 +46,3 @@ def provision_app(app):
app.before_request(record_requets_start_time)
app.after_request(calculate_metrics)
app.teardown_request(calculate_metrics_on_exception)

View File

@@ -1,4 +1,5 @@
import json
from flask_login import UserMixin, AnonymousUserMixin
import hashlib
import logging
import os
@@ -10,14 +11,16 @@ from funcy import project
import peewee
from passlib.apps import custom_app_context as pwd_context
from playhouse.postgres_ext import ArrayField, DateTimeTZField, PostgresqlExtDatabase
from flask.ext.login import UserMixin, AnonymousUserMixin
from playhouse.postgres_ext import ArrayField, DateTimeTZField
from permissions import has_access, view_only
from redash import utils, settings, redis_connection
from redash.query_runner import get_query_runner
from redash.query_runner import get_query_runner, get_configuration_schema_for_type
from redash.metrics.database import MeteredPostgresqlExtDatabase, MeteredModel
from utils import generate_token
from redash.utils import generate_token
from redash.utils.configuration import ConfigurationContainer
class Database(object):
def __init__(self):
@@ -313,14 +316,20 @@ class User(ModelTimestampsMixin, BaseModel, BelongsToOrgMixin, UserMixin, Permis
return self.password_hash and pwd_context.verify(password, self.password_hash)
class DataSource(BelongsToOrgMixin, BaseModel):
SECRET_PLACEHOLDER = '--------'
class ConfigurationField(peewee.TextField):
def db_value(self, value):
return value.to_json()
def python_value(self, value):
return ConfigurationContainer.from_json(value)
class DataSource(BelongsToOrgMixin, BaseModel):
id = peewee.PrimaryKeyField()
org = peewee.ForeignKeyField(Organization, related_name="data_sources")
name = peewee.CharField()
type = peewee.CharField()
options = peewee.TextField()
options = ConfigurationField()
queue_name = peewee.CharField(default="queries")
scheduled_queue_name = peewee.CharField(default="scheduled_queries")
created_at = DateTimeTZField(default=datetime.datetime.now)
@@ -341,7 +350,9 @@ class DataSource(BelongsToOrgMixin, BaseModel):
}
if all:
d['options'] = self.configuration
schema = get_configuration_schema_for_type(self.type)
self.options.set_schema(schema)
d['options'] = self.options.to_dict(mask_secrets=True)
d['queue_name'] = self.queue_name
d['scheduled_queue_name'] = self.scheduled_queue_name
d['groups'] = self.groups
@@ -360,23 +371,6 @@ class DataSource(BelongsToOrgMixin, BaseModel):
DataSourceGroup.create(data_source=data_source, group=data_source.org.default_group)
return data_source
@property
def configuration(self):
configuration = json.loads(self.options)
schema = self.query_runner.configuration_schema()
for prop in schema.get('secret', []):
if prop in configuration and configuration[prop]:
configuration[prop] = self.SECRET_PLACEHOLDER
return configuration
def replace_secret_placeholders(self, configuration):
current_configuration = json.loads(self.options)
schema = self.query_runner.configuration_schema()
for prop in schema.get('secret', []):
if prop in configuration and configuration[prop] == self.SECRET_PLACEHOLDER:
configuration[prop] = current_configuration[prop]
def get_schema(self, refresh=False):
key = "data_source:schema:{}".format(self.id)
@@ -985,7 +979,7 @@ class Widget(ModelTimestampsMixin, BaseModel):
d['visualization'] = self.visualization.to_dict()
return d
def __unicode__(self):
return u"%s" % self.id

View File

@@ -1,4 +1,4 @@
from redash import redis_connection, models, __version__
from redash import redis_connection, models, __version__
def get_status():

View File

@@ -1,6 +1,6 @@
from flask_login import current_user
from flask_restful import abort
import functools
from flask.ext.login import current_user
from flask.ext.restful import abort
from funcy import any, flatten
view_only = True

View File

@@ -1,14 +1,11 @@
import logging
import json
import jsonschema
from jsonschema import ValidationError
from redash import settings
logger = logging.getLogger(__name__)
__all__ = [
'ValidationError',
'BaseQueryRunner',
'InterruptException',
'BaseSQLQueryRunner',
@@ -41,12 +38,13 @@ SUPPORTED_COLUMN_TYPES = set([
TYPE_DATE
])
class InterruptException(Exception):
pass
class BaseQueryRunner(object):
def __init__(self, configuration):
jsonschema.validate(configuration, self.configuration_schema())
self.syntax = 'sql'
self.configuration = configuration
@@ -142,29 +140,20 @@ def register(query_runner_class):
logger.warning("%s query runner enabled but not supported, not registering. Either disable or install missing dependencies.", query_runner_class.name())
def get_query_runner(query_runner_type, configuration_json):
def get_query_runner(query_runner_type, configuration):
query_runner_class = query_runners.get(query_runner_type, None)
if query_runner_class is None:
return None
return query_runner_class(json.loads(configuration_json))
return query_runner_class(configuration)
def validate_configuration(query_runner_type, configuration_json):
def get_configuration_schema_for_type(query_runner_type):
query_runner_class = query_runners.get(query_runner_type, None)
if query_runner_class is None:
return False
return None
try:
if isinstance(configuration_json, basestring):
configuration = json.loads(configuration_json)
else:
configuration = configuration_json
jsonschema.validate(configuration, query_runner_class.configuration_schema())
except (ValidationError, ValueError):
return False
return True
return query_runner_class.configuration_schema()
def import_query_runners(query_runner_imports):
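With these signatures, callers pass an already-parsed configuration object rather than a JSON string, and fetch the schema separately when they need it. Roughly as below; the data source type and configuration keys are placeholders for illustration:

from redash.query_runner import get_query_runner, get_configuration_schema_for_type

schema = get_configuration_schema_for_type("pg")          # None if the type isn't registered
runner = get_query_runner("pg", {"dbname": "postgres"})   # a dict/ConfigurationContainer, not a JSON string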

View File

@@ -105,8 +105,8 @@ class BigQuery(BaseQueryRunner):
'secret': ['jsonKeyFile']
}
def __init__(self, configuration_json):
super(BigQuery, self).__init__(configuration_json)
def __init__(self, configuration):
super(BigQuery, self).__init__(configuration)
def _get_bigquery_service(self):
scope = [

View File

@@ -71,8 +71,8 @@ class BaseElasticSearch(BaseQueryRunner):
def enabled(cls):
return False
def __init__(self, configuration_json):
super(BaseElasticSearch, self).__init__(configuration_json)
def __init__(self, configuration):
super(BaseElasticSearch, self).__init__(configuration)
self.syntax = "json"
@@ -164,8 +164,8 @@ class BaseElasticSearch(BaseQueryRunner):
class Kibana(BaseElasticSearch):
def __init__(self, configuration_json):
super(Kibana, self).__init__(configuration_json)
def __init__(self, configuration):
super(Kibana, self).__init__(configuration)
@classmethod
def enabled(cls):
@@ -201,6 +201,7 @@ class Kibana(BaseElasticSearch):
index_name = query_params["index"]
query_data = query_params["query"]
size = int(query_params.get("size", 500))
limit = int(query_params.get("limit", 500))
result_fields = query_params.get("fields", None)
sort = query_params.get("sort", None)
@@ -215,9 +216,6 @@ class Kibana(BaseElasticSearch):
logger.debug(json.dumps(mappings, indent=4))
if size:
url += "&size={0}".format(size)
if sort:
url += "&sort={0}".format(urllib.quote_plus(sort))
@@ -231,9 +229,10 @@ class Kibana(BaseElasticSearch):
if isinstance(query_data, str) or isinstance(query_data, unicode):
_from = 0
while True:
total = self._execute_simple_query(url, self.auth, _from, mappings, result_fields, result_columns, result_rows)
query_size = size if limit >= (_from + size) else (limit - _from)
total = self._execute_simple_query(url + "&size={0}".format(query_size), self.auth, _from, mappings, result_fields, result_columns, result_rows)
_from += size
if _from >= total:
if _from >= limit:
break
else:
# TODO: Handle complete ElasticSearch queries (JSON based sent over HTTP POST)
@@ -254,8 +253,8 @@ class Kibana(BaseElasticSearch):
class ElasticSearch(BaseElasticSearch):
def __init__(self, configuration_json):
super(ElasticSearch, self).__init__(configuration_json)
def __init__(self, configuration):
super(ElasticSearch, self).__init__(configuration)
@classmethod
def enabled(cls):
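The paging change in the Kibana runner can be read as: fetch size rows per request, but never ask for more than limit rows in total. The per-request arithmetic in isolation (a standalone sketch, not code from the runner):

def page_sizes(size, limit):
    # Yield the size to request for each page until `limit` rows have been covered.
    _from = 0
    while _from < limit:
        yield size if limit >= (_from + size) else (limit - _from)
        _from += size

assert list(page_sizes(500, 1200)) == [500, 500, 200]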

View File

@@ -22,6 +22,8 @@ def _load_key(filename):
def _guess_type(value):
if value == '':
return TYPE_STRING
try:
val = int(value)
return TYPE_INTEGER
@@ -45,6 +47,10 @@ def _guess_type(value):
def _value_eval_list(value):
value_list = []
for member in value:
if member == '' or member == None:
val = None
value_list.append(val)
continue
try:
val = int(member)
value_list.append(val)
@@ -100,8 +106,8 @@ class GoogleSpreadsheet(BaseQueryRunner):
'secret': ['jsonKeyFile']
}
def __init__(self, configuration_json):
super(GoogleSpreadsheet, self).__init__(configuration_json)
def __init__(self, configuration):
super(GoogleSpreadsheet, self).__init__(configuration)
def _get_spreadsheet_service(self):
scope = [
@@ -130,9 +136,9 @@ class GoogleSpreadsheet(BaseQueryRunner):
columns.append({
'name': column_name,
'friendly_name': column_name,
'type': _guess_type(all_data[self.HEADER_INDEX+1][j])
'type': _guess_type(all_data[self.HEADER_INDEX + 1][j])
})
rows = [dict(zip(column_names, _value_eval_list(row))) for row in all_data[self.HEADER_INDEX+1:]]
rows = [dict(zip(column_names, _value_eval_list(row))) for row in all_data[self.HEADER_INDEX + 1:]]
data = {'columns': columns, 'rows': rows}
json_data = json.dumps(data, cls=JSONEncoder)
error = None
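_guess_type and _value_eval_list share the same idea: treat empty values specially, then attempt progressively looser conversions, falling back to string. A condensed sketch of that idea, simplified to the empty-string and numeric checks visible in the hunk above:

from redash.query_runner import TYPE_STRING, TYPE_INTEGER, TYPE_FLOAT

def guess_scalar_type(value):
    # Empty cells default to string; otherwise try int, then float, then give up.
    if value == '':
        return TYPE_STRING
    for cast, guessed in ((int, TYPE_INTEGER), (float, TYPE_FLOAT)):
        try:
            cast(value)
            return guessed
        except ValueError:
            pass
    return TYPE_STRING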

View File

@@ -52,8 +52,8 @@ class Graphite(BaseQueryRunner):
def annotate_query(cls):
return False
def __init__(self, configuration_json):
super(Graphite, self).__init__(configuration_json)
def __init__(self, configuration):
super(Graphite, self).__init__(configuration)
if "username" in self.configuration and self.configuration["username"]:
self.auth = (self.configuration["username"], self.configuration["password"])

View File

@@ -64,8 +64,8 @@ class Hive(BaseSQLQueryRunner):
def type(cls):
return "hive"
def __init__(self, configuration_json):
super(Hive, self).__init__(configuration_json)
def __init__(self, configuration):
super(Hive, self).__init__(configuration)
def _get_tables(self, schema_dict):
try:
@@ -91,7 +91,7 @@ class Hive(BaseSQLQueryRunner):
connection = None
try:
connection = hive.connect(**self.configuration)
connection = hive.connect(**self.configuration.to_dict())
cursor = connection.cursor()

View File

@@ -74,8 +74,8 @@ class Impala(BaseSQLQueryRunner):
def type(cls):
return "impala"
def __init__(self, configuration_json):
super(Impala, self).__init__(configuration_json)
def __init__(self, configuration):
super(Impala, self).__init__(configuration)
def _get_tables(self, schema_dict):
try:

View File

@@ -13,6 +13,7 @@ try:
except ImportError:
enabled = False
def _transform_result(results):
result_columns = []
result_rows = []
@@ -30,6 +31,7 @@ def _transform_result(results):
"rows" : result_rows
}, cls=JSONEncoder)
class InfluxDB(BaseQueryRunner):
@classmethod
def configuration_schema(cls):
@@ -55,8 +57,8 @@ class InfluxDB(BaseQueryRunner):
def type(cls):
return "influxdb"
def __init__(self, configuration_json):
super(InfluxDB, self).__init__(configuration_json)
def __init__(self, configuration):
super(InfluxDB, self).__init__(configuration)
def run_query(self, query):
client = InfluxDBClusterClient.from_DSN(self.configuration['url'])

View File

@@ -12,6 +12,7 @@ logger = logging.getLogger(__name__)
try:
import pymongo
from bson.objectid import ObjectId
from bson.timestamp import Timestamp
from bson.son import SON
enabled = True
@@ -34,6 +35,8 @@ class MongoDBJSONEncoder(JSONEncoder):
def default(self, o):
if isinstance(o, ObjectId):
return str(o)
elif isinstance(o, Timestamp):
return super(MongoDBJSONEncoder, self).default(o.as_datetime())
return super(MongoDBJSONEncoder, self).default(o)
@@ -86,8 +89,8 @@ class MongoDB(BaseQueryRunner):
def annotate_query(cls):
return False
def __init__(self, configuration_json):
super(MongoDB, self).__init__(configuration_json)
def __init__(self, configuration):
super(MongoDB, self).__init__(configuration)
self.syntax = 'json'
@@ -116,13 +119,13 @@ class MongoDB(BaseQueryRunner):
columns.append(property)
def _get_collection_fields(self, db, collection_name):
# Since MongoDB is a document based database and each document doesn't have
# Since MongoDB is a document based database and each document doesn't have
# to have the same fields as another document in the collection it's a bit hard to
# show these attributes as fields in the schema.
#
# For now, the logic is to take the first and last documents (last is determined
# by the Natural Order (http://www.mongodb.org/display/DOCS/Sorting+and+Natural+Order)
# as we don't know the correct order. In most single server installations it would be
# as we don't know the correct order. In most single server installations it would be
# fine. In a replica set when reading from a non-master it might not return the really last
# document written.
first_document = None
@@ -210,6 +213,9 @@ class MongoDB(BaseQueryRunner):
if "limit" in query_data:
cursor = cursor.limit(query_data["limit"])
if "count" in query_data:
cursor = cursor.count()
elif aggregate:
r = db[collection].aggregate(aggregate)
@@ -223,16 +229,25 @@ class MongoDB(BaseQueryRunner):
else:
cursor = r
for r in cursor:
for k in r:
if self._get_column_by_name(columns, k) is None:
columns.append({
"name": k,
"friendly_name": k,
"type": TYPES_MAP.get(type(r[k]), TYPE_STRING)
})
if "count" in query_data:
columns.append({
"name" : "count",
"friendly_name" : "count",
"type" : TYPE_INTEGER
})
rows.append(r)
rows.append({ "count" : cursor })
else:
for r in cursor:
for k in r:
if self._get_column_by_name(columns, k) is None:
columns.append({
"name": k,
"friendly_name": k,
"type": TYPES_MAP.get(type(r[k]), TYPE_STRING)
})
rows.append(r)
if f:
ordered_columns = []
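The count support above turns the cursor into a single integer and emits it as a one-row, one-column result. A simplified sketch of that branch (pymongo 2.x style; the helper name is illustrative):

from redash.query_runner import TYPE_INTEGER

def count_result(cursor):
    # cursor.count() collapses the result set into a single integer (pymongo 2.x API).
    total = cursor.count()
    columns = [{"name": "count", "friendly_name": "count", "type": TYPE_INTEGER}]
    rows = [{"count": total}]
    return columns, rows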

View File

@@ -21,8 +21,8 @@ def deduce_columns(rows):
class MQL(BaseQueryRunner):
def __init__(self, configuration_json):
super(MQL, self).__init__(configuration_json)
def __init__(self, configuration):
super(MQL, self).__init__(configuration)
self.syntax = 'sql'
@classmethod

View File

@@ -0,0 +1,150 @@
import json
import logging
import sys
from redash.query_runner import *
from redash.utils import JSONEncoder
logger = logging.getLogger(__name__)
try:
    import pymssql
    enabled = True
except ImportError:
    enabled = False

# from _mssql.pyx ## DB-API type definitions & http://www.freetds.org/tds.html#types ##
types_map = {
    1: TYPE_STRING,
    2: TYPE_BOOLEAN,
    3: TYPE_INTEGER,
    4: TYPE_DATETIME,
    5: TYPE_FLOAT,
}

class SqlServer(BaseSQLQueryRunner):
    @classmethod
    def configuration_schema(cls):
        return {
            "type": "object",
            "properties": {
                "user": {
                    "type": "string"
                },
                "password": {
                    "type": "string"
                },
                "server": {
                    "type": "string",
                    "default": "127.0.0.1"
                },
                "port": {
                    "type": "number",
                    "default": 1433
                },
                "db": {
                    "type": "string",
                    "title": "Database Name"
                }
            },
            "required": ["db"],
            "secret": ["password"]
        }

    @classmethod
    def enabled(cls):
        return enabled

    @classmethod
    def type(cls):
        return "mssql"

    def __init__(self, configuration):
        super(SqlServer, self).__init__(configuration)

    def _get_tables(self, schema):
        query = """
        SELECT table_schema, table_name, column_name
        FROM information_schema.columns
        WHERE table_schema NOT IN ('guest','INFORMATION_SCHEMA','sys','db_owner','db_accessadmin'
                                  ,'db_securityadmin','db_ddladmin','db_backupoperator','db_datareader'
                                  ,'db_datawriter','db_denydatareader','db_denydatawriter'
                                  );
        """

        results, error = self.run_query(query)

        if error is not None:
            raise Exception("Failed getting schema.")

        results = json.loads(results)

        for row in results['rows']:
            if row['table_schema'] != self.configuration['db']:
                table_name = '{}.{}'.format(row['table_schema'], row['table_name'])
            else:
                table_name = row['table_name']

            if table_name not in schema:
                schema[table_name] = {'name': table_name, 'columns': []}

            schema[table_name]['columns'].append(row['column_name'])

        return schema.values()

    def run_query(self, query):
        connection = None
        try:
            server = self.configuration.get('server', '')
            user = self.configuration.get('user', '')
            password = self.configuration.get('password', '')
            db = self.configuration['db']
            port = self.configuration.get('port', 1433)

            if port != 1433:
                server = server + ':' + str(port)

            connection = pymssql.connect(server, user, password, db)
            cursor = connection.cursor()
            logger.debug("SqlServer running query: %s", query)

            cursor.execute(query)
            data = cursor.fetchall()

            if cursor.description is not None:
                columns = self.fetch_columns([(i[0], types_map.get(i[1], None)) for i in cursor.description])
                rows = [dict(zip((c['name'] for c in columns), row)) for row in data]

                data = {'columns': columns, 'rows': rows}
                json_data = json.dumps(data, cls=JSONEncoder)
                error = None
            else:
                error = "No data was returned."
                json_data = None

            cursor.close()
        except pymssql.Error as e:
            logging.exception(e)
            try:
                # Query errors are at `args[1]`
                error = e.args[1]
            except IndexError:
                # Connection errors are `args[0][1]`
                error = e.args[0][1]
            json_data = None
        except KeyboardInterrupt:
            connection.cancel()
            error = "Query cancelled by user."
            json_data = None
        except Exception as e:
            raise sys.exc_info()[1], None, sys.exc_info()[2]
        finally:
            if connection:
                connection.close()

        return json_data, error

register(SqlServer)
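Under the schema above, a data source configuration for this runner might look like the following. All values are placeholders; actually running it requires pymssql and a reachable SQL Server instance.

config = {
    "server": "127.0.0.1",
    "port": 1433,
    "user": "redash",
    "password": "secret",
    "db": "sales",  # the only key the schema marks as required
}

runner = SqlServer(config)
json_data, error = runner.run_query("SELECT TOP 10 name FROM sys.tables")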

View File

@@ -81,9 +81,6 @@ class Mysql(BaseSQLQueryRunner):
return True
def __init__(self, configuration_json):
super(Mysql, self).__init__(configuration_json)
def _get_tables(self, schema):
query = """
SELECT col.table_schema,

View File

@@ -78,8 +78,8 @@ class Oracle(BaseSQLQueryRunner):
def type(cls):
return "oracle"
def __init__(self, configuration_json):
super(Oracle, self).__init__(configuration_json)
def __init__(self, configuration):
super(Oracle, self).__init__(configuration)
dsn = cx_Oracle.makedsn(
self.configuration["host"],
@@ -88,7 +88,7 @@ class Oracle(BaseSQLQueryRunner):
self.connection_string = "{}/{}@{}".format(self.configuration["user"], self.configuration["password"], dsn)
def _get_tables(self, schema_dict):
def _get_tables(self, schema):
query = """
SELECT
user_tables.TABLESPACE_NAME,

View File

@@ -77,8 +77,8 @@ class PostgreSQL(BaseSQLQueryRunner):
def type(cls):
return "pg"
def __init__(self, configuration_json):
super(PostgreSQL, self).__init__(configuration_json)
def __init__(self, configuration):
super(PostgreSQL, self).__init__(configuration)
values = []
for k, v in self.configuration.iteritems():

View File

@@ -6,6 +6,8 @@ from redash.query_runner import *
import logging
logger = logging.getLogger(__name__)
from collections import defaultdict
try:
from pyhive import presto
enabled = True
@@ -14,15 +16,15 @@ except ImportError:
enabled = False
PRESTO_TYPES_MAPPING = {
"integer" : TYPE_INTEGER,
"long" : TYPE_INTEGER,
"bigint" : TYPE_INTEGER,
"float" : TYPE_FLOAT,
"double" : TYPE_FLOAT,
"boolean" : TYPE_BOOLEAN,
"string" : TYPE_STRING,
"integer": TYPE_INTEGER,
"long": TYPE_INTEGER,
"bigint": TYPE_INTEGER,
"float": TYPE_FLOAT,
"double": TYPE_FLOAT,
"boolean": TYPE_BOOLEAN,
"string": TYPE_STRING,
"varchar": TYPE_STRING,
"date" : TYPE_DATE,
"date": TYPE_DATE,
}
@@ -63,8 +65,8 @@ class Presto(BaseQueryRunner):
def type(cls):
return "presto"
def __init__(self, configuration_json):
super(Presto, self).__init__(configuration_json)
def __init__(self, configuration):
super(Presto, self).__init__(configuration)
def run_query(self, query):
connection = presto.connect(
@@ -76,15 +78,12 @@ class Presto(BaseQueryRunner):
cursor = connection.cursor()
try:
cursor.execute(query)
columns_data = [(row[0], row[1]) for row in cursor.description]
columns = [{'name': col[0],
'friendly_name': col[0],
'type': PRESTO_TYPES_MAPPING.get(col[1], None)} for col in columns_data]
rows = [dict(zip(([c[0] for c in columns_data]), r)) for i, r in enumerate(cursor.fetchall())]
column_tuples = [(i[0], PRESTO_TYPES_MAPPING.get(i[1], None)) for i in cursor.description]
columns = self.fetch_columns(column_tuples)
rows = [dict(zip(([c['name'] for c in columns]), r)) for i, r in enumerate(cursor.fetchall())]
data = {'columns': columns, 'rows': rows}
json_data = json.dumps(data, cls=JSONEncoder)
error = None
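fetch_columns (a BaseQueryRunner helper) is what lets the Presto runner drop its hand-rolled column mapping: conceptually it turns the (name, type) pairs taken from cursor.description into the column dicts used everywhere else. A rough sketch of that idea, not the actual helper, which also deals with duplicate column names:

def fetch_columns_sketch(column_tuples):
    # Map (name, type) pairs into redash-style column descriptors.
    columns = []
    for name, col_type in column_tuples:
        columns.append({'name': name, 'friendly_name': name, 'type': col_type})
    return columns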

Some files were not shown because too many files have changed in this diff.