Compare commits

...

175 Commits

Author SHA1 Message Date
Arik Fraimovich
939aae086f Add changelog entry for favicons fix 2017-04-18 15:22:05 +03:00
Arik Fraimovich
742e38b08d Update CHANGELOG and bump version 2017-04-18 15:20:21 +03:00
Arik Fraimovich
3c7c93fc9f Fix: favicon wasn't showing up.
Closes #1719.
2017-04-18 15:19:57 +03:00
Arik Fraimovich
53ffff9759 Merge pull request #1716 from deecay/dashboard-tag-m17n
Fix: Non-ASCII dashboard tag
2017-04-18 15:02:27 +03:00
Arik Fraimovich
2e7fafc4d8 CHANGELOG update. 2017-04-18 14:59:44 +03:00
Arik Fraimovich
c66b09effe Merge pull request #1717 from getredash/fix_embeds
Fix: page freezes when rendering large result set.
2017-04-11 18:33:11 +03:00
Arik Fraimovich
a087fe4bcd Fix: page freezes when rendering large result set.
Closes #1711.
2017-04-11 18:05:43 +03:00
Arik Fraimovich
1f4946cc04 Merge pull request #1710 from getredash/fix_embeds
Fix: embeds were not rendering in PhantomJS.
2017-04-05 12:58:05 +03:00
Arik Fraimovich
08505a2208 Add changelog entry 2017-04-05 12:40:56 +03:00
Arik Fraimovich
e1c186bbf8 Fix: embeds were not rendering in PhantomJS.
Include polyfill for missing ArrayView functions.

Closes #1708.
2017-04-05 12:38:21 +03:00
Arik Fraimovich
c83d354eed Merge pull request #1707 from getredash/docker-compose
Update docker-compose configuration:
2017-04-03 18:30:55 +03:00
Arik Fraimovich
81063731c9 Update docker-compose configuration:
* Use newer versions of Redis & PostgreSQL
* Use image for production docker-compose.
2017-04-03 18:28:46 +03:00
Arik Fraimovich
f66fe5ff80 Update packer configuration to create GCE image 2017-04-03 18:07:19 +03:00
Arik Fraimovich
8425698583 Update env 2017-04-03 13:18:34 +03:00
Arik Fraimovich
8b08b1a563 Merge pull request #1704 from getredash/new_bootstrap
New bootstrap script for Ubuntu 16.04
2017-04-03 13:16:53 +03:00
Arik Fraimovich
15b228b754 Update README 2017-04-03 12:54:23 +03:00
Arik Fraimovich
1db4157b29 Fix bootstrap script to be headless 2017-04-03 12:54:17 +03:00
Arik Fraimovich
079530cf63 Remove unused files 2017-04-03 12:54:05 +03:00
Arik Fraimovich
d2370a94c7 New bootstrap script 2017-04-03 10:30:06 +03:00
Arik Fraimovich
903463972b Fix: handle the case when a scheduled query wasn't run before 2017-04-02 15:20:45 +03:00
Arik Fraimovich
2707e24f30 Update CHANGELOG & version 2017-04-02 14:43:02 +03:00
Arik Fraimovich
3df826692c Merge pull request #1703 from getredash/queries
Fix: optimize queries to avoid N+1 cases
2017-04-02 14:09:12 +03:00
Arik Fraimovich
1142a441fc Fix: optimize queries to avoid N+1 cases 2017-04-02 14:01:23 +03:00
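The N+1 pattern this commit removes can be sketched in plain Python (an illustration only; `attach_users` and `fetch_users_by_ids` are hypothetical names, not Redash's actual code): instead of one lookup per row, the related rows are fetched in a single batched query and joined in memory.

```python
# Hypothetical sketch of avoiding N+1 queries: fetch all related users
# in one batched call instead of one call per query row.
def attach_users(queries, fetch_users_by_ids):
    user_ids = {q["user_id"] for q in queries}
    users = fetch_users_by_ids(user_ids)  # a single database round trip
    by_id = {u["id"]: u for u in users}
    for q in queries:
        q["user"] = by_id[q["user_id"]]
    return queries
```

With an ORM like SQLAlchemy the same effect is usually achieved with eager-loading options rather than manual batching.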
Arik Fraimovich
53268989c5 Merge pull request #1701 from akiray03/refactor-next-to-next_path
Change: rename local variable `next` to `next_path`
2017-04-02 11:15:46 +03:00
deecay
83ed9fdc51 Fix: Dashboard tag for unicode dashboard names 2017-04-01 23:28:53 +09:00
Akira Yumiyama
0dc98e87a6 rename local variable next to next_path, because it shadows a built-in method. 2017-04-01 20:25:23 +09:00
Arik Fraimovich
0cf4db1137 Update atsd_client version. 2017-03-30 15:19:57 +03:00
Arik Fraimovich
4e27069d07 Merge pull request #1696 from spasovski/percentstack
Fix: wrong percent stacking math
2017-03-30 12:15:05 +03:00
Davor Spasovski
3fcd07bc1c fix percent stacking math (issue 41) 2017-03-29 15:07:37 -04:00
Arik Fraimovich
3414ff7331 Update CHANGELOG.md 2017-03-27 15:07:11 +03:00
Arik Fraimovich
04cd798c48 Update changelog 2017-03-27 15:07:00 +03:00
Arik Fraimovich
50dcf23b1a Merge pull request #1665 from benmagro/filter_params
Fix: Set query filter to match dashboard level filters
2017-03-27 13:13:30 +03:00
Arik Fraimovich
1bb4d6d534 Fix condition to only have effect when there is a value in query string. 2017-03-27 13:12:50 +03:00
Arik Fraimovich
66a5e394de Merge pull request #1688 from akiray03/refactoring-query-results-export
[Refactoring] make_{csv,excel}_content move to models from handlers
2017-03-27 12:42:58 +03:00
Arik Fraimovich
c4ab0916cc Merge pull request #1682 from denisov-vlad/clickhouse-types-fix
[Clickhouse] Fix: better support for types
2017-03-27 12:18:17 +03:00
Arik Fraimovich
73cb6925d3 Merge pull request #1689 from getredash/feature/bubble
Fix: user can't edit their own alert
2017-03-26 11:55:39 +03:00
Arik Fraimovich
aaf0da4b70 Fix: user can't edit their own alert 2017-03-26 11:45:13 +03:00
Arik Fraimovich
c99bd03d99 Merge pull request #1666 from deecay/pivot-getdata
Change: add support for filtered data in Pivot table visualization
2017-03-26 11:33:57 +03:00
Akira Yumiyama
7fbb1b9229 [Refactoring] make_{csv,excel}_content move to models from handlers
I want to be able to use like: `python manage.py queries export <query_id>`
2017-03-26 12:24:10 +09:00
Arik Fraimovich
ba54d68513 Merge pull request #1686 from msnider/salesforce-sandbox
[Salesforce] Change: Sandbox cannot be required or it will force it to be True.
2017-03-25 21:08:22 +03:00
Matt Snider
f73cbf3b51 Sandbox cannot be required or it will force it to be True. Also, don't annotate SOQL queries as they don't allow comments 2017-03-25 12:19:53 -05:00
deecay
3f047348e2 Pivottable shows filtered data 2017-03-23 21:08:57 +09:00
deecay
10fe3c5373 Revert "Pivottable shows filtered and formatted data"
This reverts commit f011d3060a.
2017-03-23 21:07:56 +09:00
Vladislav Denisov
9c8755c9ae clickhouse: added support for nullable columns 2017-03-22 10:28:44 +03:00
Vladislav Denisov
e8908d04bb issues/1616: fixed clickhouse types 2017-03-22 09:44:21 +03:00
Arik Fraimovich
293f9dcaf6 Merge pull request #1680 from getredash/feature/bubble
Add: bubble charts support
2017-03-21 10:58:36 +02:00
Arik Fraimovich
ce31b13ff6 Add: bubble charts support 2017-03-21 10:46:11 +02:00
Arik Fraimovich
a033dc4569 Fix: angular minification issue in textbox editor 2017-03-20 17:41:52 +02:00
Arik Fraimovich
6ff338964b Fix: angular minification issue in schema browser 2017-03-20 17:37:32 +02:00
Arik Fraimovich
97a7701879 Merge pull request #1617 from 44px/refresh-schema-button
Add "Refresh Schema" button to the datasource
2017-03-20 14:11:05 +02:00
Arik Fraimovich
7558b391a9 Merge pull request #1673 from 44px/editorconfig
Add: .editorconfig to keep code style consistent
2017-03-20 10:54:58 +02:00
Alexander Shepelin
b6bed112ee Add .editorconfig to keep code style consistent 2017-03-17 01:09:44 +03:00
Alexander Shepelin
9417dcb2c2 preserve old schema if we get error on refresh 2017-03-16 23:56:56 +03:00
Alexander Shepelin
5f106a1eee Merge remote-tracking branch 'origin/master' into refresh-schema-button 2017-03-16 23:20:21 +03:00
Arik Fraimovich
cda05c73c7 Merge pull request #1657 from deecay/ie-scrollbar
Show vertical scrollbar for IE
2017-03-16 11:59:42 +02:00
deecay
95398697cb Set ms-overflow-style to auto for html 2017-03-16 18:29:25 +09:00
Arik Fraimovich
dc019cc37a Merge pull request #1649 from getredash/fixes201602
Fix: datetime parameters were not using a date picker.
2017-03-15 18:08:32 +02:00
Arik Fraimovich
72cb5babe6 Change datetime serialization format 2017-03-15 18:00:25 +02:00
Arik Fraimovich
ebc2e12621 Merge pull request #1622 from axibase/master
[Data Sources] Add: ATSD query runner
2017-03-15 16:31:03 +02:00
deecay
f011d3060a Pivottable shows filtered and formatted data 2017-03-10 19:10:31 +09:00
Ben Magro
8c5f71a0a1 set filter params in query to match dashboard level filters when they are present 2017-03-10 11:28:39 +11:00
Arik Fraimovich
da00e74491 Merge pull request #1660 from akiray03/docker-entrypoint-support-manage-py
Add: docker-entrypoint for manage.py's commands.
2017-03-08 15:49:14 +02:00
Akira Yumiyama
b56ff1357e docker-entrypoint supports any of manage.py's commands. 2017-03-08 22:41:15 +09:00
Arik Fraimovich
ecd4d659a8 Merge pull request #1494 from yuua/impala-schema
[Impala] enable schema browser
2017-03-07 14:37:42 +02:00
Arik Fraimovich
fec5565396 Merge pull request #1650 from deecay/v1-ie11
Add babel-plugin-transform-object-assign for IE support
2017-03-07 14:32:41 +02:00
Arik Fraimovich
6ec5ea5c28 Resume to building Docker images 2017-03-07 12:26:00 +02:00
Arik Fraimovich
3f8e32cc1f Merge pull request #1656 from getredash/docker
Fix Docker file ownership issues:
2017-03-07 11:40:18 +02:00
Arik Fraimovich
be6426014d Fix Docker file ownership issues:
1. Simplify user creation to create a non-system user (so the uid will usually
   match the host user).
2. Set the user to redash & remove the need to change user in docker entrypoint.
2017-03-07 11:37:31 +02:00
Arik Fraimovich
8b4643d6ac Remove noisy log 2017-03-06 21:22:41 +02:00
Arik Fraimovich
d8a0885953 Fix: tests were using old method signature 2017-03-06 21:22:29 +02:00
Arik Fraimovich
83e6b6f50c Tests use the same session as the tested code, and we can't use the same
objects after the tested code calls commit() without disabling expire
on commit.

It seems like a safe thing in our case.
2017-03-06 13:49:29 +02:00
rmakulov
928bd83967 minor change 2017-03-06 13:16:02 +03:00
Arik Fraimovich
230fe15cde Merge pull request #1653 from r0fls/fix-embed-close
Fix: query embed dialog close button wasn't working
2017-03-05 20:08:14 +02:00
Arik Fraimovich
72ad16a8b3 Fix: use correct format string 2017-03-05 20:03:44 +02:00
Arik Fraimovich
23cc632d5a Duplicate favicons instead of symlinks 2017-03-05 09:15:11 +02:00
Raphael Deem
1cf2bb1bb2 fix query embed close button 2017-03-04 14:51:00 -08:00
yuua
181031957f impala get_table: remove filter and convert str to unicode 2017-03-03 17:58:13 +09:00
deecay
cfa9a45fc8 Add babel-plugin-transform-object-assign 2017-03-03 09:26:56 +09:00
Arik Fraimovich
9bb87e711a Fix: datetime parameters were not using a date picker. 2017-03-02 15:56:40 +02:00
Rustam Makulov
255a01f786 Merge branch 'master' into master 2017-03-01 12:29:09 +04:00
Arik Fraimovich
69c26f2c0d Merge pull request #1643 from msnider/salesforce
[Data Sources] Add: SalesForce query runner
2017-03-01 09:51:43 +02:00
Matt Snider
3650e21458 Move import to top of file 2017-02-28 22:06:34 -06:00
Matt Snider
8eefd0e9c4 Format to PEP8 2017-02-28 21:54:32 -06:00
Matt Snider
c72a097808 Added Salesforce SOQL query runner 2017-02-28 21:44:38 -06:00
rmakulov
2ffda6f5c5 code revised 2017-02-28 19:03:17 +03:00
Arik Fraimovich
ce8ffae152 Merge pull request #1584 from washort/scheduled-query-backoff
Scheduled query backoff
2017-02-28 13:19:34 +02:00
Arik Fraimovich
b54dd27959 Merge pull request #1624 from washort/presto-errors
Fix: make errors from Presto runner JSON-serializable
2017-02-28 13:04:46 +02:00
Arik Fraimovich
3e807e5b41 Merge pull request #1623 from washort/jobs-race
Bugfix: race condition in query task status reporting
2017-02-28 13:04:10 +02:00
Arik Fraimovich
20f1a60f90 Merge pull request #1619 from deecay/count-rows
Add: use results row count as the value for the counter visualization.
2017-02-28 13:03:36 +02:00
Arik Fraimovich
9d2619b856 Merge pull request #1641 from getredash/fixes201602
UI Fixes
2017-02-28 12:59:41 +02:00
Arik Fraimovich
a2c7f6df7a Friendlier labels for archived state of dashboard/query 2017-02-28 12:50:27 +02:00
Arik Fraimovich
15a87db5d5 Fix: remove $$hashKey from Pivot table 2017-02-28 12:46:58 +02:00
Arik Fraimovich
2f86466309 Merge pull request #1639 from getredash/fixes201602
Small UI fixes
2017-02-28 12:10:22 +02:00
Arik Fraimovich
bccfef533e Fix: wrong timestamps in admin tasks page 2017-02-27 17:51:00 +02:00
Arik Fraimovich
ef020e88e7 Fix: word cloud visualization didn't show column names 2017-02-27 17:42:44 +02:00
Arik Fraimovich
222a6069cb Fix: permission dialog wasn't rendering.
Closes #1633.
2017-02-27 15:44:19 +02:00
Arik Fraimovich
6b6df84bce Fix: map visualization had severe performance issue.
Close #1603.
2017-02-27 15:30:30 +02:00
rmakulov
fcfd204ec6 atsd query runner 2017-02-24 17:31:01 +03:00
Arik Fraimovich
57e6c5f05e Merge pull request #1630 from MichaelJAndy/MichaelJAndy-sorting-patch
Fix: dashboard-list.js to sort dashboards and update page-header
2017-02-23 15:10:34 +02:00
Michael Andy
683e369d86 dashboard-list.js sorts dashboards and updates page-header 2017-02-23 23:41:58 +11:00
Arik Fraimovich
f12596a6fc Merge pull request #1629 from getredash/fix_too_many_connections
Fix: keyboard shortcuts didn't work in parameter inputs
2017-02-23 14:38:22 +02:00
Arik Fraimovich
09239439ae Fix: keyboard shortcuts didn't work in parameter inputs 2017-02-23 12:51:38 +02:00
Arik Fraimovich
2bb11dffca add v1-rc.2 release date. 2017-02-22 22:11:26 +02:00
Arik Fraimovich
2f019a0897 Update CHANGELOG.md 2017-02-22 21:58:12 +02:00
Arik Fraimovich
1350555931 Update CHANGELOG. 2017-02-22 21:57:16 +02:00
Arik Fraimovich
6d3aa3b53c Merge pull request #1627 from getredash/fix_too_many_connections
Dashboard page fixes
2017-02-22 21:57:02 +02:00
Allen Short
2407b115e4 Exponential backoff for failing queries 2017-02-22 10:29:08 -06:00
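The backoff idea behind this commit can be sketched as follows (a minimal illustration; the helper name and the one-day cap are assumptions, not the PR's actual code): each consecutive failure doubles the wait before the query is retried, up to a cap.

```python
# Illustrative exponential backoff for a failing scheduled query
# (hypothetical helper, not the actual Redash implementation).
def next_retry_delay(base_interval, consecutive_failures, max_delay=86400):
    """Double the wait after each failure, capped at max_delay seconds."""
    return min(base_interval * (2 ** consecutive_failures), max_delay)
```

The cap keeps a chronically failing query from backing off indefinitely while still shielding the data source from a tight retry loop.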
Allen Short
ca3e125da8 Refactor outdated_queries/refresh_queries tests 2017-02-22 10:28:35 -06:00
Arik Fraimovich
2d82c4dc98 Fix: multi filters broken on dashboards 2017-02-22 16:24:11 +02:00
Arik Fraimovich
84ca02be09 Fix: download links broken on dashboards 2017-02-22 16:14:41 +02:00
Arik Fraimovich
61244dead3 Merge pull request #1626 from getredash/fix_too_many_connections
Fix: Redash is using too many db connections
2017-02-22 15:18:49 +02:00
Arik Fraimovich
907b33b5a0 Fix: tests failing because they run on different app contexts 2017-02-22 14:06:47 +02:00
Arik Fraimovich
e6fc73f444 Fix: Redash was using too many db connections
Fixes #1561.

The issue was caused by creating a new Flask app instance on every task execution,
resulting in a new SQLAlchemy connection pool and a new database connection.
2017-02-22 12:40:22 +02:00
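The fix described above amounts to memoizing the app at module level, so each worker process builds one app (and therefore one connection pool) instead of one per task (illustrative sketch; `get_app` and `create_app` are hypothetical names, not Redash's actual code):

```python
# Sketch of reusing a single app instance per process instead of
# constructing one (and a fresh SQLAlchemy pool) on every task run.
_app = None

def get_app(create_app):
    global _app
    if _app is None:
        _app = create_app()  # one connection pool per worker process
    return _app
```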
Arik Fraimovich
672347ba8b Merge pull request #1621 from kopanitsa/fix-dynamo-count
[DynamoDB] Fix: handle count(*) queries
2017-02-22 10:06:36 +02:00
Arik Fraimovich
db465ffe58 Rename res_dict to result and add comment. 2017-02-22 10:06:00 +02:00
Allen Short
2a447137d4 Fix: make errors from Presto runner JSON-serializable 2017-02-21 13:11:32 -06:00
Takahiro Okada
18a157ac84 handle count(*) in dynamo db 2017-02-21 22:26:50 +09:00
deecay
3864f11694 Add: Counter option to use the result set's row count as its value. 2017-02-21 16:56:45 +09:00
Alexander Shepelin
8b59815bf2 add 'refresh schema' button to schema-browser 2017-02-20 23:56:21 +03:00
Allen Short
a98df94399 Fix race condition in query task status reporting 2017-02-20 12:20:55 -06:00
Arik Fraimovich
2e751b3dad Fix: dashboards without tags break rendering 2017-02-19 09:20:25 +02:00
Alexander Shepelin
b2e747caef refactor schema-browser directive to component style 2017-02-18 23:10:15 +03:00
Arik Fraimovich
f9da6ddcdd Merge pull request #1611 from getredash/circleci
Improve CircleCI configuration
2017-02-18 14:15:20 +02:00
Arik Fraimovich
99cef97c89 Add #1612 to the CHANGELOG. 2017-02-18 14:14:30 +02:00
Arik Fraimovich
2071ca1bc8 Merge pull request #1612 from getredash/rc1-fixes
Change: Improvements to the dashboards list page
2017-02-18 14:14:04 +02:00
Arik Fraimovich
5815d635a0 Update CHANGELOG for v1.0.0-rc.2.
@44px @janusd @btmc @deecay @aslotnick thanks!
2017-02-18 14:13:49 +02:00
Arik Fraimovich
517e5bcddb Improvements to the dashboards list page:
- Fix search to be case insensitive.
- Fix tag extraction to support multiple words prefixes.
- Change selection behavior to use OR by default and AND on shift+click.
2017-02-18 12:56:56 +02:00
Arik Fraimovich
eaa2ec877c Merge pull request #1607 from getredash/rc1-fixes
v1.0.0-rc.1 fixes
2017-02-18 12:51:45 +02:00
Arik Fraimovich
0e68228e6e Remove Procfile.heroku - it's no longer up to date. 2017-02-18 12:50:02 +02:00
Arik Fraimovich
6e5435261b Improve CircleCI configuration 2017-02-18 12:48:39 +02:00
Arik Fraimovich
1cc9b87ead Fix PEP8 section name 2017-02-18 12:41:57 +02:00
Arik Fraimovich
bd9ad3140d Allow longer lines in pep8 2017-02-18 12:37:00 +02:00
Arik Fraimovich
3e23143910 Fix: alert subscriptions were not triggered 2017-02-18 12:30:02 +02:00
Arik Fraimovich
0f95d12e83 Fix: use the correct model for delete. 2017-02-18 12:25:54 +02:00
Arik Fraimovich
fd37fd8545 Fix: remove whitespace created by the filters component 2017-02-18 12:19:22 +02:00
Arik Fraimovich
742199f05c Change: change Plotly's default hover mode to compare 2017-02-16 22:33:34 +02:00
Arik Fraimovich
902fb44782 Change: remove send to Plotly button 2017-02-16 20:08:55 +02:00
Arik Fraimovich
10e78bb4e4 Fix: sunburst & sankey visualization broke because of Angular values 2017-02-16 19:06:12 +02:00
Arik Fraimovich
e4659d0485 Fix: when adding text widget global parameters break.
Closes #1587.
2017-02-16 18:54:58 +02:00
Arik Fraimovich
a25302773d Fix: google button missing on invite link 2017-02-16 18:51:23 +02:00
Arik Fraimovich
2188baca16 Fix: schema browser got dark background 2017-02-16 18:42:28 +02:00
Arik Fraimovich
8940533176 Fix: embeds don't work after minification 2017-02-16 18:28:56 +02:00
Arik Fraimovich
1367b63ae1 Fix: shared dashboards didn't work after minification 2017-02-16 18:19:23 +02:00
Arik Fraimovich
c2a9e2e960 Fix: public widget serializer was referencing wrong name 2017-02-16 18:16:19 +02:00
Arik Fraimovich
8a41328033 Fix: share dashboard dialog not working after minification 2017-02-16 18:08:46 +02:00
Arik Fraimovich
ce16124d7b Fix: API keys were not logged properly in events table. 2017-02-15 22:13:33 +02:00
Arik Fraimovich
c547752dc6 Merge pull request #1606 from getredash/rc1-fixes
Change: move the message about unavailable data source types to be
2017-02-15 21:58:45 +02:00
Arik Fraimovich
3981c1c8a7 Fix: wrong use of Events.record function 2017-02-15 21:52:24 +02:00
Arik Fraimovich
f732f30bf0 Change: move the message about unavailable data source types to be
debug level message instead of warning. It seems to confuse people
and basically do more harm than add value.
2017-02-15 21:47:14 +02:00
Arik Fraimovich
9c0f0cb044 Merge pull request #1601 from aslotnick/1529_presto_error_message
[Presto] Change: better formatted error messages
2017-02-15 21:40:44 +02:00
Arik Fraimovich
f01399c33d Merge pull request #1604 from getredash/rc1-fixes
Fix: dashboard owner can't edit the dashboard.
2017-02-15 20:17:11 +02:00
Arik Fraimovich
253f2e613c Fix: dashboard owner can't edit the dashboard. 2017-02-15 19:50:04 +02:00
Andrew Slotnick
f2879a1e3d better formatting for presto error messages 2017-02-14 17:14:52 -05:00
deecay
af978e966d Show vertical scrollbar for IE 2017-02-14 16:53:11 +09:00
Arik Fraimovich
0151360fdf Merge pull request #1597 from deecay/fix-1571
[SQL Server] Fix: schema won't load if the server is case sensitive
2017-02-14 09:37:05 +02:00
Arik Fraimovich
d1b0a9580d Merge pull request #1592 from deecay/master
Change: use https instead of git protocol for git repositories in npm-shrinkwrap.json
2017-02-13 16:34:26 +02:00
deecay
9d796b20df Fix for #1571 2017-02-13 14:10:48 +09:00
deecay
574a7573ce Update shrinkwrap to work behind some proxies 2017-02-13 12:05:04 +09:00
Arik Fraimovich
db9c925cbb Merge pull request #1564 from janusd/viz-map-column-fix
Fix: options selection in the map visualization editor (lat, long, groupBy)
2017-02-10 20:10:00 +02:00
Arik Fraimovich
9ef6836175 Remove unused setup files 2017-02-10 20:06:30 +02:00
Arik Fraimovich
351cd62189 Docker entrypoint: update user's uid 2017-02-09 23:48:33 +02:00
Arik Fraimovich
7bbc782e5d Fix npm build command. 2017-02-09 22:22:41 +02:00
Arik Fraimovich
50d20ff277 Merge pull request #1582 from getredash/json_webhook
Updates to Dockerfile:
2017-02-09 17:07:46 +02:00
Arik Fraimovich
964926aaab Change external-links to links. 2017-02-09 17:07:34 +02:00
Arik Fraimovich
b67ecde107 Merge pull request #1570 from btmc/master
Update docker-compose.production.yml
2017-02-09 17:06:56 +02:00
Arik Fraimovich
6338596710 Updates to Dockerfile:
- Build frontend assets (needed for standalone use of the image).
- Change ownership of the files to redash user.
2017-02-09 16:53:25 +02:00
btmc
5361a99b22 Update docker-compose.production.yml
Nginx doesn't have "redash" upstream. This fixes the error.
2017-02-06 23:53:46 +03:00
Arik Fraimovich
01a8075a67 Merge pull request #1563 from getredash/json_webhook
Change: send events to webhook as JSON with a schema.
2017-02-05 10:36:55 +02:00
Arik Fraimovich
209e714084 Add changelog entry 2017-02-05 10:36:28 +02:00
Janus
e3d0b4075e fix column options selection in the map visualisation editor (lat, long and groupBy) 2017-02-02 15:39:52 +01:00
Arik Fraimovich
ad18128794 Fix test 2017-02-02 10:39:21 +02:00
Arik Fraimovich
71fa013970 Change: send events to webhook as JSON with a schema.
Closes #1552.
2017-02-02 10:21:01 +02:00
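A minimal sketch of what "events as JSON with a schema" can look like (the field names here are illustrative, not the schema #1552 actually settled on):

```python
import json

def serialize_event(action, object_type, object_id, user_id):
    """Build a structured event payload ready to POST to a webhook."""
    return json.dumps({
        "action": action,
        "object_type": object_type,
        "object_id": object_id,
        "user_id": user_id,
    })
```

A fixed schema like this lets webhook consumers parse events mechanically instead of scraping a free-form message string.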
Arik Fraimovich
dd6028384d Merge pull request #1555 from 44px/original-source-sourcemaps
Change: generate sourcemaps for modules (original source files)
2017-02-01 23:17:37 +02:00
Alexander Shepelin
9b9a752f78 Generate sourcemaps for modules (original source) 2017-01-31 09:30:27 +03:00
yuua
78408e50c5 impala query_runner get_tables error fix 2016-12-29 10:54:57 +09:00
112 changed files with 1510 additions and 1992 deletions

.editorconfig Normal file

@@ -0,0 +1,14 @@
root = true
[*]
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
[*.py]
indent_style = space
indent_size = 4
[*.{js,css,html}]
indent_style = space
indent_size = 2


@@ -1,5 +1,96 @@
# Change Log
## v1.0.2 - 2017-04-18
### Fixed
- Fix: favicon wasn't showing up.
- Fix: support for unicode in dashboard tags. @deecay
- Fix: page freezes when rendering large result set.
- Fix: chart embeds were not rendering in PhantomJS.
## v1.0.1 - 2017-04-02
### Added
- Add: bubble charts support.
- Add "Refresh Schema" button to the datasource @44px
- [Data Sources] Add: ATSD query runner @rmakulov
- [Data Sources] Add: SalesForce query runner @msnider
- Add: scheduled query backoff in case of errors @washort
- Add: use results row count as the value for the counter visualization. @deecay
### Changed
- Moved CSV/Excel query results generation code to models. @akiray03
- Add support for filtered data in Pivot table visualization @deecay
- Friendlier labels for archived state of dashboard/query
### Fixed
- Fix: optimize queries to avoid N+1 queries.
- Fix: percent stacking math was wrong. @spasovski
- Fix: set query filter to match value from URL query string. @benmagro
- [Clickhouse] Fix: detection of various data types. @denisov-vlad
- Fix: user can't edit their own alert.
- Fix: angular minification issue in textbox editor and schema browser.
- Fixes to better support IE11 (add polyfill for Object.assign and show vertical scrollbar). @deecay
- Fix: datetime parameters were not using a date picker.
- Fix: Impala schema wasn't loading.
- Fix: query embed dialog close button wasn't working @r0fls
- Fix: make errors from Presto runner JSON-serializable @washort
- Fix: race condition in query task status reporting @washort
- Fix: remove $$hashKey from Pivot table
- Fix: map visualization had severe performance issue.
- Fix: permission dialog wasn't rendering.
- Fix: word cloud visualization didn't show column names.
- Fix: wrong timestamps in admin tasks page.
- Fix: page header wasn't updating on dashboards page @MichaelJAndy
- Fix: keyboard shortcuts didn't work in parameter inputs
## v1.0.0-rc.2 - 2017-02-22
### Changed
- [#1563](https://github.com/getredash/redash/pull/1563) Send events to webhook as JSON with a schema.
- [#1601] [Presto] friendlier error messages. (@aslotnick)
- Move the query runner unavailable log message to be DEBUG level instead of WARNING, as it was mainly confusing people.
- Remove "Send to Cloud" button from Plotly based visualizations.
- Change Plotly's default hover mode to "Compare".
- [#1612] Change: Improvements to the dashboards list page.
### Fixed
- [#1564] Fix: map visualization column picker wasn't populated. (@janusd)
- [#1597] [SQL Server] Fix: schema wasn't loading on case sensitive servers. (@deecay)
- Fix: dashboard owner couldn't edit their dashboard.
- Fix: toggle_publish event wasn't logged properly.
- Fix: events with API keys were not logged.
- Fix: share dashboard dialog was broken after code minification.
- Fix: public dashboard endpoint was broken.
- Fix: public dashboard page was broken after code minification.
- Fix: visualization embed page was broken after code minification.
- Fix: schema browser has dark background.
- Fix: Google button missing on invite page.
- Fix: global parameters don't render on dashboards with text boxes.
- Fix: sunburst / Sankey visualizations have bad data.
- Fix: extra whitespace created by the filters component.
- Fix: query results cleanup task was trying to delete query objects.
- Fix: alert subscriptions were not triggered.
- [DynamoDB] Fix: count(*) queries were broken. (@kopanitsa)
- Fix: Redash is using too many database connections.
- Fix: download links were not working in dashboards.
- Fix: the first selection in multi filters was broken in dashboards.
### Other
- [#1555] Change sourcemaps to generate a sourcemap per module. (@44px)
- [#1570] Fix Docker Compose configuration for nginx. (@btmc)
- [#1582] Update Dockerfile to build frontend assets and update the folder ownership.
- Dockerfile: change the uid of the redash user to match host user uid.
- Update npm-shrinkwrap.json file to use https protocol instead of git. (@deecay)
## v1.0.0-rc.1 - 2017-01-31
This version has two big changes behind the scenes:


@@ -6,5 +6,8 @@ COPY requirements.txt requirements_dev.txt requirements_all_ds.txt ./
 RUN pip install -r requirements.txt -r requirements_dev.txt -r requirements_all_ds.txt
 COPY . ./
+RUN npm install && npm run build && rm -rf node_modules
+RUN chown -R redash /app
+USER redash
 ENTRYPOINT ["/app/bin/docker-entrypoint"]


@@ -1,2 +0,0 @@
-web: ./manage.py runserver -d -r -p $PORT --host 0.0.0.0
-worker: celery worker --app=redash.worker -c${REDASH_HEROKU_CELERY_WORKER_COUNT:-2} --beat -Q queries,celery,scheduled_queries


@@ -6,7 +6,7 @@ worker() {
   QUEUES=${QUEUES:-queries,scheduled_queries,celery}
   echo "Starting $WORKERS_COUNT workers for queues: $QUEUES..."
-  exec sudo -E -u redash /usr/local/bin/celery worker --app=redash.worker -c$WORKERS_COUNT -Q$QUEUES -linfo --maxtasksperchild=10 -Ofair
+  exec /usr/local/bin/celery worker --app=redash.worker -c$WORKERS_COUNT -Q$QUEUES -linfo --maxtasksperchild=10 -Ofair
 }
 scheduler() {
@@ -15,11 +15,11 @@ scheduler() {
   echo "Starting scheduler and $WORKERS_COUNT workers for queues: $QUEUES..."
-  exec sudo -E -u redash /usr/local/bin/celery worker --app=redash.worker --beat -c$WORKERS_COUNT -Q$QUEUES -linfo --maxtasksperchild=10 -Ofair
+  exec /usr/local/bin/celery worker --app=redash.worker --beat -c$WORKERS_COUNT -Q$QUEUES -linfo --maxtasksperchild=10 -Ofair
 }
 server() {
-  exec sudo -E -u redash /usr/local/bin/gunicorn -b 0.0.0.0:5000 --name redash -w4 redash.wsgi:app
+  exec /usr/local/bin/gunicorn -b 0.0.0.0:5000 --name redash -w4 redash.wsgi:app
 }
 help() {
@@ -35,11 +35,12 @@ help() {
   echo "shell -- open shell"
   echo "dev_server -- start Flask development server with debugger and auto reload"
   echo "create_db -- create database tables"
+  echo "manage -- CLI to manage redash"
 }
 tests() {
   export REDASH_DATABASE_URL="postgresql://postgres@postgres/tests"
-  exec sudo -E -u redash make test
+  exec make test
 }
 case "$1" in
@@ -56,13 +57,17 @@ case "$1" in
     scheduler
     ;;
   dev_server)
-    exec sudo -E -u redash /app/manage.py runserver --debugger --reload -h 0.0.0.0
+    exec /app/manage.py runserver --debugger --reload -h 0.0.0.0
     ;;
   shell)
-    exec sudo -E -u redash /app/manage.py shell
+    exec /app/manage.py shell
     ;;
   create_db)
-    exec sudo -E -u redash /app/manage.py database create_tables
+    exec /app/manage.py database create_tables
+    ;;
+  manage)
+    shift
+    exec /app/manage.py $*
     ;;
   tests)
     tests


@@ -5,11 +5,8 @@ machine:
 node:
   version:
     6.9.1
-python:
-  version:
-    2.7.3
 dependencies:
-  pre:
+  override:
   - pip install --upgrade setuptools
   - pip install -r requirements_dev.txt
   - pip install -r requirements.txt
@@ -28,9 +25,9 @@ deployment:
 # - make upload
 #- echo "client/app" >> .dockerignore
 #- docker pull redash/redash:latest
-#- docker build -t redash/redash:$(./manage.py version | sed -e "s/\+/./") .
-#- docker login -e $DOCKER_EMAIL -u $DOCKER_USER -p $DOCKER_PASS
-#- docker push redash/redash:$(./manage.py version | sed -e "s/\+/./")
+- docker login -e $DOCKER_EMAIL -u $DOCKER_USER -p $DOCKER_PASS
+- docker build -t redash/redash:$(./manage.py version | sed -e "s/\+/./") .
+- docker push redash/redash:$(./manage.py version | sed -e "s/\+/./")
 notify:
   webhooks:
     - url: https://webhooks.gitter.im/e/895d09c3165a0913ac2f


@@ -1,3 +1,4 @@
 {
-  "presets": ["es2015", "stage-2"]
+  "presets": ["es2015", "stage-2"],
+  "plugins": ["transform-object-assign"]
 }


@@ -4,12 +4,17 @@ body {
 body.headless {
   padding-top: 0px;
+  padding-bottom: 0px;
 }
 body.headless nav.app-header {
   display: none;
 }
+body.headless div#footer {
+  display: none;
+}
 a[ng-click] {
   cursor: pointer;
 }
@@ -415,6 +420,16 @@ counter-renderer counter-name {
   background-color: white;
 }
+.schema-control {
+  display: flex;
+  padding: 5px 0;
+}
+.schema-control .form-control {
+  height: 30px;
+  margin-right: 5px;
+}
 .schema-browser {
   height: calc(100% - 45px);
   overflow-y: auto;
@@ -597,9 +612,16 @@ div.table-name:hover {
 .collapsing,
 .collapse.in {
-  background: #222;
   padding: 5px 10px;
   transition: all 0.35s ease;
+}
+.schema-browser .collapse.in {
+  background: #f4f4f4;
+}
+.navbar .collapse.in {
+  background: #222;
 }
/* Fixes for SuperFlat */ /* Fixes for SuperFlat */
@@ -667,3 +689,12 @@ div.table-name:hover {
 .m-2{
   margin:2px;
 }
+.dropdown-menu > .disabled{
+  cursor: not-allowed;
+}
+/* The real magic ;) */
+.dropdown-menu > .disabled > a{
+  pointer-events: none;
+}


@@ -6430,7 +6430,7 @@ a {
 }
 html {
   overflow-x: hidden\0/;
-  -ms-overflow-style: none;
+  -ms-overflow-style: auto;
 }
 html,
 body {


@@ -9,7 +9,7 @@
   </thead>
   <tbody>
-    <tr ng-repeat="row in $ctrl.rows">
+    <tr ng-repeat="row in $ctrl.rowsToDisplay">
       <td ng-repeat="column in $ctrl.columns" ng-bind-html="$ctrl.sanitize(column.formatFunction(row[column.name]))">
       </td>
     </tr>


@@ -15,7 +15,7 @@ function DynamicTable($sanitize) {
const first = this.count * (this.page - 1); const first = this.count * (this.page - 1);
const last = this.count * (this.page); const last = this.count * (this.page);
this.rows = this.allRows.slice(first, last); this.rowsToDisplay = this.rows.slice(first, last);
}; };
this.$onChanges = (changes) => { this.$onChanges = (changes) => {
@@ -24,10 +24,10 @@ function DynamicTable($sanitize) {
} }
if (changes.rows) { if (changes.rows) {
this.allRows = changes.rows.currentValue; this.rows = changes.rows.currentValue;
} }
this.rowsCount = this.allRows.length; this.rowsCount = this.rows.length;
this.pageChanged(); this.pageChanged();
}; };
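The renaming above (`allRows`/`rows` to `rows`/`rowsToDisplay`) leaves the paging math unchanged: one page is a `slice` of the full result set. A minimal sketch of that logic as a standalone helper (the helper name is hypothetical, not part of the component):

```javascript
// Pages are 1-based, `count` rows each, as in the DynamicTable controller.
function pageSlice(rows, page, count) {
  const first = count * (page - 1); // index of the first row on this page
  const last = count * page;        // one past the last row (slice end is exclusive)
  return rows.slice(first, last);
}

const rows = ['a', 'b', 'c', 'd', 'e'];
const pageOne = pageSlice(rows, 1, 2);   // ['a', 'b']
const pageThree = pageSlice(rows, 3, 2); // ['e'] — slice clamps past the end
```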

View File

@@ -1,14 +1,14 @@
<div class="container bg-white p-5" ng-show="$ctrl.filters"> <div class="container bg-white p-5" ng-show="$ctrl.filters | notEmpty">
<div class="row"> <div class="row">
<div class="col-sm-6 m-t-5" ng-repeat="filter in $ctrl.filters"> <div class="col-sm-6 m-t-5" ng-repeat="filter in $ctrl.filters">
<ui-select ng-model="filter.current" ng-if="!filter.multiple" on-select="$ctrl.filterChangeListener(filter, $model)"> <ui-select ng-model="filter.current" ng-if="!filter.multiple" on-select="$ctrl.filterChangeListener(filter, $model)" on-remove="$ctrl.filterChangeListener(filter, $model)">
<ui-select-match placeholder="Select value for {{filter.friendlyName}}...">{{filter.friendlyName}}: {{$select.selected | filterValue:filter}}</ui-select-match> <ui-select-match placeholder="Select value for {{filter.friendlyName}}...">{{filter.friendlyName}}: {{$select.selected | filterValue:filter}}</ui-select-match>
<ui-select-choices repeat="value in filter.values | filter: $select.search"> <ui-select-choices repeat="value in filter.values | filter: $select.search">
{{value | filterValue:filter }} {{value | filterValue:filter }}
</ui-select-choices> </ui-select-choices>
</ui-select> </ui-select>
<ui-select ng-model="filter.current" multiple ng-if="filter.multiple" on-select="$ctrl.filterChangeListener(filter, $model)"> <ui-select ng-model="filter.current" multiple ng-if="filter.multiple" on-select="$ctrl.filterChangeListener(filter, $model)" on-remove="$ctrl.filterChangeListener(filter, $model)">
<ui-select-match placeholder="Select value for {{filter.friendlyName}}...">{{filter.friendlyName}}: {{$item | filterValue:filter}}</ui-select-match> <ui-select-match placeholder="Select value for {{filter.friendlyName}}...">{{filter.friendlyName}}: {{$item | filterValue:filter}}</ui-select-match>
<ui-select-choices repeat="value in filter.values | filter: $select.search"> <ui-select-choices repeat="value in filter.values | filter: $select.search">
{{value | filterValue:filter }} {{value | filterValue:filter }}

View File

@@ -3,10 +3,10 @@
<label>{{param.title}}</label> <label>{{param.title}}</label>
<button class="btn btn-default btn-xs" ng-click="showParameterSettings(param)" ng-if="editable"><i class="zmdi zmdi-settings"></i></button> <button class="btn btn-default btn-xs" ng-click="showParameterSettings(param)" ng-if="editable"><i class="zmdi zmdi-settings"></i></button>
<span ng-switch="param.type"> <span ng-switch="param.type">
<input ng-switch-when="datetime-with-seconds" type="datetime-local" step="1" class="form-control" ng-model="param.value"> <input ng-switch-when="datetime-with-seconds" type="datetime-local" step="1" class="form-control" ng-model="param.ngModel">
<input ng-switch-when="datetime" type="text" class="form-control" ng-model="param.value"> <input ng-switch-when="datetime-local" type="datetime-local" class="form-control" ng-model="param.ngModel">
<input ng-switch-when="date" type="text" class="form-control" ng-model="param.value"> <input ng-switch-when="date" type="date" class="form-control" ng-model="param.ngModel">
<input ng-switch-default type="{{param.type}}" class="form-control" ng-model="param.value"> <input ng-switch-default type="{{param.type}}" class="form-control" ng-model="param.ngModel">
</span> </span>
</div> </div>
</div> </div>

View File

@@ -9,6 +9,8 @@ const PermissionsEditorComponent = {
dismiss: '&', dismiss: '&',
}, },
controller($http, User) { controller($http, User) {
'ngInject';
this.grantees = []; this.grantees = [];
this.newGrantees = {}; this.newGrantees = {};
this.aclUrl = this.resolve.aclUrl.url; this.aclUrl = this.resolve.aclUrl.url;

View File

@@ -1,6 +1,8 @@
import moment from 'moment'; import moment from 'moment';
export default function (ngModule) { export default function (ngModule) {
ngModule.filter('toMilliseconds', () => value => value * 1000.0);
ngModule.filter('dateTime', clientConfig => ngModule.filter('dateTime', clientConfig =>
function dateTime(value) { function dateTime(value) {
if (!value) { if (!value) {

View File

@@ -6,9 +6,9 @@
<base href="/"> <base href="/">
<title>Redash</title> <title>Redash</title>
<link rel="icon" type="image/png" sizes="32x32" href="./assets/images/favicon-32x32.png"> <link rel="icon" type="image/png" sizes="32x32" href="/images/favicon-32x32.png">
<link rel="icon" type="image/png" sizes="96x96" href="./assets/images/favicon-96x96.png"> <link rel="icon" type="image/png" sizes="96x96" href="/images/favicon-96x96.png">
<link rel="icon" type="image/png" sizes="16x16" href="./assets/images/favicon-16x16.png"> <link rel="icon" type="image/png" sizes="16x16" href="/images/favicon-16x16.png">
</head> </head>
<body> <body>

View File

@@ -1,3 +1,6 @@
// This polyfill is needed to support PhantomJS which we use to generate PNGs from embeds.
import 'core-js/fn/typed/array-buffer';
import 'material-design-iconic-font/dist/css/material-design-iconic-font.css'; import 'material-design-iconic-font/dist/css/material-design-iconic-font.css';
import 'font-awesome/css/font-awesome.css'; import 'font-awesome/css/font-awesome.css';
import 'ui-select/dist/select.css'; import 'ui-select/dist/select.css';

View File

@@ -38,9 +38,9 @@
<td>{{row.query_id}}</td> <td>{{row.query_id}}</td>
<td>{{row.query_hash}}</td> <td>{{row.query_hash}}</td>
<td>{{row.run_time | durationHumanize}}</td> <td>{{row.run_time | durationHumanize}}</td>
<td>{{row.created_at | dateTime }}</td> <td>{{row.created_at | toMilliseconds | dateTime }}</td>
<td>{{row.started_at | dateTime }}</td> <td>{{row.started_at | toMilliseconds | dateTime }}</td>
<td>{{row.updated_at | dateTime }}</td> <td>{{row.updated_at | toMilliseconds | dateTime }}</td>
<td ng-if="selectedTab === 'in_progress'"> <td ng-if="selectedTab === 'in_progress'">
<cancel-query-button query-id="dataRow.query_id" task-id="dataRow.task_id"></cancel-query-button> <cancel-query-button query-id="dataRow.query_id" task-id="dataRow.task_id"></cancel-query-button>
</td> </td>
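The admin table above now pipes timestamps through the new `toMilliseconds` filter before `dateTime`, because the backend reports Unix timestamps in seconds while JavaScript's `Date` constructor expects milliseconds. The filter itself is just a multiplication:

```javascript
// Same conversion as the `toMilliseconds` Angular filter added above.
const toMilliseconds = value => value * 1000.0;

const createdAt = 1492512000; // hypothetical Unix timestamp, in seconds
const asDate = new Date(toMilliseconds(createdAt)); // a valid Date in 2017
```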

View File

@@ -27,8 +27,8 @@ function AlertCtrl($routeParams, $location, $sce, toastr, currentUser, Query, Ev
} else { } else {
this.alert = Alert.get({ id: this.alertId }, (alert) => { this.alert = Alert.get({ id: this.alertId }, (alert) => {
this.onQuerySelected(new Query(alert.query)); this.onQuerySelected(new Query(alert.query));
this.canEdit = currentUser.canEdit(this.alert);
}); });
this.canEdit = currentUser.canEdit(this.alert);
} }
this.ops = ['greater than', 'less than', 'equals']; this.ops = ['greater than', 'less than', 'equals'];

View File

@@ -0,0 +1,8 @@
/* Prevent text selection on shift+click. */
div.tags-list {
-webkit-user-select: none; /* webkit (safari, chrome) browsers */
-moz-user-select: none; /* mozilla browsers */
-khtml-user-select: none; /* KHTML (konqueror) browsers */
-ms-user-select: none; /* IE10+ */
}

View File

@@ -3,10 +3,9 @@
<div class="col-lg-3"> <div class="col-lg-3">
<input type='text' class='form-control' placeholder="Search Dashboards..." <input type='text' class='form-control' placeholder="Search Dashboards..."
ng-change="$ctrl.update()" ng-model="$ctrl.searchText"/> ng-change="$ctrl.update()" ng-model="$ctrl.searchText"/>
<div class='list-group m-t-20'> <div class='list-group m-t-20 tags-list'>
<h3 class='list-group-item m-0'>Tags</h3> <a ng-repeat='tag in $ctrl.allTags' ng-class='{"active": $ctrl.tagIsSelected(tag)}'
<a ng-repeat='tag in $ctrl.allTags' ng-class='{"active": $ctrl.tagIsSelected(tag)}' class='list-group-item' ng-click='$ctrl.toggleTag($event, tag)'>
class='list-group-item' ng-click='$ctrl.toggleTag(tag)'>
{{ tag }} {{ tag }}
</a> </a>
</div> </div>
@@ -25,7 +24,7 @@
<td> <td>
<a href="dashboard/{{ dashboard.slug }}"> <a href="dashboard/{{ dashboard.slug }}">
<span class="label label-primary m-2" ng-bind="tag" ng-repeat="tag in dashboard.tags"></span> {{ dashboard.untagged_name }} <span class="label label-primary m-2" ng-bind="tag" ng-repeat="tag in dashboard.tags"></span> {{ dashboard.untagged_name }}
<span class="label label-warning" ng-if="dashboard.is_draft">Unpublished</span> <span class="label label-default" ng-if="dashboard.is_draft">Unpublished</span>
</a> </a>
</td> </td>
<td>{{ dashboard.created_at | dateTime }}</td> <td>{{ dashboard.created_at | dateTime }}</td>

View File

@@ -2,9 +2,11 @@ import _ from 'underscore';
import { Paginator } from '../../utils'; import { Paginator } from '../../utils';
import template from './dashboard-list.html'; import template from './dashboard-list.html';
import './dashboard-list.css';
function DashboardListCtrl(Dashboard, $location, clientConfig) { function DashboardListCtrl(Dashboard, $location, clientConfig) {
const self = this; const TAGS_REGEX = /(^([\w\s]|[^\u0000-\u007F])+):|(#([\w-]|[^\u0000-\u007F])+)/ig;
this.logoUrl = clientConfig.logoUrl; this.logoUrl = clientConfig.logoUrl;
const page = parseInt($location.search().page || 1, 10); const page = parseInt($location.search().page || 1, 10);
@@ -17,40 +19,48 @@ function DashboardListCtrl(Dashboard, $location, clientConfig) {
this.tagIsSelected = tag => this.selectedTags.indexOf(tag) > -1; this.tagIsSelected = tag => this.selectedTags.indexOf(tag) > -1;
this.toggleTag = (tag) => { this.toggleTag = ($event, tag) => {
if (this.tagIsSelected(tag)) { if (this.tagIsSelected(tag)) {
this.selectedTags = this.selectedTags.filter(e => e !== tag); if ($event.shiftKey) {
} else { this.selectedTags = this.selectedTags.filter(e => e !== tag);
} else {
this.selectedTags = [];
}
} else if ($event.shiftKey) {
this.selectedTags.push(tag); this.selectedTags.push(tag);
} else {
this.selectedTags = [tag];
} }
this.update(); this.update();
}; };
this.allTags = []; this.allTags = [];
this.dashboards.$promise.then((data) => { this.dashboards.$promise.then((data) => {
const out = data.map(dashboard => dashboard.name.match(/(^\w+):|(#\w+)/ig)); const out = data.map(dashboard => dashboard.name.match(TAGS_REGEX));
this.allTags = _.unique(_.flatten(out)).filter(e => e); this.allTags = _.unique(_.flatten(out)).filter(e => e).map(tag => tag.replace(/:$/, ''));
this.allTags.sort();
}); });
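The tag handling introduced above condenses to three pure functions: `TAGS_REGEX` matches a leading `Prefix:` tag or `#hashtag` tags (now including non-ASCII characters), and shift+click toggles tags in a multi-selection while a plain click selects a single tag. A sketch with hypothetical helper names:

```javascript
// Same regex as the dashboard-list controller above.
const TAGS_REGEX = /(^([\w\s]|[^\u0000-\u007F])+):|(#([\w-]|[^\u0000-\u007F])+)/ig;

// Extract tags from a dashboard name, stripping the trailing colon.
function extractTags(name) {
  return (name.match(TAGS_REGEX) || []).map(tag => tag.replace(/:$/, ''));
}

// The name with its tags removed, as shown in the list.
function untaggedName(name) {
  return name.replace(TAGS_REGEX, '').trim();
}

// Plain click selects a single tag; shift+click adds/removes from the set.
function toggleTag(selected, tag, shiftKey) {
  if (selected.indexOf(tag) > -1) {
    return shiftKey ? selected.filter(t => t !== tag) : [];
  }
  return shiftKey ? selected.concat([tag]) : [tag];
}
```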
this.paginator = new Paginator([], { page }); this.paginator = new Paginator([], { page });
this.update = () => { this.update = () => {
self.dashboards.$promise.then((data) => { this.dashboards.$promise.then((data) => {
const filteredDashboards = data.map((dashboard) => { const filteredDashboards = data.map((dashboard) => {
dashboard.tags = dashboard.name.match(/(^\w+):|(#\w+)/ig); dashboard.tags = (dashboard.name.match(TAGS_REGEX) || []).map(tag => tag.replace(/:$/, ''));
dashboard.untagged_name = dashboard.name.replace(/(\w+):|(#\w+)/ig, '').trim(); dashboard.untagged_name = dashboard.name.replace(TAGS_REGEX, '').trim();
return dashboard; return dashboard;
}).filter((value) => { }).filter((value) => {
if (self.selectedTags.length) { if (this.selectedTags.length) {
const valueTags = new Set(value.tags); const valueTags = new Set(value.tags);
const tagMatch = self.selectedTags; const tagMatch = this.selectedTags;
const filteredMatch = tagMatch.filter(x => valueTags.has(x)); const filteredMatch = tagMatch.filter(x => valueTags.has(x));
if (tagMatch.length !== filteredMatch.length) { if (tagMatch.length !== filteredMatch.length) {
return false; return false;
} }
} }
if (self.searchText && self.searchText.length) { if (this.searchText && this.searchText.length) {
if (!value.untagged_name.toLowerCase().includes(self.searchText)) { if (!value.untagged_name.toLowerCase().includes(this.searchText.toLowerCase())) {
return false; return false;
} }
} }
@@ -73,6 +83,7 @@ export default function (ngModule) {
const route = { const route = {
template: '<page-dashboard-list></page-dashboard-list>', template: '<page-dashboard-list></page-dashboard-list>',
reloadOnSearch: false, reloadOnSearch: false,
title: 'Dashboards',
}; };
return { return {

View File

@@ -1,7 +1,10 @@
<div class="container"> <div class="container">
<div class="row bg-white p-t-10 p-b-10 m-b-10"> <div class="row bg-white p-t-10 p-b-10 m-b-10">
<div class="col-sm-9"> <div class="col-sm-9">
<h3>{{$ctrl.dashboard.name}} <span class="label label-warning" ng-if="$ctrl.dashboard.is_draft">Unpublished</span></h3> <h3>{{$ctrl.dashboard.name}}
<span class="label label-default" ng-if="$ctrl.dashboard.is_draft && !$ctrl.dashboard.is_archived">Unpublished</span>
<span class="label label-warning" ng-if="$ctrl.dashboard.is_archived" uib-popover="This dashboard is archived and won't appear in the dashboards list or search results." popover-placement="right" popover-trigger="'mouseenter'">Archived</span>
</h3>
</div> </div>
<div class="col-sm-3 text-right"> <div class="col-sm-3 text-right">
<h3> <h3>
@@ -33,7 +36,7 @@
<span class="zmdi zmdi-share"></span> <span class="zmdi zmdi-share"></span>
</button> </button>
</span> </span>
<div class="btn-group hidden-print" role="group" ng-show="$ctrl.dashboard.canEdit()" uib-dropdown> <div class="btn-group hidden-print" role="group" ng-show="$ctrl.dashboard.canEdit()" uib-dropdown ng-if="!$ctrl.dashboard.is_archived">
<button class="btn btn-default btn-sm dropdown-toggle" uib-dropdown-toggle> <button class="btn btn-default btn-sm dropdown-toggle" uib-dropdown-toggle>
<span class="zmdi zmdi-more"></span> <span class="zmdi zmdi-more"></span>
</button> </button>
@@ -50,10 +53,6 @@
</div> </div>
</div> </div>
<div class="col-lg-12 p-5 m-b-10 bg-orange c-white" ng-if="$ctrl.dashboard.is_archived">
This dashboard is archived and won't appear in the dashboards list or search results.
</div>
<div class="m-b-5"> <div class="m-b-5">
<parameters parameters="$ctrl.globalParameters" on-change="$ctrl.onGlobalParametersChange()"></parameters> <parameters parameters="$ctrl.globalParameters" on-change="$ctrl.onGlobalParametersChange()"></parameters>
</div> </div>

View File

@@ -31,13 +31,15 @@ function DashboardCtrl($rootScope, $routeParams, $location, $timeout, $q, $uibMo
let globalParams = {}; let globalParams = {};
this.dashboard.widgets.forEach(row => this.dashboard.widgets.forEach(row =>
row.forEach((widget) => { row.forEach((widget) => {
widget.getQuery().getParametersDefs().filter(p => p.global).forEach((param) => { if (widget.getQuery()) {
const defaults = {}; widget.getQuery().getParametersDefs().filter(p => p.global).forEach((param) => {
defaults[param.name] = _.clone(param); const defaults = {};
defaults[param.name].locals = []; defaults[param.name] = _.create(Object.getPrototypeOf(param), param);
globalParams = _.defaults(globalParams, defaults); defaults[param.name].locals = [];
globalParams[param.name].locals.push(param); globalParams = _.defaults(globalParams, defaults);
}); globalParams[param.name].locals.push(param);
});
}
}) })
); );
this.globalParameters = _.values(globalParams); this.globalParameters = _.values(globalParams);
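The switch above from `_.clone(param)` to `_.create(Object.getPrototypeOf(param), param)` matters because a plain shallow copy only copies own properties and drops the prototype, losing the parameter's instance methods. A sketch of the difference using built-ins (underscore's `_.create` behaves like `Object.create` plus an own-property assign; the `Parameter` class here is hypothetical):

```javascript
class Parameter {
  constructor(name, value) {
    this.name = name;
    this.value = value;
  }
  getValue() {
    return this.value;
  }
}

const param = new Parameter('date_range', 7);

// Shallow copy: own properties only, prototype (and getValue) is lost.
const plainCopy = Object.assign({}, param);

// Prototype-preserving copy: same own properties, same methods.
const protoCopy = Object.assign(Object.create(Object.getPrototypeOf(param)), param);
```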
@@ -82,13 +84,14 @@ function DashboardCtrl($rootScope, $routeParams, $location, $timeout, $q, $uibMo
return; return;
} }
if (hasQueryStringValue) {
queryFilter.current = $location.search()[queryFilter.name];
}
if (!_.has(filters, queryFilter.name)) { if (!_.has(filters, queryFilter.name)) {
const filter = _.extend({}, queryFilter); const filter = _.extend({}, queryFilter);
filters[filter.name] = filter; filters[filter.name] = filter;
filters[filter.name].originFilters = []; filters[filter.name].originFilters = [];
if (hasQueryStringValue) {
filter.current = $location.search()[filter.name];
}
} }
// TODO: merge values. // TODO: merge values.
@@ -181,7 +184,7 @@ function DashboardCtrl($rootScope, $routeParams, $location, $timeout, $q, $uibMo
}; };
this.togglePublished = () => { this.togglePublished = () => {
Events.record(currentUser, 'toggle_published', 'dashboard', this.dashboard.id); Events.record('toggle_published', 'dashboard', this.dashboard.id);
this.dashboard.is_draft = !this.dashboard.is_draft; this.dashboard.is_draft = !this.dashboard.is_draft;
this.saveInProgress = true; this.saveInProgress = true;
Dashboard.save({ Dashboard.save({
@@ -218,6 +221,8 @@ const ShareDashboardComponent = {
dismiss: '&', dismiss: '&',
}, },
controller($http) { controller($http) {
'ngInject';
this.dashboard = this.resolve.dashboard; this.dashboard = this.resolve.dashboard;
this.toggleSharing = () => { this.toggleSharing = () => {

View File

@@ -7,6 +7,8 @@ const PublicDashboardPage = {
dashboard: '<', dashboard: '<',
}, },
controller($routeParams, Widget) { controller($routeParams, Widget) {
'ngInject';
// embed in params == headless // embed in params == headless
this.logoUrl = logoUrl; this.logoUrl = logoUrl;
this.headless = $routeParams.embed; this.headless = $routeParams.embed;
@@ -26,6 +28,8 @@ export default function (ngModule) {
ngModule.component('publicDashboardPage', PublicDashboardPage); ngModule.component('publicDashboardPage', PublicDashboardPage);
function loadPublicDashboard($http, $route) { function loadPublicDashboard($http, $route) {
'ngInject';
const token = $route.current.params.token; const token = $route.current.params.token;
return $http.get(`/api/dashboards/public/${token}`).then(response => return $http.get(`/api/dashboards/public/${token}`).then(response =>
response.data response.data

View File

@@ -17,8 +17,8 @@
<a data-toggle="dropdown" uib-dropdown-toggle><i class="zmdi zmdi-more"></i></a> <a data-toggle="dropdown" uib-dropdown-toggle><i class="zmdi zmdi-more"></i></a>
<ul class="dropdown-menu pull-right" uib-dropdown-menu style="z-index:1000000"> <ul class="dropdown-menu pull-right" uib-dropdown-menu style="z-index:1000000">
<li><a ng-disabled="!$ctrl.queryResult.getData()" query-result-link target="_self">Download as CSV File</a></li> <li ng-class="{'disabled': $ctrl.queryResult.isEmpty()}"><a ng-href="{{$ctrl.queryResult.getLink($ctrl.query.id, 'csv')}}" download="{{$ctrl.queryResult.getName($ctrl.query.name, 'csv')}}" target="_self">Download as CSV File</a></li>
<li><a ng-disabled="!$ctrl.queryResult.getData()" file-type="xlsx" query-result-link target="_self">Download as Excel File</a></li> <li ng-class="{'disabled': $ctrl.queryResult.isEmpty()}"><a ng-href="{{$ctrl.queryResult.getLink($ctrl.query.id, 'xlsx')}}" download="{{$ctrl.queryResult.getName($ctrl.query.name, 'xlsx')}}" target="_self">Download as Excel File</a></li>
<li><a ng-href="queries/{{$ctrl.query.id}}#{{$ctrl.widget.visualization.id}}" ng-show="$ctrl.canViewQuery">View Query</a></li> <li><a ng-href="queries/{{$ctrl.query.id}}#{{$ctrl.widget.visualization.id}}" ng-show="$ctrl.canViewQuery">View Query</a></li>
<li><a ng-show="$ctrl.dashboard.canEdit()" ng-click="$ctrl.deleteWidget()">Remove From Dashboard</a></li> <li><a ng-show="$ctrl.dashboard.canEdit()" ng-click="$ctrl.deleteWidget()">Remove From Dashboard</a></li>
</ul> </ul>

View File

@@ -9,6 +9,8 @@ const EditTextBoxComponent = {
dismiss: '&', dismiss: '&',
}, },
controller(toastr) { controller(toastr) {
'ngInject';
this.saveInProgress = false; this.saveInProgress = false;
this.widget = this.resolve.widget; this.widget = this.resolve.widget;
this.saveWidget = () => { this.saveWidget = () => {

View File

@@ -15,7 +15,7 @@
<p class="f-500 m-b-20 c-black">Recent Dashboards</p> <p class="f-500 m-b-20 c-black">Recent Dashboards</p>
<div class="list-group"> <div class="list-group">
<a ng-href="dashboard/{{dashboard.slug}}" class="list-group-item" ng-repeat="dashboard in $ctrl.recentDashboards"> <a ng-href="dashboard/{{dashboard.slug}}" class="list-group-item" ng-repeat="dashboard in $ctrl.recentDashboards">
{{dashboard.name}} <span class="label label-warning" ng-if="dashboard.is_draft">Unpublished</span> {{dashboard.name}} <span class="label label-default" ng-if="dashboard.is_draft">Unpublished</span>
</a> </a>
</div> </div>
</div> </div>
@@ -24,7 +24,7 @@
<p class="f-500 m-b-20 c-black">Recent Queries</p> <p class="f-500 m-b-20 c-black">Recent Queries</p>
<div class="list-group"> <div class="list-group">
<a ng-href="queries/{{query.id}}" class="list-group-item" <a ng-href="queries/{{query.id}}" class="list-group-item"
ng-repeat="query in $ctrl.recentQueries">{{query.name}} <span class="label label-warning" ng-if="query.is_draft">Unpublished</span></a> ng-repeat="query in $ctrl.recentQueries">{{query.name}} <span class="label label-default" ng-if="query.is_draft">Unpublished</span></a>
</div> </div>
</div> </div>
</div> </div>

View File

@@ -16,7 +16,7 @@
</thead> </thead>
<tbody> <tbody>
<tr ng-repeat="query in $ctrl.paginator.getPageRows()"> <tr ng-repeat="query in $ctrl.paginator.getPageRows()">
<td><a href="queries/{{query.id}}">{{query.name}}</a> <span class="label label-warning" ng-if="query.is_draft">Unpublished</span></td> <td><a href="queries/{{query.id}}">{{query.name}}</a> <span class="label label-default" ng-if="query.is_draft">Unpublished</span></td>
<td>{{query.user.name}}</td> <td>{{query.user.name}}</td>
<td>{{query.created_at | dateTime}}</td> <td>{{query.created_at | dateTime}}</td>
<td>{{query.runtime | durationHumanize}}</td> <td>{{query.runtime | durationHumanize}}</td>

View File

@@ -1,5 +1,5 @@
<div class="modal-header"> <div class="modal-header">
<button type="button" class="close" aria-label="Close" ng-click="close()"><span aria-hidden="true">&times;</span></button> <button type="button" class="close" aria-label="Close" ng-click="$ctrl.close()"><span aria-hidden="true">&times;</span></button>
<h4 class="modal-title">Embed Code</h4> <h4 class="modal-title">Embed Code</h4>
</div> </div>
<div class="modal-body"> <div class="modal-body">

View File

@@ -21,7 +21,7 @@
</thead> </thead>
<tbody> <tbody>
<tr ng-repeat="query in $ctrl.paginator.getPageRows()"> <tr ng-repeat="query in $ctrl.paginator.getPageRows()">
<td><a href="queries/{{query.id}}">{{query.name}}</a> <span class="label label-warning" ng-if="query.is_draft">Unpublished</span></td> <td><a href="queries/{{query.id}}">{{query.name}}</a> <span class="label label-default" ng-if="query.is_draft">Unpublished</span></td>
<td>{{query.user.name}}</td> <td>{{query.user.name}}</td>
<td>{{query.created_at | dateTime}}</td> <td>{{query.created_at | dateTime}}</td>
<td>{{query.schedule | scheduleHumanize}}</td> <td>{{query.schedule | scheduleHumanize}}</td>

View File

@@ -3,7 +3,7 @@ import 'brace/mode/python';
import 'brace/mode/sql'; import 'brace/mode/sql';
import 'brace/mode/json'; import 'brace/mode/json';
import 'brace/ext/language_tools'; import 'brace/ext/language_tools';
import { each, map } from 'underscore'; import { map } from 'underscore';
// By default Ace will try to load snippet files for the different modes and fail. // By default Ace will try to load snippet files for the different modes and fail.
// We don't need them, so we use these placeholders until we define our own. // We don't need them, so we use these placeholders until we define our own.
@@ -25,7 +25,6 @@ function queryEditor(QuerySnippet) {
query: '=', query: '=',
schema: '=', schema: '=',
syntax: '=', syntax: '=',
shortcuts: '=',
}, },
template: '<div ui-ace="editorOptions" ng-model="query.query"></div>', template: '<div ui-ace="editorOptions" ng-model="query.query"></div>',
link: { link: {
@@ -47,11 +46,6 @@ function queryEditor(QuerySnippet) {
editor.commands.bindKey('Cmd+L', null); editor.commands.bindKey('Cmd+L', null);
editor.commands.bindKey('Ctrl+L', null); editor.commands.bindKey('Ctrl+L', null);
each($scope.shortcuts, (fn, key) => {
key = key.replace('meta', 'Cmd').replace('ctrl', 'Ctrl');
editor.commands.bindKey(key, () => fn());
});
QuerySnippet.query((snippets) => { QuerySnippet.query((snippets) => {
window.ace.acequire(['ace/snippets'], (snippetsModule) => { window.ace.acequire(['ace/snippets'], (snippetsModule) => {
const snippetManager = snippetsModule.snippetManager; const snippetManager = snippetsModule.snippetManager;

View File

@@ -21,7 +21,8 @@
<div class="col-sm-9"> <div class="col-sm-9">
<h3> <h3>
<edit-in-place editable="canEdit" done="saveName" ignore-blanks="true" value="query.name"></edit-in-place> <edit-in-place editable="canEdit" done="saveName" ignore-blanks="true" value="query.name"></edit-in-place>
<span class="label label-warning" ng-if="query.is_draft">Unpublished</span> <span class="label label-default" ng-if="query.is_draft && !query.is_archived">Unpublished</span>
<span class="label label-warning" ng-if="query.is_archived" uib-popover="This query is archived and can't be used in dashboards, and won't appear in search results." popover-placement="right" popover-trigger="'mouseenter'">Archived</span>
</h3> </h3>
<p> <p>
<em> <em>
@@ -75,15 +76,16 @@
</span> </span>
</h3> </h3>
</div> </div>
<div class="col-lg-12 p-5 bg-orange c-white" ng-if="query.is_archived">
This query is archived and can't be used in dashboards, and won't appear in search results.
</div>
</div> </div>
<!-- editor --> <!-- editor -->
<div class="container"> <div class="container">
<div class="row bg-white p-b-5" ng-if="sourceMode" resizable r-directions="['bottom']" r-height="300" style="min-height:100px;"> <div class="row bg-white p-b-5" ng-if="sourceMode" resizable r-directions="['bottom']" r-height="300" style="min-height:100px;">
<schema-browser schema="schema" class="col-md-3 hidden-sm hidden-xs schema-container" ng-show="hasSchema"></schema-browser> <schema-browser class="col-md-3 hidden-sm hidden-xs schema-container"
schema="schema"
on-refresh="refreshSchema()"
ng-show="hasSchema">
</schema-browser>
<div ng-class="editorSize" style="height:100%;"> <div ng-class="editorSize" style="height:100%;">
<div class="p-5"> <div class="p-5">
@@ -128,8 +130,7 @@
<p style="height:calc(100% - 40px);"> <p style="height:calc(100% - 40px);">
<query-editor query="query" <query-editor query="query"
schema="schema" schema="schema"
syntax="dataSource.syntax" syntax="dataSource.syntax"></query-editor>
shortcuts="shortcuts"></query-editor>
</p> </p>
</div> </div>
</div> </div>

View File

@@ -1,13 +1,21 @@
<div class="schema-container"> <div class="schema-container">
<div class="p-t-5 p-b-5"> <div class="schema-control">
<input type="text" placeholder="Search schema..." class="form-control" ng-model="schemaFilter"> <input type="text" placeholder="Search schema..." class="form-control" ng-model="$ctrl.schemaFilter">
<button class="btn btn-default"
title="Refresh Schema"
ng-click="$ctrl.onRefresh()">
<span class="zmdi zmdi-refresh"></span>
</button>
</div> </div>
<div class="schema-browser" vs-repeat vs-size="getSize(table)"> <div class="schema-browser" vs-repeat vs-size="$ctrl.getSize(table)">
<div ng-repeat="table in schema | filter:schemaFilter track by table.name"> <div ng-repeat="table in $ctrl.schema | filter:$ctrl.schemaFilter track by table.name">
<div class="table-name" ng-click="showTable(table)"> <div class="table-name" ng-click="$ctrl.showTable(table)">
<i class="fa fa-table"></i> <strong><span title="{{table.name}}">{{table.name}}</span> <i class="fa fa-table"></i>
<span ng-if="table.size !== undefined"> ({{table.size}})</span></strong> <strong>
<span title="{{table.name}}">{{table.name}}</span>
<span ng-if="table.size !== undefined"> ({{table.size}})</span>
</strong>
</div> </div>
<div uib-collapse="table.collapsed"> <div uib-collapse="table.collapsed">
<div ng-repeat="column in table.columns track by column" style="padding-left:16px;">{{column}}</div> <div ng-repeat="column in table.columns track by column" style="padding-left:16px;">{{column}}</div>

View File

@@ -1,31 +1,33 @@
import template from './schema-browser.html'; import template from './schema-browser.html';
function schemaBrowser() { function SchemaBrowserCtrl($scope) {
return { 'ngInject';
restrict: 'E',
scope: {
schema: '=',
},
template,
link($scope) {
$scope.showTable = (table) => {
table.collapsed = !table.collapsed;
$scope.$broadcast('vsRepeatTrigger');
};
$scope.getSize = (table) => { this.showTable = (table) => {
let size = 18; table.collapsed = !table.collapsed;
$scope.$broadcast('vsRepeatTrigger');
};
if (!table.collapsed) { this.getSize = (table) => {
size += 18 * table.columns.length; let size = 18;
}
return size; if (!table.collapsed) {
}; size += 18 * table.columns.length;
}, }
return size;
}; };
} }
const SchemaBrowser = {
bindings: {
schema: '<',
onRefresh: '&',
},
controller: SchemaBrowserCtrl,
template,
};
export default function (ngModule) { export default function (ngModule) {
ngModule.directive('schemaBrowser', schemaBrowser); ngModule.component('schemaBrowser', SchemaBrowser);
} }

View File

@@ -29,24 +29,19 @@ function QuerySourceCtrl(Events, toastr, $controller, $scope, $location, $http,
}, },
}); });
$scope.shortcuts = { const shortcuts = {
'meta+s': function save() { 'mod+s': function save() {
if ($scope.canEdit) { if ($scope.canEdit) {
$scope.saveQuery(); $scope.saveQuery();
} }
}, },
'ctrl+s': function save() {
if ($scope.canEdit) {
$scope.saveQuery();
}
},
// Cmd+Enter for Mac
'meta+enter': $scope.executeQuery,
// Ctrl+Enter for PC
'ctrl+enter': $scope.executeQuery,
}; };
KeyboardShortcuts.bind($scope.shortcuts); KeyboardShortcuts.bind(shortcuts);
$scope.$on('$destroy', () => {
KeyboardShortcuts.unbind(shortcuts);
});
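The refactor above collapses the separate `meta+s`/`ctrl+s` bindings into a single `mod+s` (Mousetrap-style notation, where `mod` resolves to Cmd on macOS and Ctrl elsewhere) and pairs every `bind` with an `unbind` on `$destroy`, so handlers don't leak across route changes. The bind/unbind pairing can be sketched with a hypothetical registry (not the actual KeyboardShortcuts service):

```javascript
const handlers = new Map();

function bind(shortcuts) {
  Object.keys(shortcuts).forEach(key => handlers.set(key, shortcuts[key]));
}

function unbind(shortcuts) {
  Object.keys(shortcuts).forEach(key => handlers.delete(key));
}

// Stand-in for a keypress reaching the shortcut layer.
function trigger(key) {
  const fn = handlers.get(key);
  if (fn) fn();
}

let saved = 0;
const shortcuts = { 'mod+s': () => { saved += 1; } };

bind(shortcuts);
trigger('mod+s');  // handler fires
unbind(shortcuts); // mirrors the $scope.$on('$destroy', ...) cleanup
trigger('mod+s');  // no-op after unbind
```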
// @override // @override
$scope.saveQuery = (options, data) => { $scope.saveQuery = (options, data) => {
@@ -106,10 +101,6 @@ function QuerySourceCtrl(Events, toastr, $controller, $scope, $location, $http,
$scope.$watch('query.query', (newQueryText) => { $scope.$watch('query.query', (newQueryText) => {
$scope.isDirty = (newQueryText !== queryText); $scope.isDirty = (newQueryText !== queryText);
}); });
$scope.$on('$destroy', () => {
KeyboardShortcuts.unbind($scope.shortcuts);
});
} }
export default function (ngModule) { export default function (ngModule) {

View File

@@ -1,9 +1,9 @@
import { pick, any, some, find } from 'underscore'; import { pick, any, some, find } from 'underscore';
import template from './query.html'; import template from './query.html';
function QueryViewCtrl($scope, Events, $route, $routeParams, $http, $location, $window, $q, function QueryViewCtrl($scope, Events, $route, $routeParams, $location, $window, $q,
Title, AlertDialog, Notifications, clientConfig, toastr, $uibModal, currentUser, KeyboardShortcuts, Title, AlertDialog, Notifications, clientConfig, toastr, $uibModal,
Query, DataSource) { currentUser, Query, DataSource) {
const DEFAULT_TAB = 'table'; const DEFAULT_TAB = 'table';
function getQueryResult(maxAge) { function getQueryResult(maxAge) {
@@ -43,26 +43,36 @@ function QueryViewCtrl($scope, Events, $route, $routeParams, $http, $location, $
return dataSourceId; return dataSourceId;
} }
function updateSchema() { function toggleSchemaBrowser(hasSchema) {
$scope.hasSchema = false; $scope.hasSchema = hasSchema;
$scope.editorSize = 'col-md-12'; $scope.editorSize = hasSchema ? 'col-md-9' : 'col-md-12';
DataSource.getSchema({ id: $scope.query.data_source_id }, (data) => { }
if (data && data.length > 0) {
function getSchema(refresh = undefined) {
DataSource.getSchema({ id: $scope.query.data_source_id, refresh }, (data) => {
const hasPrevSchema = refresh ? ($scope.schema && ($scope.schema.length > 0)) : false;
const hasSchema = data && (data.length > 0);
if (hasSchema) {
$scope.schema = data; $scope.schema = data;
data.forEach((table) => { data.forEach((table) => {
table.collapsed = true; table.collapsed = true;
}); });
} else if (hasPrevSchema) {
$scope.editorSize = 'col-md-9'; toastr.error('Schema refresh failed. Please try again later.');
$scope.hasSchema = true;
} else {
$scope.schema = undefined;
$scope.hasSchema = false;
$scope.editorSize = 'col-md-12';
} }
toggleSchemaBrowser(hasSchema || hasPrevSchema);
}); });
} }
function updateSchema() {
toggleSchemaBrowser(false);
getSchema();
}
$scope.refreshSchema = () => getSchema(true);
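The `getSchema(refresh)` flow above has one subtlety: when a forced refresh comes back empty, the previously loaded schema is kept and an error is shown, rather than blanking the schema browser. A sketch of that fallback as a standalone function (hypothetical; the real code goes through `DataSource.getSchema` and `toastr`):

```javascript
function applySchema(state, data, refresh) {
  // Only a refresh can fall back to an earlier schema; an initial load cannot.
  const hasPrevSchema = refresh ? Boolean(state.schema && state.schema.length > 0) : false;
  const hasSchema = Boolean(data && data.length > 0);

  if (hasSchema) {
    state.schema = data; // fresh schema replaces the old one
  } else if (hasPrevSchema) {
    // Keep the previous schema and surface the failure.
    state.error = 'Schema refresh failed. Please try again later.';
  }

  state.hasSchema = hasSchema || hasPrevSchema; // drives the browser/editor layout
  return state;
}

const state = { schema: [{ name: 'users', columns: ['id'] }] };
applySchema(state, [], true); // failed refresh: schema kept, error recorded
```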
  function updateDataSources(dataSources) {
    // Filter out data sources the user can't query (or used by current query):
    $scope.dataSources = dataSources.filter(dataSource =>
@@ -85,11 +95,38 @@ function QueryViewCtrl($scope, Events, $route, $routeParams, $http, $location, $
    updateSchema();
  }

+  $scope.executeQuery = () => {
+    if (!$scope.canExecuteQuery()) {
+      return;
+    }
+
+    if (!$scope.query.query) {
+      return;
+    }
+
+    getQueryResult(0);
+    $scope.lockButton(true);
+    $scope.cancelling = false;
+    Events.record('execute', 'query', $scope.query.id);
+
+    Notifications.getPermissions();
+  };
+
  $scope.currentUser = currentUser;
  $scope.dataSource = {};
  $scope.query = $route.current.locals.query;
  $scope.showPermissionsControl = clientConfig.showPermissionsControl;

+  const shortcuts = {
+    'mod+enter': $scope.executeQuery,
+  };
+
+  KeyboardShortcuts.bind(shortcuts);
+
+  $scope.$on('$destroy', () => {
+    KeyboardShortcuts.unbind(shortcuts);
+  });
+
  Events.record('view', 'query', $scope.query.id);

  if ($scope.query.hasResult() || $scope.query.paramsRequired()) {
@@ -157,7 +194,7 @@ function QueryViewCtrl($scope, Events, $route, $routeParams, $http, $location, $
  };

  $scope.togglePublished = () => {
-    Events.record(currentUser, 'toggle_published', 'query', $scope.query.id);
+    Events.record('toggle_published', 'query', $scope.query.id);
    $scope.query.is_draft = !$scope.query.is_draft;
    $scope.saveQuery(undefined, { is_draft: $scope.query.is_draft });
  };
@@ -172,23 +209,6 @@ function QueryViewCtrl($scope, Events, $route, $routeParams, $http, $location, $
    $scope.saveQuery(undefined, { name: $scope.query.name });
  };

-  $scope.executeQuery = () => {
-    if (!$scope.canExecuteQuery()) {
-      return;
-    }
-
-    if (!$scope.query.query) {
-      return;
-    }
-
-    getQueryResult(0);
-    $scope.lockButton(true);
-    $scope.cancelling = false;
-    Events.record('execute', 'query', $scope.query.id);
-
-    Notifications.getPermissions();
-  };
-
  $scope.cancelExecution = () => {
    $scope.cancelling = true;
    $scope.queryResult.cancelExecution();

View File

@@ -2,7 +2,7 @@
<div class="t-heading p-10">
  <h3 class="th-title">
    <p>
-      <img src="{{$ctrl.logoUrl}}" style="height: 24px;"/>
+      <img ng-src="{{$ctrl.logoUrl}}" style="height: 24px;"/>
      {{$ctrl.query.name}}
      <small><visualization-name visualization="$ctrl.visualization"/></small>
    </p>

View File

@@ -8,6 +8,8 @@ const VisualizationEmbed = {
    data: '<',
  },
  controller($routeParams, Query, QueryResult) {
+    'ngInject';
+
    document.querySelector('body').classList.add('headless');
    const visualizationId = parseInt($routeParams.visualizationId, 10);
    this.showQueryDescription = $routeParams.showDescription;
@@ -24,6 +26,8 @@ export default function (ngModule) {
  ngModule.component('visualizationEmbed', VisualizationEmbed);

  function session($http, $route, Auth) {
+    'ngInject';
+
    const apiKey = $route.current.params.api_key;
    Auth.setApiKey(apiKey);
    return Auth.loadConfig();

View File

@@ -27,7 +27,10 @@ function Dashboard($resource, $http, currentUser, Widget) {
    },
  });

-  resource.prototype.canEdit = () => currentUser.canEdit(this) || this.can_edit;
+  resource.prototype.canEdit = function canEdit() {
+    return currentUser.canEdit(this) || this.can_edit;
+  };

  return resource;
}

View File

@@ -3,7 +3,7 @@ function DataSource($resource) {
    get: { method: 'GET', cache: false, isArray: false },
    query: { method: 'GET', cache: false, isArray: true },
    test: { method: 'POST', cache: false, isArray: false, url: 'api/data_sources/:id/test' },
-    getSchema: { method: 'GET', cache: true, isArray: true, url: 'api/data_sources/:id/schema' },
+    getSchema: { method: 'GET', cache: false, isArray: true, url: 'api/data_sources/:id/schema' },
  };

  const DataSourceResource = $resource('api/data_sources/:id', { id: '@id' }, actions);

View File

@@ -1,10 +1,12 @@
import { each } from 'underscore';
import Mousetrap from 'mousetrap';
+import 'mousetrap/plugins/global-bind/mousetrap-global-bind';

function KeyboardShortcuts() {
  this.bind = function bind(keymap) {
    each(keymap, (fn, key) => {
-      Mousetrap.bind(key, (e) => {
+      Mousetrap.bindGlobal(key, (e) => {
        e.preventDefault();
        fn();
      });

View File

@@ -216,15 +216,20 @@ function QueryResultService($resource, $timeout, $q) {
    return this.filteredData;
  }

+  isEmpty() {
+    return this.getData() === null || this.getData().length === 0;
+  }
+
  getChartData(mapping) {
    const series = {};

    this.getData().forEach((row) => {
-      const point = {};
+      let point = {};
      let seriesName;
      let xValue = 0;
      const yValues = {};
      let eValue = null;
+      let sizeValue = null;

      each(row, (v, definition) => {
        const name = definition.split('::')[0] || definition.split('__')[0];
@@ -258,6 +263,11 @@ function QueryResultService($resource, $timeout, $q) {
          seriesName = String(value);
        }

+        if (type === 'size') {
+          point[type] = value;
+          sizeValue = value;
+        }
+
        if (type === 'multiFilter' || type === 'multi-filter') {
          seriesName = String(value);
        }
@@ -265,11 +275,15 @@ function QueryResultService($resource, $timeout, $q) {
      if (seriesName === undefined) {
        each(yValues, (yValue, ySeriesName) => {
+          point = { x: xValue, y: yValue };
          if (eValue !== null) {
-            addPointToSeries({ x: xValue, y: yValue, yError: eValue }, series, ySeriesName);
-          } else {
-            addPointToSeries({ x: xValue, y: yValue }, series, ySeriesName);
+            point.yError = eValue;
          }
+
+          if (sizeValue !== null) {
+            point.size = sizeValue;
+          }
+
+          addPointToSeries(point, series, ySeriesName);
        });
      } else {
        addPointToSeries(point, series, seriesName);
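The patch above builds one point object, attaches the optional error and size values to it, and makes a single `addPointToSeries` call, instead of branching per combination. The same assembly in Python (a sketch; `build_point` is a hypothetical name):

```python
def build_point(x, y, e_value=None, size_value=None):
    """Assemble a chart point, attaching optional yError/size fields (sketch)."""
    point = {"x": x, "y": y}
    if e_value is not None:
        point["yError"] = e_value   # error-bar value, when an errors column is mapped
    if size_value is not None:
        point["size"] = size_value  # bubble size, when a size column is mapped
    return point
```

Adding optional fields to one dict scales better than one branch per field combination, which is what motivated the rewrite.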
@@ -339,7 +353,11 @@ function QueryResultService($resource, $timeout, $q) {
      filters.forEach((filter) => {
        filter.values.push(row[filter.name]);
        if (filter.values.length === 1) {
-          filter.current = row[filter.name];
+          if (filter.multiple) {
+            filter.current = [row[filter.name]];
+          } else {
+            filter.current = row[filter.name];
+          }
        }
      });
    });
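The change above seeds a filter's default selection from the first row, wrapping it in an array for multi-select filters. The rule is small but easy to get wrong in either direction; as a Python sketch (hypothetical helper name):

```python
def initial_filter_value(first_value, multiple):
    """Seed a filter's current selection: multi-select filters expect a list (sketch)."""
    return [first_value] if multiple else first_value
```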

View File

@@ -43,6 +43,43 @@ class QueryResultError {
}

+class Parameter {
+  constructor(parameter) {
+    this.title = parameter.title;
+    this.name = parameter.name;
+    this.type = parameter.type;
+    this.value = parameter.value;
+    this.global = parameter.global;
+  }
+
+  get ngModel() {
+    if (this.type === 'date' || this.type === 'datetime-local' || this.type === 'datetime-with-seconds') {
+      this.$$value = this.$$value || moment(this.value).toDate();
+      return this.$$value;
+    } else if (this.type === 'number') {
+      this.$$value = this.$$value || parseInt(this.value, 10);
+      return this.$$value;
+    }
+
+    return this.value;
+  }
+
+  set ngModel(value) {
+    if (value && this.type === 'date') {
+      this.value = moment(value).format('YYYY-MM-DD');
+      this.$$value = moment(this.value).toDate();
+    } else if (value && this.type === 'datetime-local') {
+      this.value = moment(value).format('YYYY-MM-DD HH:mm');
+      this.$$value = moment(this.value).toDate();
+    } else if (value && this.type === 'datetime-with-seconds') {
+      this.value = moment(value).format('YYYY-MM-DD HH:mm:ss');
+      this.$$value = moment(this.value).toDate();
+    } else {
+      this.value = this.$$value = value;
+    }
+  }
+}
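The new `Parameter` class above keeps two representations in sync: the widget-facing `ngModel` (a `Date` or number) and the serialized `value` string formatted per parameter type. The setter's formatting step translates to Python roughly like this (a sketch with `datetime.strftime` standing in for moment.js; `serialize_parameter` is a hypothetical name):

```python
from datetime import datetime

# strftime patterns mirroring the moment.js format strings per parameter type.
FORMATS = {
    "date": "%Y-%m-%d",
    "datetime-local": "%Y-%m-%d %H:%M",
    "datetime-with-seconds": "%Y-%m-%d %H:%M:%S",
}

def serialize_parameter(value, type_):
    """Widget value -> stored string, as the ngModel setter does (sketch)."""
    if value and type_ in FORMATS:
        return value.strftime(FORMATS[type_])
    return value  # non-date types are stored as-is
```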
class Parameters {
  constructor(query, queryString) {
    this.query = query;
@@ -84,7 +121,8 @@ class Parameters {
    });

    const parameterExists = p => contains(parameterNames, p.name);
-    this.query.options.parameters = this.query.options.parameters.filter(parameterExists);
+    this.query.options.parameters =
+      this.query.options.parameters.filter(parameterExists).map(p => new Parameter(p));
  }

  initFromQueryString(queryString) {

View File

@@ -72,6 +72,18 @@
    </ui-select>
  </div>

+  <div class="form-group" ng-if="showSizeColumnPicker()">
+    <label class="control-label">Bubble size column</label>
+    <ui-select name="sizeColumn" ng-model="form.sizeColumn">
+      <ui-select-match allow-clear="true" placeholder="Choose column...">{{$select.selected}}</ui-select-match>
+      <ui-select-choices repeat="column in columnNames | remove:form.yAxisColumns | remove:form.groupby">
+        <span ng-bind-html="column | highlight: $select.search"></span><span> </span>
+        <small class="text-muted" ng-bind="columns[column].type"></small>
+      </ui-select-choices>
+    </ui-select>
+  </div>
+
  <div class="form-group" ng-if="options.globalSeriesType != 'custom'">
    <label class="control-label">Errors column</label>

View File

@@ -1,4 +1,4 @@
-import { extend, has, partial, intersection, without, contains, isUndefined, sortBy, each, pluck, keys, difference } from 'underscore';
+import { some, extend, has, partial, intersection, without, contains, isUndefined, sortBy, each, pluck, keys, difference } from 'underscore';
import plotly from './plotly';
import template from './chart.html';
import editorTemplate from './chart-editor.html';
@@ -68,6 +68,7 @@ function ChartEditor(ColorPalette, clientConfig) {
    area: { name: 'Area', icon: 'area-chart' },
    pie: { name: 'Pie', icon: 'pie-chart' },
    scatter: { name: 'Scatter', icon: 'circle-o' },
+    bubble: { name: 'Bubble', icon: 'circle-o' },
  };

  if (clientConfig.allowCustomJSVisualizations) {
@@ -83,6 +84,8 @@ function ChartEditor(ColorPalette, clientConfig) {
    });
  };

+  scope.showSizeColumnPicker = () => some(scope.options.seriesOptions, options => options.type === 'bubble');
+
  scope.options.customCode = `// Available variables are x, ys, element, and Plotly
// Type console.log(x, ys); for more info about x and ys
// To plot your graph call Plotly.plot(element, ...)
@@ -191,6 +194,15 @@ function ChartEditor(ColorPalette, clientConfig) {
    }
  });

+  scope.$watch('form.sizeColumn', (value, old) => {
+    if (old !== undefined) {
+      unsetColumn(old);
+    }
+
+    if (value !== undefined) {
+      setColumnRole('size', value);
+    }
+  });
+
  scope.$watch('form.groupby', (value, old) => {
    if (old !== undefined) {
@@ -222,6 +234,8 @@ function ChartEditor(ColorPalette, clientConfig) {
      scope.form.groupby = key;
    } else if (value === 'yError') {
      scope.form.errorColumn = key;
+    } else if (value === 'size') {
+      scope.form.sizeColumn = key;
    }
  });
}

View File

@@ -8,6 +8,9 @@ import histogram from 'plotly.js/lib/histogram';
import moment from 'moment';

Plotly.register([bar, pie, histogram]);
+Plotly.setPlotConfig({
+  modeBarButtonsToRemove: ['sendDataToCloud'],
+});

// The following colors will be used if you pick "Automatic" color.
const BaseColors = {
@@ -137,7 +140,7 @@ function percentBarStacking(seriesList) {
      sum += seriesList[j].y[i];
    }

    for (let j = 0; j < seriesList.length; j += 1) {
-      const value = seriesList[j].y[i] / (sum * 100);
+      const value = seriesList[j].y[i] / sum * 100;
      seriesList[j].text.push(`Value: ${seriesList[j].y[i]}<br>Relative: ${value.toFixed(2)}%`);
      seriesList[j].y[i] = value;
    }
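The fix above is pure operator precedence: `y / (sum * 100)` divides by one hundred times the total, while the intended share is `y / sum * 100`. A minimal check in Python:

```python
def relative_percents(values):
    """Each value's share of the total, in percent: v / total * 100, not v / (total * 100)."""
    total = sum(values)
    return [v / total * 100 for v in values]
```

With the buggy grouping, `[1, 3]` would yield `0.0025` and `0.0075` instead of `25` and `75`.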
@@ -208,6 +211,8 @@ const PlotlyChart = () => {
      } else if (type === 'scatter') {
        series.type = 'scatter';
        series.mode = 'markers';
+      } else if (type === 'bubble') {
+        series.mode = 'markers';
      }
    }
@@ -330,6 +335,12 @@ const PlotlyChart = () => {
        if (!plotlySeries.error_y.length) {
          delete plotlySeries.error_y.length;
        }

+        if (seriesOptions.type === 'bubble') {
+          plotlySeries.marker = {
+            size: pluck(data, 'size'),
+          };
+        }
+
        scope.data.push(plotlySeries);
      });
@@ -400,7 +411,11 @@ const PlotlyChart = () => {
      scope.$watch('series', recalculateOptions);
      scope.$watch('options', recalculateOptions, true);

-      scope.layout = { margin: { l: 50, r: 50, b: bottomMargin, t: 20, pad: 4 }, height: calculateHeight(), autosize: true, hovermode: 'closest' };
+      scope.layout = {
+        margin: { l: 50, r: 50, b: bottomMargin, t: 20, pad: 4 },
+        height: calculateHeight(),
+        autosize: true,
+      };
      scope.plotlyOptions = { showLink: false, displaylogo: false };
      scope.data = [];

View File

@@ -2,13 +2,13 @@
<div class="form-group">
  <label class="col-lg-6">Counter Value Column Name</label>
  <div class="col-lg-6">
-    <select ng-options="name for name in queryResult.getColumnNames()" ng-model="visualization.options.counterColName" class="form-control"></select>
+    <select ng-options="name for name in queryResult.getColumnNames()" ng-model="visualization.options.counterColName" class="form-control" ng-disabled="visualization.options.countRow"></select>
  </div>
</div>
<div class="form-group">
  <label class="col-lg-6">Counter Value Row Number</label>
  <div class="col-lg-6">
-    <input type="number" ng-model="visualization.options.rowNumber" min="1" class="form-control">
+    <input type="number" ng-model="visualization.options.rowNumber" min="1" class="form-control" ng-disabled="visualization.options.countRow">
  </div>
</div>
<div class="form-group">
@@ -25,4 +25,10 @@
    <input type="number" ng-model="visualization.options.targetRowNumber" min="1" class="form-control">
  </div>
</div>
+<div class="form-group">
+  <div class="col-lg-6">
+    <input type="checkbox" ng-model="visualization.options.countRow">
+    <i class="input-helper"></i> Count Rows
+  </div>
+</div>
</div>

View File

@@ -14,10 +14,11 @@ function CounterRenderer() {
    const counterColName = $scope.visualization.options.counterColName;
    const targetColName = $scope.visualization.options.targetColName;

-    if (counterColName) {
+    if ($scope.visualization.options.countRow) {
+      $scope.counterValue = queryData.length;
+    } else if (counterColName) {
      $scope.counterValue = queryData[rowNumber][counterColName];
    }

    if (targetColName) {
      $scope.targetValue = queryData[targetRowNumber][targetColName];

View File

@@ -204,7 +204,7 @@ function mapRenderer() {
      }
    }

-    $scope.$watch('queryResult && queryResult.getData()', render, true);
+    $scope.$watch('queryResult && queryResult.getData()', render);
    $scope.$watch('visualization.options', render, true);
    angular.element(window).on('resize', resize);
    $scope.$watch('visualization.options.height', resize);
@@ -218,7 +218,9 @@ function mapEditor() {
    template: editorTemplate,
    link($scope) {
      $scope.currentTab = 'general';
-      $scope.classify_columns = $scope.queryResult.columnNames.concat('none');
+      $scope.columns = $scope.queryResult.getColumns();
+      $scope.columnNames = _.pluck($scope.columns, 'name');
+      $scope.classify_columns = $scope.columnNames.concat('none');
      $scope.mapTiles = [
        {
          name: 'OpenStreetMap',

View File

@@ -8,20 +8,35 @@
<div ng-show="currentTab == 'general'">
  <div class="form-group">
    <label class="control-label">Latitude Column Name</label>
-    <select ng-options="name for name in queryResult.columnNames" ng-model="visualization.options.latColName"
-            class="form-control"></select>
+    <ui-select name="form-control" required ng-model="visualization.options.latColName">
+      <ui-select-match placeholder="Choose column...">{{$select.selected}}</ui-select-match>
+      <ui-select-choices repeat="column in columnNames | remove:visualization.options.classify | remove:visualization.options.lonColName">
+        <span ng-bind-html="column | highlight: $select.search"></span><span> </span>
+        <small class="text-muted" ng-bind="columns[column].type"></small>
+      </ui-select-choices>
+    </ui-select>
  </div>
  <div class="form-group">
    <label class="control-label">Longitude Column Name</label>
-    <select ng-options="name for name in queryResult.columnNames" ng-model="visualization.options.lonColName"
-            class="form-control"></select>
+    <ui-select name="form-control" required ng-model="visualization.options.lonColName">
+      <ui-select-match placeholder="Choose column...">{{$select.selected}}</ui-select-match>
+      <ui-select-choices repeat="column in columnNames | remove:visualization.options.classify | remove:visualization.options.latColName">
+        <span ng-bind-html="column | highlight: $select.search"></span><span> </span>
+        <small class="text-muted" ng-bind="columns[column].type"></small>
+      </ui-select-choices>
+    </ui-select>
  </div>
  <div class="form-group">
    <label class="control-label">Group By</label>
-    <select ng-options="name for name in classify_columns" ng-model="visualization.options.classify"
-            class="form-control"></select>
+    <ui-select name="form-control" required ng-model="visualization.options.classify">
+      <ui-select-match placeholder="Choose column...">{{$select.selected}}</ui-select-match>
+      <ui-select-choices repeat="column in classify_columns | remove:visualization.options.lonColName | remove:visualization.options.latColName">
+        <span ng-bind-html="column | highlight: $select.search"></span><span> </span>
+        <small class="text-muted" ng-bind="columns[column].type"></small>
+      </ui-select-choices>
+    </ui-select>
  </div>
</div>

View File

@@ -1,3 +1,4 @@
+import angular from 'angular';
import $ from 'jquery';
import 'pivottable';
import 'pivottable/dist/pivot.css';
@@ -20,7 +21,7 @@ function pivotTableRenderer() {
      if ($scope.queryResult.getData() !== null) {
        // We need to give the pivot table its own copy of the data, because it changes
        // it which interferes with other visualizations.
-        data = $.extend(true, [], $scope.queryResult.getRawData());
+        data = angular.copy($scope.queryResult.getData());
        const options = {
          renderers: $.pivotUtilities.renderers,
          onRefresh(config) {
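The change above swaps `$.extend(true, [], ...)` for `angular.copy` so the pivot table mutates its own deep copy of the rows, never the shared query result. The same isolation in Python uses `copy.deepcopy` (a sketch; `pivot_input` is a hypothetical name):

```python
import copy

def pivot_input(rows):
    """Hand a consumer its own deep copy so in-place edits don't leak back (sketch)."""
    return copy.deepcopy(rows)

rows = [{"country": "IL", "count": 1}]
own = pivot_input(rows)
own[0]["count"] = 99          # pivot-style in-place mutation
assert rows[0]["count"] == 1  # the shared source data is untouched
```

A shallow copy (`list(rows)`) would not be enough here: the row dicts themselves would still be shared.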

View File

@@ -20,7 +20,8 @@ function graph(data) {
  const links = {};
  const nodes = [];
-  const keys = _.sortBy(_.without(_.keys(data[0]), 'value'), _.identity);
+  const validKey = key => key !== 'value' && key.indexOf('$$') !== 0;
+  const keys = _.sortBy(_.filter(_.keys(data[0]), validKey), _.identity);

  function normalizeName(name) {
    if (name) {

View File

@@ -311,7 +311,8 @@ export default function Sunburst(scope, element) {
      };
    });
  } else {
-    const keys = _.sortBy(_.without(_.keys(raw[0]), 'value'), _.identity);
+    const validKey = key => key !== 'value' && key.indexOf('$$') !== 0;
+    const keys = _.sortBy(_.filter(_.keys(raw[0]), validKey), _.identity);
    values = _.map(raw, (row, sequence) =>
      ({

View File

@@ -2,7 +2,7 @@
  <div class="form-group">
    <label class="col-lg-6">Word Cloud Column Name</label>
    <div class="col-lg-6">
-      <select ng-options="name for name in queryResult.columnNames" ng-model="visualization.options.column" class="form-control"></select>
+      <select ng-options="name for name in queryResult.getColumnNames()" ng-model="visualization.options.column" class="form-control"></select>
    </div>
  </div>
</div>

View File

@@ -7,7 +7,7 @@
version: '2'
services:
  server:
-    build: .
+    image: redash/redash:latest
    command: server
    depends_on:
      - postgres
@@ -21,7 +21,7 @@ services:
      REDASH_DATABASE_URL: "postgresql://postgres@postgres/postgres"
      REDASH_COOKIE_SECRET: veryverysecret
  worker:
-    build: .
+    image: redash/redash:latest
    command: scheduler
    environment:
      PYTHONUNBUFFERED: 0
@@ -31,9 +31,9 @@ services:
      QUEUES: "queries,scheduled_queries,celery"
      WORKERS_COUNT: 2
  redis:
-    image: redis:2.8
+    image: redis:3.0-alpine
  postgres:
-    image: postgres:9.3
+    image: postgres:9.5.6-alpine
    # volumes:
    #   - /opt/postgres-data:/var/lib/postgresql/data
  nginx:
@@ -42,3 +42,5 @@ services:
      - "80:80"
    depends_on:
      - server
+    links:
+      - server:redash

View File

@@ -32,9 +32,9 @@ services:
      QUEUES: "queries,scheduled_queries,celery"
      WORKERS_COUNT: 2
  redis:
-    image: redis:2.8
+    image: redis:3.0-alpine
  postgres:
-    image: postgres:9.3
+    image: postgres:9.5.6-alpine
    # The following turns the DB into less durable, but gains significant performance improvements for the tests run (x3
    # improvement on my personal machine). We should consider moving this into a dedicated Docker Compose configuration for
    # tests.

View File

@@ -0,0 +1,25 @@
+"""add Query.schedule_failures
+
+Revision ID: d1eae8b9893e
+Revises: 65fc9ede4746
+Create Date: 2017-02-03 01:45:02.954923
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = 'd1eae8b9893e'
+down_revision = '65fc9ede4746'
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+    op.add_column('queries', sa.Column('schedule_failures', sa.Integer(),
+                                       nullable=False, server_default='0'))
+
+
+def downgrade():
+    op.drop_column('queries', 'schedule_failures')

npm-shrinkwrap.json generated
View File

@@ -1960,12 +1960,12 @@
    "mapbox-gl-shaders": {
      "version": "1.0.0",
      "from": "mapbox/mapbox-gl-shaders#de2ab007455aa2587c552694c68583f94c9f2747",
-      "resolved": "git://github.com/mapbox/mapbox-gl-shaders.git#de2ab007455aa2587c552694c68583f94c9f2747"
+      "resolved": "https://github.com/mapbox/mapbox-gl-shaders.git#de2ab007455aa2587c552694c68583f94c9f2747"
    },
    "mapbox-gl-style-spec": {
      "version": "8.8.0",
      "from": "mapbox/mapbox-gl-style-spec#83b1a3e5837d785af582efd5ed1a212f2df6a4ae",
-      "resolved": "git://github.com/mapbox/mapbox-gl-style-spec.git#83b1a3e5837d785af582efd5ed1a212f2df6a4ae"
+      "resolved": "https://github.com/mapbox/mapbox-gl-style-spec.git#83b1a3e5837d785af582efd5ed1a212f2df6a4ae"
    },
    "mapbox-gl-supported": {
      "version": "1.2.0",

View File

@@ -1,6 +1,6 @@
{
  "name": "redash-client",
-  "version": "1.0.0",
+  "version": "1.0.2",
  "description": "The frontend part of Redash.",
  "main": "index.js",
  "scripts": {
@@ -33,6 +33,7 @@
    "angular-ui-bootstrap": "^2.2.0",
    "angular-vs-repeat": "^1.1.7",
    "brace": "^0.9.0",
+    "core-js": "https://registry.npmjs.org/core-js/-/core-js-2.4.1.tgz",
    "cornelius": "git+https://github.com/restorando/cornelius.git",
    "d3": "^3.5.17",
    "d3-cloud": "^1.2.1",
@@ -59,6 +60,7 @@
  "devDependencies": {
    "babel-core": "^6.18.0",
    "babel-loader": "^6.2.7",
+    "babel-plugin-transform-object-assign": "^6.22.0",
    "babel-preset-es2015": "^6.18.0",
    "babel-preset-stage-2": "^6.18.0",
    "css-loader": "^0.25.0",

View File

@@ -16,7 +16,7 @@ from redash.query_runner import import_query_runners
from redash.destinations import import_destinations

-__version__ = '1.0.0'
+__version__ = '1.0.2'

def setup_logging():

View File

@@ -85,10 +85,10 @@ def org_login(org_slug):
@blueprint.route('/oauth/google', endpoint="authorize")
def login():
    callback = url_for('.callback', _external=True)
-    next = request.args.get('next', url_for("redash.index", org_slug=session.get('org_slug')))
+    next_path = request.args.get('next', url_for("redash.index", org_slug=session.get('org_slug')))
    logger.debug("Callback url: %s", callback)
-    logger.debug("Next is: %s", next)
-    return google_remote_app().authorize(callback=callback, state=next)
+    logger.debug("Next is: %s", next_path)
+    return google_remote_app().authorize(callback=callback, state=next_path)
@blueprint.route('/oauth/google_callback', endpoint="callback") @blueprint.route('/oauth/google_callback', endpoint="callback")
@@ -118,6 +118,6 @@ def authorized():
    create_and_login_user(org, profile['name'], profile['email'])

-    next = request.args.get('state') or url_for("redash.index", org_slug=org.slug)
-    return redirect(next)
+    next_path = request.args.get('state') or url_for("redash.index", org_slug=org.slug)
+    return redirect(next_path)
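The rename above matters because `next` is a Python builtin: assigning to it inside the view shadows the function for the rest of that scope. A minimal demonstration:

```python
def with_shadowing():
    next = "/dashboards"          # rebinds the builtin name locally
    try:
        return next(iter([1, 2]))  # 'next' is now a str, not callable
    except TypeError:
        return "broken"

def with_rename():
    next_path = "/dashboards"     # distinct name keeps the builtin usable
    return next(iter([1, 2])), next_path
```

The shadowing version fails at the call site; after the rename both the redirect target and the builtin coexist.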

View File

@@ -70,7 +70,7 @@ def run_query_sync(data_source, parameter_values, query_text, max_age=0):
@routes.route(org_scoped_rule('/embed/query/<query_id>/visualization/<visualization_id>'), methods=['GET'])
@login_required
def embed(query_id, visualization_id, org_slug=None):
-    record_event(current_org, current_user, {
+    record_event(current_org, current_user._get_current_object(), {
        'action': 'view',
        'object_id': visualization_id,
        'object_type': 'visualization',

View File

@@ -1,13 +1,10 @@
-import csv
import json
-import cStringIO
import time

import pystache
from flask import make_response, request
from flask_login import current_user
from flask_restful import abort
-import xlsxwriter

from redash import models, settings, utils
from redash.tasks import QueryTask, record_event
from redash.permissions import require_permission, not_view_only, has_access, require_access, view_only
@@ -189,39 +186,13 @@ class QueryResultResource(BaseResource):
    @staticmethod
    def make_csv_response(query_result):
-        s = cStringIO.StringIO()
-
-        query_data = json.loads(query_result.data)
-        writer = csv.DictWriter(s, fieldnames=[col['name'] for col in query_data['columns']])
-        writer.writer = utils.UnicodeWriter(s)
-        writer.writeheader()
-        for row in query_data['rows']:
-            writer.writerow(row)
-
        headers = {'Content-Type': "text/csv; charset=UTF-8"}
-        return make_response(s.getvalue(), 200, headers)
+        return make_response(query_result.make_csv_content(), 200, headers)

    @staticmethod
    def make_excel_response(query_result):
-        s = cStringIO.StringIO()
-
-        query_data = json.loads(query_result.data)
-        book = xlsxwriter.Workbook(s)
-        sheet = book.add_worksheet("result")
-
-        column_names = []
-        for (c, col) in enumerate(query_data['columns']):
-            sheet.write(0, c, col['name'])
-            column_names.append(col['name'])
-
-        for (r, row) in enumerate(query_data['rows']):
-            for (c, name) in enumerate(column_names):
-                sheet.write(r + 1, c, row.get(name))
-
-        book.close()
-
        headers = {'Content-Type': "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"}
-        return make_response(s.getvalue(), 200, headers)
+        return make_response(query_result.make_excel_content(), 200, headers)
class JobResource(BaseResource): class JobResource(BaseResource):
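The hunk above moves serialization off the HTTP resource and onto the result object itself, so other callers (e.g. scheduled exports) can reuse it. A minimal Python 3 sketch of the same pattern — the original is Python 2 and uses `cStringIO` plus a `UnicodeWriter` shim; `QueryResultStub` is an illustrative stand-in, not Redash's model:

```python
import csv
import io
import json

class QueryResultStub:
    """Illustrative stand-in for the QueryResult model."""
    def __init__(self, data):
        self.data = data  # JSON string, as stored in the query_results table

    def make_csv_content(self):
        # Serialize the stored result set to CSV, columns first.
        query_data = json.loads(self.data)
        s = io.StringIO()
        writer = csv.DictWriter(s, fieldnames=[c['name'] for c in query_data['columns']])
        writer.writeheader()
        for row in query_data['rows']:
            writer.writerow(row)
        return s.getvalue()

result = QueryResultStub(json.dumps({
    'columns': [{'name': 'id'}, {'name': 'name'}],
    'rows': [{'id': 1, 'name': 'a'}, {'id': 2, 'name': 'b'}],
}))
print(result.make_csv_content())
```

The resource method then reduces to a one-liner that wraps this string in a `make_response` with the right `Content-Type`.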

View File

@@ -4,6 +4,9 @@ import hashlib
 import itertools
 import json
 import logging
+import cStringIO
+import csv
+import xlsxwriter
 from funcy import project
 from flask_sqlalchemy import SQLAlchemy
@@ -13,7 +16,7 @@ from sqlalchemy.event import listens_for
 from sqlalchemy.inspection import inspect
 from sqlalchemy.types import TypeDecorator
 from sqlalchemy.ext.mutable import Mutable
-from sqlalchemy.orm import object_session, backref
+from sqlalchemy.orm import object_session, backref, joinedload, subqueryload
 # noinspection PyUnresolvedReferences
 from sqlalchemy.orm.exc import NoResultFound
 from sqlalchemy import or_
@@ -28,7 +31,9 @@ from redash.utils import generate_token, json_dumps
 from redash.utils.configuration import ConfigurationContainer
 from redash.metrics import database

-db = SQLAlchemy()
+db = SQLAlchemy(session_options={
+    'expire_on_commit': False
+})

 Column = functools.partial(db.Column, nullable=False)

 # AccessPermission and Change use a 'generic foreign key' approach to refer to
@@ -424,6 +429,9 @@ class DataSource(BelongsToOrgMixin, db.Model):
     __tablename__ = 'data_sources'
     __table_args__ = (db.Index('data_sources_org_id_name', 'org_id', 'name'),)

+    def __eq__(self, other):
+        return self.id == other.id
+
     def to_dict(self, all=False, with_permissions_for=None):
         d = {
             'id': self.id,
@@ -641,8 +649,40 @@ class QueryResult(db.Model, BelongsToOrgMixin):
     def groups(self):
         return self.data_source.groups

+    def make_csv_content(self):
+        s = cStringIO.StringIO()
+        query_data = json.loads(self.data)
+        writer = csv.DictWriter(s, fieldnames=[col['name'] for col in query_data['columns']])
+        writer.writer = utils.UnicodeWriter(s)
+        writer.writeheader()
+        for row in query_data['rows']:
+            writer.writerow(row)
+        return s.getvalue()
+
+    def make_excel_content(self):
+        s = cStringIO.StringIO()
+        query_data = json.loads(self.data)
+        book = xlsxwriter.Workbook(s)
+        sheet = book.add_worksheet("result")
+        column_names = []
+        for (c, col) in enumerate(query_data['columns']):
+            sheet.write(0, c, col['name'])
+            column_names.append(col['name'])
+        for (r, row) in enumerate(query_data['rows']):
+            for (c, name) in enumerate(column_names):
+                sheet.write(r + 1, c, row.get(name))
+        book.close()
+        return s.getvalue()
+

-def should_schedule_next(previous_iteration, now, schedule):
+def should_schedule_next(previous_iteration, now, schedule, failures):
     if schedule.isdigit():
         ttl = int(schedule)
         next_iteration = previous_iteration + datetime.timedelta(seconds=ttl)
@@ -659,7 +699,8 @@ def should_schedule_next(previous_iteration, now, schedule):
         previous_iteration = normalized_previous_iteration - datetime.timedelta(days=1)
     next_iteration = (previous_iteration + datetime.timedelta(days=1)).replace(hour=hour, minute=minute)
+    if failures:
+        next_iteration += datetime.timedelta(minutes=2**failures)
     return now > next_iteration
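The new `failures` parameter implements exponential backoff: each consecutive failure pushes the next scheduled run out by `2**failures` minutes, so a persistently broken query stops hammering its data source. A standalone sketch of the arithmetic (`next_run` is an illustrative name, not the Redash function):

```python
import datetime

def next_run(previous_iteration, ttl_seconds, failures):
    # Base schedule: re-run ttl_seconds after the previous iteration...
    nxt = previous_iteration + datetime.timedelta(seconds=ttl_seconds)
    # ...plus an exponentially growing penalty once the query is failing.
    if failures:
        nxt += datetime.timedelta(minutes=2 ** failures)
    return nxt

base = datetime.datetime(2017, 4, 18, 12, 0, 0)
print(next_run(base, 60, 0))  # 2017-04-18 12:01:00 — no penalty
print(next_run(base, 60, 3))  # 2017-04-18 12:09:00 — 60s TTL + 2**3 minutes
```

Because `gen_query_hash` (further down in this diff) resets `schedule_failures` to 0 whenever the query text changes, editing a failing query clears the penalty.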
@@ -685,6 +726,7 @@ class Query(ChangeTrackingMixin, TimestampMixin, BelongsToOrgMixin, db.Model):
     is_archived = Column(db.Boolean, default=False, index=True)
     is_draft = Column(db.Boolean, default=True, index=True)
     schedule = Column(db.String(10), nullable=True)
+    schedule_failures = Column(db.Integer, default=0)
     visualizations = db.relationship("Visualization", cascade="all, delete-orphan")
     options = Column(MutableDict.as_mutable(PseudoJSON), default={})
@@ -764,12 +806,12 @@ class Query(ChangeTrackingMixin, TimestampMixin, BelongsToOrgMixin, db.Model):
     @classmethod
     def all_queries(cls, group_ids, user_id=None, drafts=False):
-        q = (cls.query.join(User, Query.user_id == User.id)
-             .outerjoin(QueryResult)
+        q = (cls.query
+             .options(joinedload(Query.user),
+                      joinedload(Query.latest_query_data).load_only('runtime', 'retrieved_at'))
             .join(DataSourceGroup, Query.data_source_id == DataSourceGroup.data_source_id)
             .filter(Query.is_archived == False)
             .filter(DataSourceGroup.group_id.in_(group_ids))\
-             .group_by(Query.id, User.id, QueryResult.id, QueryResult.retrieved_at, QueryResult.runtime)
             .order_by(Query.created_at.desc()))

         if not drafts:
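The hunk above replaces an explicit join plus a wide `group_by` with `joinedload` options: the related user and latest result are fetched eagerly in the same round-trip, avoiding one SELECT per row (the classic N+1 problem). A minimal, self-contained illustration with SQLite — assumes SQLAlchemy 1.4+ is installed; the `User`/`Query` models here are simplified stand-ins:

```python
from sqlalchemy import create_engine, Column, Integer, String, ForeignKey
from sqlalchemy.orm import declarative_base, relationship, sessionmaker, joinedload

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String)

class Query(Base):
    __tablename__ = "queries"
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey("users.id"))
    user = relationship(User)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
u = User(name="arik")
session.add_all([u, Query(user=u), Query(user=u)])
session.commit()

# joinedload emits a single SELECT with a LEFT OUTER JOIN; accessing
# q.user afterwards triggers no further statements.
queries = session.query(Query).options(joinedload(Query.user)).all()
print([q.user.name for q in queries])
```

`load_only('runtime', 'retrieved_at')` in the real hunk additionally trims the eager load to the two result columns the listing page needs, which matters because `QueryResult.data` can be large.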
@@ -784,15 +826,20 @@ class Query(ChangeTrackingMixin, TimestampMixin, BelongsToOrgMixin, db.Model):
     @classmethod
     def outdated_queries(cls):
         queries = (db.session.query(Query)
-                   .join(QueryResult)
-                   .join(DataSource)
-                   .filter(Query.schedule != None))
+                   .options(joinedload(Query.latest_query_data).load_only('retrieved_at'))
+                   .filter(Query.schedule != None)
+                   .order_by(Query.id))

         now = utils.utcnow()
         outdated_queries = {}
         for query in queries:
-            if should_schedule_next(query.latest_query_data.retrieved_at, now, query.schedule):
-                key = "{}:{}".format(query.query_hash, query.data_source.id)
+            if query.latest_query_data:
+                retrieved_at = query.latest_query_data.retrieved_at
+            else:
+                retrieved_at = now
+
+            if should_schedule_next(retrieved_at, now, query.schedule, query.schedule_failures):
+                key = "{}:{}".format(query.query_hash, query.data_source_id)
                 outdated_queries[key] = query

         return outdated_queries.values()
@@ -818,12 +865,11 @@ class Query(ChangeTrackingMixin, TimestampMixin, BelongsToOrgMixin, db.Model):
                          Query.data_source_id == DataSourceGroup.data_source_id)
                    .filter(where)).distinct()

-        return Query.query.join(User, Query.user_id == User.id).filter(
-            Query.id.in_(query_ids))
+        return Query.query.options(joinedload(Query.user)).filter(Query.id.in_(query_ids))

     @classmethod
     def recent(cls, group_ids, user_id=None, limit=20):
-        query = (cls.query.join(User, Query.user_id == User.id)
+        query = (cls.query.options(subqueryload(Query.user))
                 .filter(Event.created_at > (db.func.current_date() - 7))
                 .join(Event, Query.id == Event.object_id.cast(db.Integer))
                 .join(DataSourceGroup, Query.data_source_id == DataSourceGroup.data_source_id)
@@ -835,7 +881,7 @@ class Query(ChangeTrackingMixin, TimestampMixin, BelongsToOrgMixin, db.Model):
                     DataSourceGroup.group_id.in_(group_ids),
                     or_(Query.is_draft == False, Query.user_id == user_id),
                     Query.is_archived == False)
-                .group_by(Event.object_id, Query.id, User.id)
+                .group_by(Event.object_id, Query.id)
                 .order_by(db.desc(db.func.count(0))))

         if user_id:
if user_id: if user_id:
@@ -889,6 +935,7 @@ class Query(ChangeTrackingMixin, TimestampMixin, BelongsToOrgMixin, db.Model):
 @listens_for(Query.query_text, 'set')
 def gen_query_hash(target, val, oldval, initiator):
     target.query_hash = utils.gen_query_hash(val)
+    target.schedule_failures = 0

 @listens_for(Query.user_id, 'set')
@@ -1024,12 +1071,11 @@ class Alert(TimestampMixin, db.Model):
     @classmethod
     def all(cls, group_ids):
-        # TODO: there was a join with user here to prevent N+1 queries. need to revisit this.
         return db.session.query(Alert)\
+            .options(joinedload(Alert.user), joinedload(Alert.query_rel))\
             .join(Query)\
             .join(DataSourceGroup, DataSourceGroup.data_source_id==Query.data_source_id)\
-            .filter(DataSourceGroup.group_id.in_(group_ids))\
-            .group_by(Alert)
+            .filter(DataSourceGroup.group_id.in_(group_ids))

     @classmethod
     def get_by_id_and_org(cls, id, org):
@@ -1316,7 +1362,7 @@ class Event(db.Model):
     action = Column(db.String(255))
     object_type = Column(db.String(255))
     object_id = Column(db.String(255), nullable=True)
-    additional_properties = Column(db.Text, nullable=True)
+    additional_properties = Column(MutableDict.as_mutable(PseudoJSON), nullable=True, default={})
     created_at = Column(db.DateTime(True), default=db.func.now())

     __tablename__ = 'events'
@@ -1324,6 +1370,17 @@ class Event(db.Model):
     def __unicode__(self):
         return u"%s,%s,%s,%s" % (self.user_id, self.action, self.object_type, self.object_id)

+    def to_dict(self):
+        return {
+            'org_id': self.org_id,
+            'user_id': self.user_id,
+            'action': self.action,
+            'object_type': self.object_type,
+            'object_id': self.object_id,
+            'additional_properties': self.additional_properties,
+            'created_at': self.created_at.isoformat()
+        }
+
     @classmethod
     def record(cls, event):
         org_id = event.pop('org_id')
@@ -1333,11 +1390,10 @@ class Event(db.Model):
         object_id = event.pop('object_id', None)
         created_at = datetime.datetime.utcfromtimestamp(event.pop('timestamp'))
-        additional_properties = json.dumps(event)

         event = cls(org_id=org_id, user_id=user_id, action=action,
                     object_type=object_type, object_id=object_id,
-                    additional_properties=additional_properties,
+                    additional_properties=event,
                     created_at=created_at)

         db.session.add(event)
         return event

View File

@@ -147,7 +147,7 @@ def register(query_runner_class):
         logger.debug("Registering %s (%s) query runner.", query_runner_class.name(), query_runner_class.type())
         query_runners[query_runner_class.type()] = query_runner_class
     else:
-        logger.warning("%s query runner enabled but not supported, not registering. Either disable or install missing dependencies.", query_runner_class.name())
+        logger.debug("%s query runner enabled but not supported, not registering. Either disable or install missing dependencies.", query_runner_class.name())

 def get_query_runner(query_runner_type, configuration):

View File

@@ -0,0 +1,201 @@
from io import StringIO
import json
import logging
import sys
import uuid
import csv

from redash.query_runner import *
from redash.utils import JSONEncoder

logger = logging.getLogger(__name__)

try:
    import atsd_client
    from atsd_client.exceptions import SQLException
    from atsd_client.services import SQLService, MetricsService
    enabled = True
except ImportError:
    enabled = False

types_map = {
    'long': TYPE_INTEGER,
    'bigint': TYPE_INTEGER,
    'integer': TYPE_INTEGER,
    'smallint': TYPE_INTEGER,
    'float': TYPE_FLOAT,
    'double': TYPE_FLOAT,
    'decimal': TYPE_FLOAT,
    'string': TYPE_STRING,
    'date': TYPE_DATE,
    'xsd:dateTimeStamp': TYPE_DATETIME
}


def resolve_redash_type(type_in_atsd):
    """
    Retrieve corresponding redash type
    :param type_in_atsd: `str`
    :return: redash type constant
    """
    if isinstance(type_in_atsd, dict):
        type_in_redash = types_map.get(type_in_atsd['base'])
    else:
        type_in_redash = types_map.get(type_in_atsd)
    return type_in_redash


def generate_rows_and_columns(csv_response):
    """
    Prepare rows and columns in redash format from ATSD csv response
    :param csv_response: `str`
    :return: prepared rows and columns
    """
    meta, data = csv_response.split('\n', 1)
    meta = meta[1:]
    meta_with_padding = meta + '=' * (4 - len(meta) % 4)
    meta_decoded = meta_with_padding.decode('base64')
    meta_json = json.loads(meta_decoded)
    meta_columns = meta_json['tableSchema']['columns']

    reader = csv.reader(data.splitlines())
    next(reader)

    columns = [{'friendly_name': i['titles'],
                'type': resolve_redash_type(i['datatype']),
                'name': i['name']}
               for i in meta_columns]
    column_names = [c['name'] for c in columns]
    rows = [dict(zip(column_names, row)) for row in reader]
    return columns, rows
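`generate_rows_and_columns` above decodes a base64 metadata header that ATSD embeds without padding. A Python 3 sketch of the same decode step — note that `'=' * (-len(meta) % 4)` pads only when needed, whereas the hunk's `4 - len(meta) % 4` form would append a full block of `=` to an already-aligned string (harmless in Python 2's `.decode('base64')`, which tolerates it):

```python
import base64
import json

def decode_meta(meta):
    # Re-pad to a multiple of 4 characters, then decode base64 -> JSON.
    padded = meta + '=' * (-len(meta) % 4)
    return json.loads(base64.b64decode(padded))

# Simulate an unpadded header as ATSD would send it.
meta = base64.b64encode(
    json.dumps({'tableSchema': {'columns': [{'name': 'value'}]}}).encode()
).decode().rstrip('=')
print(decode_meta(meta)['tableSchema']['columns'][0]['name'])
```

The decoded `tableSchema.columns` list then drives the column names and types for the Redash result set.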
class AxibaseTSD(BaseQueryRunner):
    noop_query = "SELECT 1"

    @classmethod
    def enabled(cls):
        return enabled

    @classmethod
    def name(cls):
        return "Axibase Time Series Database"

    @classmethod
    def configuration_schema(cls):
        return {
            'type': 'object',
            'properties': {
                'protocol': {
                    'type': 'string',
                    'title': 'Protocol',
                    'default': 'http'
                },
                'hostname': {
                    'type': 'string',
                    'title': 'Host',
                    'default': 'axibase_tsd_hostname'
                },
                'port': {
                    'type': 'number',
                    'title': 'Port',
                    'default': 8088
                },
                'username': {
                    'type': 'string'
                },
                'password': {
                    'type': 'string',
                    'title': 'Password'
                },
                'timeout': {
                    'type': 'number',
                    'default': 600,
                    'title': 'Connection Timeout'
                },
                'min_insert_date': {
                    'type': 'string',
                    'title': 'Metric Minimum Insert Date'
                },
                'expression': {
                    'type': 'string',
                    'title': 'Metric Filter'
                },
                'limit': {
                    'type': 'number',
                    'default': 5000,
                    'title': 'Metric Limit'
                },
                'trust_certificate': {
                    'type': 'boolean',
                    'title': 'Trust SSL Certificate'
                }
            },
            'required': ['username', 'password', 'hostname', 'protocol', 'port'],
            'secret': ['password']
        }

    def __init__(self, configuration):
        super(AxibaseTSD, self).__init__(configuration)
        self.url = '{0}://{1}:{2}'.format(self.configuration.get('protocol', 'http'),
                                          self.configuration.get('hostname', 'localhost'),
                                          self.configuration.get('port', 8088))

    def run_query(self, query, user):
        connection = atsd_client.connect_url(self.url,
                                             self.configuration.get('username'),
                                             self.configuration.get('password'),
                                             verify=self.configuration.get('trust_certificate', False),
                                             timeout=self.configuration.get('timeout', 600))
        sql = SQLService(connection)
        query_id = str(uuid.uuid4())

        try:
            logger.debug("SQL running query: %s", query)
            data = sql.query_with_params(query, {'outputFormat': 'csv', 'metadataFormat': 'EMBED',
                                                 'queryId': query_id})
            columns, rows = generate_rows_and_columns(data)
            data = {'columns': columns, 'rows': rows}
            json_data = json.dumps(data, cls=JSONEncoder)
            error = None
        except SQLException as e:
            json_data = None
            error = e.content
        except (KeyboardInterrupt, InterruptException):
            sql.cancel_query(query_id)
            error = "Query cancelled by user."
            json_data = None
        except Exception:
            raise sys.exc_info()[1], None, sys.exc_info()[2]

        return json_data, error

    def get_schema(self, get_stats=False):
        connection = atsd_client.connect_url(self.url,
                                             self.configuration.get('username'),
                                             self.configuration.get('password'),
                                             verify=self.configuration.get('trust_certificate', False),
                                             timeout=self.configuration.get('timeout', 600))
        metrics = MetricsService(connection)
        ml = metrics.list(expression=self.configuration.get('expression', None),
                          minInsertDate=self.configuration.get('min_insert_date', None),
                          limit=self.configuration.get('limit', 5000))
        metrics_list = [i.name.encode('utf-8') for i in ml]
        metrics_list.append('atsd_series')
        schema = {}
        default_columns = ['entity', 'datetime', 'time', 'metric', 'value', 'text',
                           'tags', 'entity.tags', 'metric.tags']
        for table_name in metrics_list:
            schema[table_name] = {'name': "'{}'".format(table_name),
                                  'columns': default_columns}
        values = schema.values()
        return values

register(AxibaseTSD)

View File

@@ -3,6 +3,7 @@ import logging
 from redash.query_runner import *
 from redash.utils import JSONEncoder
 import requests
+import re

 logger = logging.getLogger(__name__)
@@ -74,13 +75,16 @@ class ClickHouse(BaseSQLQueryRunner):
     @staticmethod
     def _define_column_type(column):
         c = column.lower()
-        if 'int' in c:
+        f = re.search(r'^nullable\((.*)\)$', c)
+        if f is not None:
+            c = f.group(1)
+
+        if c.startswith('int') or c.startswith('uint'):
             return TYPE_INTEGER
-        elif 'float' in c:
+        elif c.startswith('float'):
             return TYPE_FLOAT
-        elif 'datetime' == c:
+        elif c == 'datetime':
             return TYPE_DATETIME
-        elif 'date' == c:
+        elif c == 'date':
             return TYPE_DATE
         else:
             return TYPE_STRING
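The change above first unwraps ClickHouse's `Nullable(T)` wrapper with a regex, then matches the inner type name by prefix or equality (the old substring checks misfired on names like `FixedString`, which contains neither). A standalone sketch — Redash's `TYPE_*` constants are replaced with plain strings here so the function runs on its own:

```python
import re

def define_column_type(column):
    c = column.lower()
    # Unwrap Nullable(T) so 'nullable(uint64)' is typed like 'uint64'.
    m = re.search(r'^nullable\((.*)\)$', c)
    if m is not None:
        c = m.group(1)

    if c.startswith('int') or c.startswith('uint'):
        return 'integer'
    elif c.startswith('float'):
        return 'float'
    elif c == 'datetime':
        return 'datetime'
    elif c == 'date':
        return 'date'
    return 'string'

print(define_column_type('Nullable(UInt64)'))   # integer
print(define_column_type('FixedString(16)'))    # string
print(define_column_type('Date'))               # date
```

Prefix matching is deliberate: ClickHouse integer types come in many widths (`Int8` through `UInt64`), and all of them should map to the same Redash type.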

View File

@@ -2,7 +2,6 @@ import json
 import logging
 import sys
-
 from redash.query_runner import *
 from redash.utils import JSONEncoder
@@ -98,12 +97,17 @@ class DynamoDBSQL(BaseSQLQueryRunner):
         try:
             engine = self._connect()

-            res_dict = engine.execute(query if str(query).endswith(';') else str(query)+';')
+            result = engine.execute(query if str(query).endswith(';') else str(query)+';')

             columns = []
             rows = []

-            for item in res_dict:
+            # When running a count query it returns the value as a string, in which case
+            # we transform it into a dictionary to be the same as regular queries.
+            if isinstance(result, basestring):
+                result = [{"value": result}]
+
+            for item in result:
                 if not columns:
                     for k, v in item.iteritems():
                         columns.append({
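The DynamoDB fix above normalizes the two shapes `engine.execute` can return: count queries yield a bare string, regular queries yield an iterable of row dicts. Wrapping the scalar lets the row-building loop stay unchanged. A Python 3 sketch (`basestring` becomes `str`; `normalize` is an illustrative name):

```python
def normalize(result):
    # A count query returns its value as a plain string; wrap it so it
    # looks like a one-row result set with a single 'value' column.
    if isinstance(result, str):
        result = [{"value": result}]
    return result

print(normalize("42"))                    # [{'value': '42'}]
print(normalize([{"id": 1}, {"id": 2}]))  # list results pass through unchanged
```

This "coerce the odd case into the common shape" pattern keeps the downstream column/row extraction free of special cases.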

View File

@@ -82,11 +82,11 @@ class Impala(BaseSQLQueryRunner):
     def _get_tables(self, schema_dict):
         schemas_query = "show schemas;"
         tables_query = "show tables in %s;"
-        columns_query = "show column stats %s;"
+        columns_query = "show column stats %s.%s;"

-        for schema_name in map(lambda a: a['name'], self._run_query_internal(schemas_query)):
-            for table_name in map(lambda a: a['name'], self._run_query_internal(tables_query % schema_name)):
-                columns = map(lambda a: a['Column'], self._run_query_internal(columns_query % table_name))
+        for schema_name in map(lambda a: unicode(a['name']), self._run_query_internal(schemas_query)):
+            for table_name in map(lambda a: unicode(a['name']), self._run_query_internal(tables_query % schema_name)):
+                columns = map(lambda a: unicode(a['Column']), self._run_query_internal(columns_query % (schema_name, table_name)))

                 if schema_name != 'default':
                     table_name = '{}.{}'.format(schema_name, table_name)

View File

@@ -94,7 +94,7 @@ class SqlServer(BaseSQLQueryRunner):
     def _get_tables(self, schema):
         query = """
         SELECT table_schema, table_name, column_name
-        FROM information_schema.columns
+        FROM INFORMATION_SCHEMA.COLUMNS
         WHERE table_schema NOT IN ('guest','INFORMATION_SCHEMA','sys','db_owner','db_accessadmin'
         ,'db_securityadmin','db_ddladmin','db_backupoperator','db_datareader'
         ,'db_datawriter','db_denydatareader','db_denydatawriter'

View File

@@ -10,6 +10,7 @@ from collections import defaultdict

 try:
     from pyhive import presto
+    from pyhive.exc import DatabaseError
     enabled = True
 except ImportError:
@@ -112,9 +113,16 @@ class Presto(BaseQueryRunner):
             data = {'columns': columns, 'rows': rows}
             json_data = json.dumps(data, cls=JSONEncoder)
             error = None
+        except DatabaseError, db:
+            json_data = None
+            default_message = 'Unspecified DatabaseError: {0}'.format(db.message)
+            message = db.message.get('failureInfo', {'message', None}).get('message')
+            error = default_message if message is None else message
         except Exception, ex:
             json_data = None
             error = ex.message
+            if not isinstance(error, basestring):
+                error = unicode(error)

         return json_data, error
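The Presto hunk digs a human-readable message out of the nested error payload (`message.failureInfo.message`). One caveat worth noting: the default in the diff, `{'message', None}`, is a *set* literal, and sets have no `.get`, so the lookup would raise `AttributeError` when `failureInfo` is absent. A standalone sketch with a dict default, which is what the chained `.get` needs (function name and type guard are illustrative):

```python
def presto_error_message(message):
    # Fall back to a generic description if no nested message is found.
    default = 'Unspecified DatabaseError: {0}'.format(message)
    if not isinstance(message, dict):
        return default
    # Dict default (not a set) keeps the chained .get safe.
    inner = message.get('failureInfo', {'message': None}).get('message')
    return default if inner is None else inner

print(presto_error_message({'failureInfo': {'message': 'line 1:8: Column x not found'}}))
print(presto_error_message({'errorCode': 1}))
```

The trailing `unicode(error)` coercion in the hunk serves a similar purpose: whatever shape the exception carries, the caller always receives a string.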

View File

@@ -0,0 +1,181 @@
# -*- coding: utf-8 -*-
import re
import logging
from collections import OrderedDict
from redash.query_runner import BaseQueryRunner, register
from redash.query_runner import TYPE_STRING, TYPE_DATE, TYPE_DATETIME, TYPE_INTEGER, TYPE_FLOAT, TYPE_BOOLEAN
from redash.utils import json_dumps

logger = logging.getLogger(__name__)

try:
    from simple_salesforce import Salesforce as SimpleSalesforce
    from simple_salesforce.api import SalesforceError
    enabled = True
except ImportError as e:
    enabled = False

# See https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/field_types.htm
TYPES_MAP = dict(
    id=TYPE_STRING,
    string=TYPE_STRING,
    currency=TYPE_FLOAT,
    reference=TYPE_STRING,
    double=TYPE_FLOAT,
    picklist=TYPE_STRING,
    date=TYPE_DATE,
    url=TYPE_STRING,
    phone=TYPE_STRING,
    textarea=TYPE_STRING,
    int=TYPE_INTEGER,
    datetime=TYPE_DATETIME,
    boolean=TYPE_BOOLEAN,
    percent=TYPE_FLOAT,
    multipicklist=TYPE_STRING,
    masterrecord=TYPE_STRING,
    location=TYPE_STRING,
    JunctionIdList=TYPE_STRING,
    encryptedstring=TYPE_STRING,
    email=TYPE_STRING,
    DataCategoryGroupReference=TYPE_STRING,
    combobox=TYPE_STRING,
    calculated=TYPE_STRING,
    anyType=TYPE_STRING,
    address=TYPE_STRING
)


# Query Runner for Salesforce SOQL Queries
# For example queries, see:
# https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_soql_select_examples.htm
class Salesforce(BaseQueryRunner):
    @classmethod
    def enabled(cls):
        return enabled

    @classmethod
    def annotate_query(cls):
        return False

    @classmethod
    def configuration_schema(cls):
        return {
            "type": "object",
            "properties": {
                "username": {
                    "type": "string"
                },
                "password": {
                    "type": "string"
                },
                "token": {
                    "type": "string",
                    "title": "Security Token"
                },
                "sandbox": {
                    "type": "boolean"
                }
            },
            "required": ["username", "password", "token"],
            "secret": ["password", "token"]
        }

    def test_connection(self):
        response = self._get_sf().describe()
        if response is None:
            raise Exception("Failed describing objects.")

    def _get_sf(self):
        sf = SimpleSalesforce(username=self.configuration['username'],
                              password=self.configuration['password'],
                              security_token=self.configuration['token'],
                              sandbox=self.configuration['sandbox'],
                              client_id='Redash')
        return sf

    def _clean_value(self, value):
        if isinstance(value, OrderedDict) and 'records' in value:
            value = value['records']
            for row in value:
                row.pop('attributes', None)
        return value

    def _get_value(self, dct, dots):
        for key in dots.split('.'):
            dct = dct.get(key)
        return dct

    def _get_column_name(self, key, parents=[]):
        return '.'.join(parents + [key])

    def _build_columns(self, sf, child, parents=[]):
        child_type = child['attributes']['type']
        child_desc = sf.__getattr__(child_type).describe()
        child_type_map = dict((f['name'], f['type']) for f in child_desc['fields'])
        columns = []
        for key in child.keys():
            if key != 'attributes':
                if isinstance(child[key], OrderedDict) and 'attributes' in child[key]:
                    columns.extend(self._build_columns(sf, child[key], parents + [key]))
                else:
                    column_name = self._get_column_name(key, parents)
                    key_type = child_type_map.get(key, 'string')
                    column_type = TYPES_MAP.get(key_type, TYPE_STRING)
                    columns.append((column_name, column_type))
        return columns

    def _build_rows(self, columns, records):
        rows = []
        for record in records:
            record.pop('attributes', None)
            row = dict()
            for column in columns:
                key = column[0]
                value = self._get_value(record, key)
                row[key] = self._clean_value(value)
            rows.append(row)
        return rows

    def run_query(self, query, user):
        logger.debug("Salesforce is about to execute query: %s", query)
        query = re.sub(r"/\*(.|\n)*?\*/", "", query).strip()
        try:
            columns = []
            rows = []
            sf = self._get_sf()
            response = sf.query_all(query)
            records = response['records']
            if response['totalSize'] > 0 and len(records) == 0:
                columns = self.fetch_columns([('Count', TYPE_INTEGER)])
                rows = [{'Count': response['totalSize']}]
            elif len(records) > 0:
                cols = self._build_columns(sf, records[0])
                rows = self._build_rows(cols, records)
                columns = self.fetch_columns(cols)
            error = None
            data = {'columns': columns, 'rows': rows}
            json_data = json_dumps(data)
        except SalesforceError as err:
            error = err.message
            json_data = None
        return json_data, error

    def get_schema(self, get_stats=False):
        sf = self._get_sf()
        response = sf.describe()
        if response is None:
            raise Exception("Failed describing objects.")

        schema = {}
        for sobject in response['sobjects']:
            table_name = sobject['name']
            if sobject['queryable'] is True and table_name not in schema:
                desc = sf.__getattr__(sobject['name']).describe()
                fields = desc['fields']
                schema[table_name] = {'name': table_name, 'columns': [f['name'] for f in fields]}
        return schema.values()

register(Salesforce)
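The Salesforce runner flattens nested SOQL records (e.g. `Owner.Profile.Name`) into dotted column names, and `_get_value` resolves such a path by walking one key at a time. A minimal standalone sketch of that traversal, with a small `None` guard added so a missing intermediate object yields `None` instead of raising (the guard is my addition, not in the diff):

```python
def get_value(dct, dots):
    # Walk 'a.b.c' through nested dicts; stop early if a level is missing.
    for key in dots.split('.'):
        if dct is None:
            return None
        dct = dct.get(key)
    return dct

record = {'Name': 'Acme',
          'Owner': {'Name': 'Ada', 'Profile': {'Name': 'Admin'}}}
print(get_value(record, 'Owner.Profile.Name'))  # Admin
print(get_value(record, 'Owner.Missing.Name'))  # None
```

`_build_columns` produces the dotted names recursively (skipping each record's `attributes` metadata), so every leaf of the nested response becomes one flat column in the Redash table.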

View File

@@ -19,7 +19,7 @@ def public_widget(widget):
     }

     if widget.visualization and widget.visualization.id:
-        query_data = models.QueryResult.query.get(widget.visualization.query.latest_query_data_id).to_dict()
+        query_data = models.QueryResult.query.get(widget.visualization.query_rel.latest_query_data_id).to_dict()
         res['visualization'] = {
             'type': widget.visualization.type,
             'name': widget.visualization.name,
@@ -29,8 +29,8 @@ def public_widget(widget):
             'created_at': widget.visualization.created_at,
             'query': {
                 'query': ' ',  # workaround, as otherwise the query data won't be loaded.
-                'name': widget.visualization.query.name,
-                'description': widget.visualization.query.description,
+                'name': widget.visualization.query_rel.name,
+                'description': widget.visualization.query_rel.description,
                 'options': {},
                 'latest_query_data': query_data
             }

View File

@@ -185,7 +185,9 @@ default_query_runners = [
     'redash.query_runner.mssql',
     'redash.query_runner.jql',
     'redash.query_runner.google_analytics',
-    'redash.query_runner.snowflake'
+    'redash.query_runner.snowflake',
+    'redash.query_runner.axibase_tsd',
+    'redash.query_runner.salesforce'
 ]

 enabled_query_runners = array_from_string(os.environ.get("REDASH_ENABLED_QUERY_RUNNERS", ",".join(default_query_runners)))

View File

@@ -1 +0,0 @@
-../../../frontend/app/assets/images/favicon-16x16.png

Binary file not shown (favicon-16x16.png: 53 B symlink replaced by a 1.3 KiB image).

View File

@@ -1 +0,0 @@
-../../../frontend/app/assets/images/favicon-32x32.png

Binary file not shown (favicon-32x32.png: 53 B symlink replaced by a 2.0 KiB image).

View File

@@ -1 +0,0 @@
-../../../frontend/app/assets/images/favicon-96x96.png

Binary file not shown (favicon-96x96.png: 53 B symlink replaced by a 3.8 KiB image).

View File

@@ -4,7 +4,6 @@ import datetime
 from redash.worker import celery
 from redash import utils
 from redash import models, settings
-from .base import BaseTask

 logger = get_task_logger(__name__)
@@ -21,7 +20,7 @@ def notify_subscriptions(alert, new_state):
     host = base_url(alert.query_rel.org)
     for subscription in alert.subscriptions:
         try:
-            subscription.notify(alert, alert.query, subscription.user, new_state, current_app, host)
+            subscription.notify(alert, alert.query_rel, subscription.user, new_state, current_app, host)
         except Exception as e:
             logger.exception("Error with processing destination")
@@ -34,7 +33,7 @@ def should_notify(alert, new_state):
     return new_state != alert.state or (alert.state == models.Alert.TRIGGERED_STATE and passed_rearm_threshold)

-@celery.task(name="redash.tasks.check_alerts_for_query", base=BaseTask)
+@celery.task(name="redash.tasks.check_alerts_for_query")
 def check_alerts_for_query(query_id):
     logger.debug("Checking query %d for alerts", query_id)

View File

@@ -1,18 +0,0 @@
-from celery import Task
-
-from redash import create_app
-from flask import has_app_context, current_app
-
-
-class BaseTask(Task):
-    abstract = True
-
-    def after_return(self, *args, **kwargs):
-        if hasattr(self, 'app_ctx'):
-            self.app_ctx.pop()
-
-    def __call__(self, *args, **kwargs):
-        if not has_app_context():
-            flask_app = current_app or create_app()
-            self.app_ctx = flask_app.app_context()
-            self.app_ctx.push()
-        return super(BaseTask, self).__call__(*args, **kwargs)

View File

@@ -4,27 +4,30 @@ from flask_mail import Message
 from redash.worker import celery
 from redash.version_check import run_version_check
 from redash import models, mail, settings
-from .base import BaseTask

 logger = get_task_logger(__name__)

-@celery.task(name="redash.tasks.record_event", base=BaseTask)
-def record_event(event):
-    original_event = event.copy()
-    models.Event.record(event)
+@celery.task(name="redash.tasks.record_event")
+def record_event(raw_event):
+    event = models.Event.record(raw_event)
     models.db.session.commit()

     for hook in settings.EVENT_REPORTING_WEBHOOKS:
         logger.debug("Forwarding event to: %s", hook)
         try:
-            response = requests.post(hook, original_event)
+            data = {
+                "schema": "iglu:io.redash.webhooks/event/jsonschema/1-0-0",
+                "data": event.to_dict()
+            }
+            response = requests.post(hook, json=data)
             if response.status_code != 200:
                 logger.error("Failed posting to %s: %s", hook, response.content)
         except Exception:
             logger.exception("Failed posting to %s", hook)

-@celery.task(name="redash.tasks.version_check", base=BaseTask)
+@celery.task(name="redash.tasks.version_check")
 def version_check():
     run_version_check()

@@ -42,7 +45,7 @@ def subscribe(form):
     requests.post('https://beacon.redash.io/subscribe', json=data)

-@celery.task(name="redash.tasks.send_mail", base=BaseTask)
+@celery.task(name="redash.tasks.send_mail")
 def send_mail(to, subject, html, text):
     from redash.wsgi import app


@@ -9,7 +9,6 @@ from redash import redis_connection, models, statsd_client, settings, utils
 from redash.utils import gen_query_hash
 from redash.worker import celery
 from redash.query_runner import InterruptException
-from .base import BaseTask
 from .alerts import check_alerts_for_query

 logger = get_task_logger(__name__)
@@ -155,23 +154,25 @@ class QueryTask(object):
         return self._async_result.id

     def to_dict(self):
-        if self._async_result.status == 'STARTED':
-            updated_at = self._async_result.result.get('start_time', 0)
+        task_info = self._async_result._get_task_meta()
+        result, task_status = task_info['result'], task_info['status']
+        if task_status == 'STARTED':
+            updated_at = result.get('start_time', 0)
         else:
             updated_at = 0

-        status = self.STATUSES[self._async_result.status]
+        status = self.STATUSES[task_status]

-        if isinstance(self._async_result.result, Exception):
-            error = self._async_result.result.message
+        if isinstance(result, Exception):
+            error = result.message
             status = 4
-        elif self._async_result.status == 'REVOKED':
+        elif task_status == 'REVOKED':
             error = 'Query execution cancelled.'
         else:
             error = ''

-        if self._async_result.successful() and not error:
-            query_result_id = self._async_result.result
+        if task_status == 'SUCCESS' and not error:
+            query_result_id = result
         else:
             query_result_id = None
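The `to_dict` rewrite above fetches the task metadata once via `_get_task_meta()` (a private Celery method, which the diff itself relies on) instead of reading `.status` and `.result` repeatedly; each of those property accesses is a separate result-backend round-trip, so status and result could disagree mid-read. A sketch of the idea, with a hypothetical `FakeAsyncResult` standing in for `celery.result.AsyncResult`:

```python
class FakeAsyncResult:
    """Hypothetical stand-in for AsyncResult; counts backend round-trips."""
    def __init__(self, meta):
        self.backend_calls = 0
        self._meta = meta

    def _get_task_meta(self):
        self.backend_calls += 1
        return dict(self._meta)

def to_dict(async_result):
    # Read status and result together, from a single backend fetch,
    # so the two fields come from one consistent snapshot.
    task_info = async_result._get_task_meta()
    result, task_status = task_info['result'], task_info['status']
    if task_status == 'STARTED':
        updated_at = result.get('start_time', 0)
    else:
        updated_at = 0
    return {'status': task_status, 'updated_at': updated_at}

r = FakeAsyncResult({'status': 'STARTED', 'result': {'start_time': 123}})
info = to_dict(r)  # exactly one backend fetch for the whole snapshot
```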
@@ -198,7 +199,7 @@ class QueryTask(object):
         return self._async_result.revoke(terminate=True, signal='SIGINT')

-def enqueue_query(query, data_source, user_id, scheduled=False, metadata={}):
+def enqueue_query(query, data_source, user_id, scheduled_query=None, metadata={}):
     query_hash = gen_query_hash(query)
     logging.info("Inserting job for %s with metadata=%s", query_hash, metadata)
     try_count = 0
@@ -224,14 +225,21 @@ def enqueue_query(query, data_source, user_id, scheduled=False, metadata={}):
             if not job:
                 pipe.multi()

-                if scheduled:
+                if scheduled_query:
                     queue_name = data_source.scheduled_queue_name
+                    scheduled_query_id = scheduled_query.id
                 else:
                     queue_name = data_source.queue_name
+                    scheduled_query_id = None

-                result = execute_query.apply_async(args=(query, data_source.id, metadata, user_id), queue=queue_name)
+                result = execute_query.apply_async(args=(
+                    query, data_source.id, metadata, user_id,
+                    scheduled_query_id),
+                    queue=queue_name)
                 job = QueryTask(async_result=result)
-                tracker = QueryTaskTracker.create(result.id, 'created', query_hash, data_source.id, scheduled, metadata)
+                tracker = QueryTaskTracker.create(
+                    result.id, 'created', query_hash, data_source.id,
+                    scheduled_query is not None, metadata)
                 tracker.save(connection=pipe)

                 logging.info("[%s] Created new job: %s", query_hash, job.id)
@@ -248,7 +256,7 @@ def enqueue_query(query, data_source, user_id, scheduled_query=None, metadata={}):
     return job
-@celery.task(name="redash.tasks.refresh_queries", base=BaseTask)
+@celery.task(name="redash.tasks.refresh_queries")
 def refresh_queries():
     logger.info("Refreshing queries...")
@@ -263,7 +271,7 @@ def refresh_queries():
                 logging.info("Skipping refresh of %s because datasource - %s is paused (%s).", query.id, query.data_source.name, query.data_source.pause_reason)
             else:
                 enqueue_query(query.query_text, query.data_source, query.user_id,
-                              scheduled=True,
+                              scheduled_query=query,
                               metadata={'Query ID': query.id, 'Username': 'Scheduled'})

                 query_ids.append(query.id)
@@ -285,7 +293,7 @@ def refresh_queries():
     statsd_client.gauge('manager.seconds_since_refresh', now - float(status.get('last_refresh_at', now)))

-@celery.task(name="redash.tasks.cleanup_tasks", base=BaseTask)
+@celery.task(name="redash.tasks.cleanup_tasks")
 def cleanup_tasks():
     in_progress = QueryTaskTracker.all(QueryTaskTracker.IN_PROGRESS_LIST)
     for tracker in in_progress:
@@ -317,7 +325,7 @@ def cleanup_tasks():
     QueryTaskTracker.prune(QueryTaskTracker.DONE_LIST, 1000)

-@celery.task(name="redash.tasks.cleanup_query_results", base=BaseTask)
+@celery.task(name="redash.tasks.cleanup_query_results")
 def cleanup_query_results():
     """
     Job to cleanup unused query results -- such that no query links to them anymore, and older than
@@ -331,15 +339,14 @@ def cleanup_query_results():
                 settings.QUERY_RESULTS_CLEANUP_COUNT, settings.QUERY_RESULTS_CLEANUP_MAX_AGE)

     unused_query_results = models.QueryResult.unused(settings.QUERY_RESULTS_CLEANUP_MAX_AGE).limit(settings.QUERY_RESULTS_CLEANUP_COUNT)
-    total_unused_query_results = models.QueryResult.unused().count()
-    deleted_count = models.Query.query.filter(
-        models.Query.id.in_(unused_query_results.subquery())
+    deleted_count = models.QueryResult.query.filter(
+        models.QueryResult.id.in_(unused_query_results.subquery())
     ).delete(synchronize_session=False)
     models.db.session.commit()

-    logger.info("Deleted %d unused query results out of total of %d." % (deleted_count, total_unused_query_results))
+    logger.info("Deleted %d unused query results.", deleted_count)

-@celery.task(name="redash.tasks.refresh_schemas", base=BaseTask)
+@celery.task(name="redash.tasks.refresh_schemas")
 def refresh_schemas():
     """
     Refreshes the data sources schemas.
@@ -380,7 +387,8 @@ class QueryExecutionError(Exception):
 # We could have created this as a celery.Task derived class, and act as the task itself. But this might result in weird
 # issues as the task class created once per process, so decided to have a plain object instead.
 class QueryExecutor(object):
-    def __init__(self, task, query, data_source_id, user_id, metadata):
+    def __init__(self, task, query, data_source_id, user_id, metadata,
+                 scheduled_query):
         self.task = task
         self.query = query
         self.data_source_id = data_source_id
@@ -391,6 +399,7 @@ class QueryExecutor(object):
         else:
             self.user = None
         self.query_hash = gen_query_hash(self.query)
+        self.scheduled_query = scheduled_query
         # Load existing tracker or create a new one if the job was created before code update:
         self.tracker = QueryTaskTracker.get_by_task_id(task.request.id) or QueryTaskTracker.create(task.request.id,
                                                                                                   'created',
@@ -425,7 +434,14 @@ class QueryExecutor(object):
         if error:
             self.tracker.update(state='failed')
             result = QueryExecutionError(error)
+            if self.scheduled_query:
+                self.scheduled_query.schedule_failures += 1
+                models.db.session.add(self.scheduled_query)
         else:
+            if (self.scheduled_query and
+                    self.scheduled_query.schedule_failures > 0):
+                self.scheduled_query.schedule_failures = 0
+                models.db.session.add(self.scheduled_query)
             query_result, updated_query_ids = models.QueryResult.store_result(
                 self.data_source.org, self.data_source,
                 self.query_hash, self.query, data,
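The hunk above gives scheduled queries a `schedule_failures` counter: bumped on every failed run, reset to zero on the next success. Stripped of the ORM session bookkeeping, the logic is just this (the `ScheduledQuery` class and `record_run` helper are illustrative stand-ins, not names from the diff):

```python
class ScheduledQuery:
    """Tiny stand-in for the schedule_failures bookkeeping on models.Query."""
    def __init__(self):
        self.schedule_failures = 0

def record_run(query, error):
    # Mirrors the diff: increment on failure, reset on a clean run.
    if error:
        query.schedule_failures += 1
    elif query.schedule_failures > 0:
        query.schedule_failures = 0

q = ScheduledQuery()
record_run(q, error="timeout")
record_run(q, error="timeout")   # two failures in a row -> counter is 2
record_run(q, error=None)        # first success resets it to 0
```

Keeping a consecutive-failure count (rather than a lifetime total) is what lets a scheduler back off on queries that are currently broken without penalizing ones that failed long ago.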
@@ -452,10 +468,14 @@ class QueryExecutor(object):
         return annotated_query

     def _log_progress(self, state):
-        logger.info(u"task=execute_query state=%s query_hash=%s type=%s ds_id=%d task_id=%s queue=%s query_id=%s username=%s",
-                    state,
-                    self.query_hash, self.data_source.type, self.data_source.id, self.task.request.id, self.task.request.delivery_info['routing_key'],
-                    self.metadata.get('Query ID', 'unknown'), self.metadata.get('Username', 'unknown'))
+        logger.info(
+            u"task=execute_query state=%s query_hash=%s type=%s ds_id=%d "
+            "task_id=%s queue=%s query_id=%s username=%s",
+            state, self.query_hash, self.data_source.type, self.data_source.id,
+            self.task.request.id,
+            self.task.request.delivery_info['routing_key'],
+            self.metadata.get('Query ID', 'unknown'),
+            self.metadata.get('Username', 'unknown'))
         self.tracker.update(state=state)

     def _load_data_source(self):
@@ -465,6 +485,12 @@ class QueryExecutor(object):
 # user_id is added last as a keyword argument for backward compatability -- to support executing previously submitted
 # jobs before the upgrade to this version.
-@celery.task(name="redash.tasks.execute_query", bind=True, base=BaseTask, track_started=True)
-def execute_query(self, query, data_source_id, metadata, user_id=None):
-    return QueryExecutor(self, query, data_source_id, user_id, metadata).run()
+@celery.task(name="redash.tasks.execute_query", bind=True, track_started=True)
+def execute_query(self, query, data_source_id, metadata, user_id=None,
+                  scheduled_query_id=None):
+    if scheduled_query_id is not None:
+        scheduled_query = models.Query.query.get(scheduled_query_id)
+    else:
+        scheduled_query = None
+    return QueryExecutor(self, query, data_source_id, user_id, metadata,
+                         scheduled_query).run()
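Note the pattern in `enqueue_query` and `execute_query` above: only the primitive `scheduled_query_id` crosses the task queue, and the worker re-loads the ORM row on its side, rather than serializing a live `models.Query` instance into the broker message. A generic sketch of that id-passing pattern (the `QUERIES` dict and `get_query` helper are hypothetical stand-ins for `models.Query.query.get`):

```python
# Stand-in "table" for models.Query; in Redash this is a PostgreSQL row.
QUERIES = {42: {"id": 42, "schedule_failures": 0}}

def get_query(query_id):
    return QUERIES.get(query_id)

def execute_query(query_text, data_source_id, metadata, user_id=None,
                  scheduled_query_id=None):
    # Resolve the id back to an object inside the worker, mirroring the
    # execute_query task in the diff; passing only the id keeps the queued
    # message small and avoids shipping stale ORM state between processes.
    if scheduled_query_id is not None:
        scheduled_query = get_query(scheduled_query_id)
    else:
        scheduled_query = None
    return scheduled_query
```

The worker always sees the current database state at execution time, which matters here because `schedule_failures` may have changed between enqueue and execution.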


@@ -19,7 +19,7 @@
 {% endwith %}
 {% if google_auth_url %}
 <div class="row">
-    <a href="{{ google_auth_url }}"><img src="/google_login.png" class="login-button"/></a>
+    <a href="{{ google_auth_url }}"><img src="/images/google_login.png" class="login-button"/></a>
 </div>
 <div class="login-or">
     <hr class="hr-or">


@@ -2,10 +2,12 @@ from __future__ import absolute_import
 from random import randint

 from celery import Celery
+from flask import current_app
 from datetime import timedelta
 from celery.schedules import crontab
-from redash import settings, __version__
-from redash.metrics import celery
+from celery.signals import worker_process_init
+from redash import settings, __version__, create_app
+from redash.metrics import celery as celery_metrics

 celery = Celery('redash',
@@ -48,9 +50,29 @@ celery.conf.update(CELERY_RESULT_BACKEND=settings.CELERY_BACKEND,
 if settings.SENTRY_DSN:
     from raven import Client
-    from raven.contrib.celery import register_signal, register_logger_signal
+    from raven.contrib.celery import register_signal

     client = Client(settings.SENTRY_DSN, release=__version__)
     register_signal(client)

+# Create a new Task base class, that pushes a new Flask app context to allow DB connections if needed.
+TaskBase = celery.Task
+
+class ContextTask(TaskBase):
+    abstract = True
+
+    def __call__(self, *args, **kwargs):
+        with current_app.app_context():
+            return TaskBase.__call__(self, *args, **kwargs)
+
+celery.Task = ContextTask
+
+# Create Flask app after forking a new worker, to make sure no resources are shared between processes.
+@worker_process_init.connect
+def init_celery_flask_app(**kwargs):
+    app = create_app()
+    app.app_context().push()
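The `ContextTask` hunk above replaces the deleted `BaseTask` class: instead of each task pushing (and popping) its own app context, Celery's `Task` base class is swapped for one whose `__call__` wraps every task body in a context. The base-class swap can be sketched without Flask or Celery installed, using hypothetical stand-ins for `celery.Task` and `flask_app.app_context()`:

```python
class Task:
    """Stand-in for celery.Task: calling the task runs its body."""
    def __call__(self, *args, **kwargs):
        return self.run(*args, **kwargs)

class AppContext:
    """Stand-in for a Flask app context, tracked by a nesting counter."""
    active = 0
    def __enter__(self):
        AppContext.active += 1
        return self
    def __exit__(self, *exc):
        AppContext.active -= 1

class ContextTask(Task):
    # Every invocation runs inside a (fake) app context, mirroring the
    # `with current_app.app_context():` wrapper in the diff.
    def __call__(self, *args, **kwargs):
        with AppContext():
            return super(ContextTask, self).__call__(*args, **kwargs)

class MyTask(ContextTask):
    def run(self):
        return AppContext.active  # 1 while the task body executes

result = MyTask()()  # context is active inside, released afterwards
```

Swapping `celery.Task` before any task is defined means every `@celery.task` function inherits this behavior, so individual tasks no longer need `base=BaseTask`.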


@@ -18,5 +18,7 @@ thrift>=0.8.0
 thrift_sasl>=0.1.0
 cassandra-driver==3.1.1
 snowflake_connector_python==1.3.7
+atsd_client==2.0.12
+simple_salesforce==0.72.2
 # certifi is needed to support MongoDB and SSL:
 certifi

setup.cfg (new file)

@@ -0,0 +1,2 @@
+[pep8]
+max-line-length = 120


@@ -1 +1,4 @@
+# DEPRECATED
+(left for reference purposes only)
+
 Bootstrap script for Amazon Linux AMI. *Not supported*, we recommend to use the Docker images instead.


@@ -1 +0,0 @@
-Files used for the Docker image creation.


@@ -1,27 +0,0 @@
#!/bin/bash
# This script assumes you're using docker-compose, with at least two images: redash for the redash instance
# and postgres for the postgres instance.
#
# This script is not idempotent and should be run once.
run_redash="docker-compose run --rm redash"
$run_redash /opt/redash/current/manage.py database create_tables
# Create default admin user
$run_redash /opt/redash/current/manage.py users create --admin --password admin "Admin" "admin"
# This is a hack to get the Postgres IP and PORT from the instance itself.
temp_env_file=$(mktemp /tmp/pg_env.XXXXXX || exit 3)
docker-compose run --rm postgres env > "$temp_env_file"
source "$temp_env_file"
run_psql="docker-compose run --rm postgres psql -h $POSTGRES_PORT_5432_TCP_ADDR -p $POSTGRES_PORT_5432_TCP_PORT -U postgres"
# Create redash_reader user. We don't use a strong password, as the instance supposed to be accesible only from the redash host.
$run_psql -c "CREATE ROLE redash_reader WITH PASSWORD 'redash_reader' NOCREATEROLE NOCREATEDB NOSUPERUSER LOGIN"
$run_psql -c "grant select(id,name,type) ON data_sources to redash_reader;"
$run_psql -c "grant select(id,name) ON users to redash_reader;"
$run_psql -c "grant select on events, queries, dashboards, widgets, visualizations, query_results to redash_reader;"
$run_redash /opt/redash/current/manage.py ds new "Redash Metadata" --type "pg" --options "{\"user\": \"redash_reader\", \"password\": \"redash_reader\", \"host\": \"postgres\", \"dbname\": \"postgres\"}"


@@ -1,8 +0,0 @@
FROM nginx
MAINTAINER Di Wu <diwu@yelp.com>
COPY nginx.conf /etc/nginx/nginx.conf
RUN mkdir -p /var/log/nginx/log && \
touch /var/log/nginx/log/access.log && \
touch /var/log/nginx/log/error.log


@@ -1,30 +0,0 @@
events {
    worker_connections 4096;  # Default: 1024
}

http {
    server_tokens off;

    upstream redashapp {
        server redash:5000;
    }

    server {
        listen 80;

        access_log /var/log/nginx/log/access.log;
        error_log /var/log/nginx/log/error.log;

        gzip on;
        gzip_types *;
        gzip_proxied any;

        location / {
            proxy_pass http://redashapp;
            proxy_set_header Host $http_host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
        }
    }
}


@@ -7,18 +7,32 @@
     },
     "builders": [
         {
-            "name": "redash-eu-west-1",
+            "name": "redash-us-east-1",
             "type": "amazon-ebs",
             "access_key": "{{user `aws_access_key`}}",
             "secret_key": "{{user `aws_secret_key`}}",
-            "region": "eu-west-1",
-            "source_ami": "ami-6177f712",
+            "region": "us-east-1",
+            "source_ami": "ami-4dd2575b",
             "instance_type": "t2.micro",
             "ssh_username": "ubuntu",
-            "ami_name": "redash-{{user `image_version`}}-eu-west-1"
+            "ami_name": "redash-{{user `image_version`}}-us-east-1"
+        },
+        {
+            "type": "googlecompute",
+            "account_file": "account.json",
+            "project_id": "redash-bird-123",
+            "source_image_family": "ubuntu-1604-lts",
+            "zone": "us-central1-a",
+            "ssh_username": "arik"
         }
     ],
     "provisioners": [
+        {
+            "type": "shell",
+            "inline": [
+                "sleep 30"
+            ]
+        },
         {
             "type": "shell",
             "script": "ubuntu/bootstrap.sh",
@@ -33,5 +47,15 @@
             "type": "shell",
             "inline": "sudo rm /home/ubuntu/.ssh/authorized_keys || true"
         }
+    ],
+    "post-processors": [
+        {
+            "type": "googlecompute-export",
+            "only": ["googlecompute"],
+            "paths": [
+                "gs://redash-images/redash.{{user `redash_version`}}.tar.gz"
+            ],
+            "keep_input_artifact": true
+        }
     ]
 }


@@ -1 +1 @@
-Bootstrap scripts for Ubuntu (tested on Ubuntu 14.04, although should work with 12.04).
+Bootstrap scripts for Ubuntu 16.04.


@@ -1,195 +1,110 @@
 #!/bin/bash
-set -eu
-
-REDASH_BASE_PATH=/opt/redash
-
-# Default branch/version to master if not specified in REDASH_BRANCH env var
-REDASH_BRANCH="${REDASH_BRANCH:-master}"
-
-# Install latest version if not specified in REDASH_VERSION env var
-REDASH_VERSION=${REDASH_VERSION-0.12.0.b2449}
-LATEST_URL="https://github.com/getredash/redash/releases/download/v${REDASH_VERSION}/redash.${REDASH_VERSION}.tar.gz"
-VERSION_DIR="/opt/redash/redash.${REDASH_VERSION}"
-REDASH_TARBALL=/tmp/redash.tar.gz
-
-FILES_BASE_URL=https://raw.githubusercontent.com/getredash/redash/${REDASH_BRANCH}/setup/ubuntu/files/
-
-# Verify running as root:
-if [ "$(id -u)" != "0" ]; then
-    if [ $# -ne 0 ]; then
-        echo "Failed running with sudo. Exiting." 1>&2
-        exit 1
-    fi
-    echo "This script must be run as root. Trying to run with sudo."
-    sudo bash "$0" --with-sudo
-    exit 0
-fi
-
-# Base packages
-apt-get -y update
-DEBIAN_FRONTEND=noninteractive apt-get -y -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" dist-upgrade
-apt-get install -y python-pip python-dev nginx curl build-essential pwgen
-# BigQuery dependencies:
-apt-get install -y libffi-dev libssl-dev
-# MySQL dependencies:
-apt-get install -y libmysqlclient-dev
-# Microsoft SQL Server dependencies:
-apt-get install -y freetds-dev
-# Hive dependencies:
-apt-get install -y libsasl2-dev
-#Saml dependency
-apt-get install -y xmlsec1
-
-# Upgrade pip if host is Ubuntu 16.04
-if [[ $(lsb_release -d) = *Ubuntu* ]] && [[ $(lsb_release -rs) = *16.04* ]]; then
-    pip install --upgrade pip
-fi
-pip install -U setuptools==23.1.0
-
-# redash user
-# TODO: check user doesn't exist yet?
-adduser --system --no-create-home --disabled-login --gecos "" redash
-
-# PostgreSQL
-pg_available=0
-psql --version || pg_available=$?
-if [ $pg_available -ne 0 ]; then
-    wget $FILES_BASE_URL"postgres_apt.sh" -O /tmp/postgres_apt.sh
-    bash /tmp/postgres_apt.sh
-    apt-get update
-    apt-get -y install postgresql-9.3 postgresql-server-dev-9.3
-fi
-
-add_service() {
-    service_name=$1
-    service_command="/etc/init.d/$service_name"
-
-    echo "Adding service: $service_name (/etc/init.d/$service_name)."
-    chmod +x "$service_command"
-
-    if command -v chkconfig >/dev/null 2>&1; then
-        # we're chkconfig, so lets add to chkconfig and put in runlevel 345
-        chkconfig --add "$service_name" && echo "Successfully added to chkconfig!"
-        chkconfig --level 345 "$service_name" on && echo "Successfully added to runlevels 345!"
-    elif command -v update-rc.d >/dev/null 2>&1; then
-        #if we're not a chkconfig box assume we're able to use update-rc.d
-        update-rc.d "$service_name" defaults && echo "Success!"
-    else
-        echo "No supported init tool found."
-    fi
-
-    $service_command start
-}
-
-# Redis
-redis_available=0
-redis-cli --version || redis_available=$?
-if [ $redis_available -ne 0 ]; then
-    wget http://download.redis.io/releases/redis-2.8.17.tar.gz
-    tar xzf redis-2.8.17.tar.gz
-    rm redis-2.8.17.tar.gz
-    (cd redis-2.8.17
-    make
-    make install
-
-    # Setup process init & configuration
-    REDIS_PORT=6379
-    REDIS_CONFIG_FILE="/etc/redis/$REDIS_PORT.conf"
-    REDIS_LOG_FILE="/var/log/redis_$REDIS_PORT.log"
-    REDIS_DATA_DIR="/var/lib/redis/$REDIS_PORT"
-
-    mkdir -p "$(dirname "$REDIS_CONFIG_FILE")" || die "Could not create redis config directory"
-    mkdir -p "$(dirname "$REDIS_LOG_FILE")" || die "Could not create redis log dir"
-    mkdir -p "$REDIS_DATA_DIR" || die "Could not create redis data directory"
-
-    wget -O /etc/init.d/redis_6379 $FILES_BASE_URL"redis_init"
-    wget -O $REDIS_CONFIG_FILE $FILES_BASE_URL"redis.conf"
-
-    add_service "redis_$REDIS_PORT"
-    )
-    rm -rf redis-2.8.17
-fi
-
-# Directories
-if [ ! -d "$REDASH_BASE_PATH" ]; then
-    sudo mkdir /opt/redash
-    sudo chown redash /opt/redash
-    sudo -u redash mkdir /opt/redash/logs
-fi
-
-# Default config file
-if [ ! -f "/opt/redash/.env" ]; then
-    sudo -u redash wget $FILES_BASE_URL"env" -O /opt/redash/.env
-    echo 'export REDASH_STATIC_ASSETS_PATH="../rd_ui/dist/"' >> /opt/redash/.env
-fi
-
-if [ ! -d "$VERSION_DIR" ]; then
-    sudo -u redash wget "$LATEST_URL" -O "$REDASH_TARBALL"
-    sudo -u redash mkdir "$VERSION_DIR"
-    sudo -u redash tar -C "$VERSION_DIR" -xvf "$REDASH_TARBALL"
-    ln -nfs "$VERSION_DIR" /opt/redash/current
-    ln -nfs /opt/redash/.env /opt/redash/current/.env
-
-    cd /opt/redash/current
-
-    # TODO: venv?
-    pip install -r requirements.txt
-fi
-
-# Create database / tables
-pg_user_exists=0
-sudo -u postgres psql postgres -tAc "SELECT 1 FROM pg_roles WHERE rolname='redash'" | grep -q 1 || pg_user_exists=$?
-if [ $pg_user_exists -ne 0 ]; then
-    echo "Creating redash postgres user & database."
-    sudo -u postgres createuser redash --no-superuser --no-createdb --no-createrole
-    sudo -u postgres createdb redash --owner=redash
-
-    cd /opt/redash/current
-    sudo -u redash bin/run ./manage.py database create_tables
-fi
-
-# Create default admin user
-cd /opt/redash/current
-# TODO: make sure user created only once
-# TODO: generate temp password and print to screen
-sudo -u redash bin/run ./manage.py users create --admin --password admin "Admin" "admin"
-
-# Create Redash read only pg user & setup data source
-pg_user_exists=0
-sudo -u postgres psql postgres -tAc "SELECT 1 FROM pg_roles WHERE rolname='redash_reader'" | grep -q 1 || pg_user_exists=$?
-if [ $pg_user_exists -ne 0 ]; then
-    echo "Creating redash reader postgres user."
-    REDASH_READER_PASSWORD=$(pwgen -1)
-    sudo -u postgres psql -c "CREATE ROLE redash_reader WITH PASSWORD '$REDASH_READER_PASSWORD' NOCREATEROLE NOCREATEDB NOSUPERUSER LOGIN"
-    sudo -u redash psql -c "grant select(id,name,type) ON data_sources to redash_reader;" redash
-    sudo -u redash psql -c "grant select(id,name) ON users to redash_reader;" redash
-    sudo -u redash psql -c "grant select on alerts, alert_subscriptions, groups, events, queries, dashboards, widgets, visualizations, query_results to redash_reader;" redash
-
-    cd /opt/redash/current
-    sudo -u redash bin/run ./manage.py ds new "Redash Metadata" --type "pg" --options "{\"user\": \"redash_reader\", \"password\": \"$REDASH_READER_PASSWORD\", \"host\": \"localhost\", \"dbname\": \"redash\"}"
-fi
-
-# Pip requirements for all data source types
-cd /opt/redash/current
-pip install -r requirements_all_ds.txt
-
-# Setup supervisord + sysv init startup script
-sudo -u redash mkdir -p /opt/redash/supervisord
-pip install supervisor==3.1.2 # TODO: move to requirements.txt
-
-# Get supervisord startup script
-sudo -u redash wget -O /opt/redash/supervisord/supervisord.conf $FILES_BASE_URL"supervisord.conf"
-
-wget -O /etc/init.d/redash_supervisord $FILES_BASE_URL"redash_supervisord_init"
-add_service "redash_supervisord"
-
-# Nginx setup
-rm /etc/nginx/sites-enabled/default
-wget -O /etc/nginx/sites-available/redash $FILES_BASE_URL"nginx_redash_site"
-ln -nfs /etc/nginx/sites-available/redash /etc/nginx/sites-enabled/redash
-service nginx restart
-
-# Hotfix: missing query snippets table:
-cd /opt/redash/current
-sudo -u redash bin/run python -c "from redash import models; models.QuerySnippet.create_table()"
+#
+# This script setups Redash along with supervisor, nginx, PostgreSQL and Redis. It was written to be used on
+# Ubuntu 16.04. Technically it can work with other Ubuntu versions, but you might get non compatible versions
+# of PostgreSQL, Redis and maybe some other dependencies.
+#
+# This script is not idempotent and if it stops in the middle, you can't just run it again. You should either
+# understand what parts of it to exclude or just start over on a new VM (assuming you're using a VM).
+set -eu
+
+REDASH_BASE_PATH=/opt/redash
+REDASH_BRANCH="${REDASH_BRANCH:-master}" # Default branch/version to master if not specified in REDASH_BRANCH env var
+REDASH_VERSION=${REDASH_VERSION-1.0.1.b2833} # Install latest version if not specified in REDASH_VERSION env var
+LATEST_URL="https://s3.amazonaws.com/redash-releases/redash.${REDASH_VERSION}.tar.gz"
+VERSION_DIR="/opt/redash/redash.${REDASH_VERSION}"
+REDASH_TARBALL=/tmp/redash.tar.gz
+FILES_BASE_URL=https://raw.githubusercontent.com/getredash/redash/${REDASH_BRANCH}/setup/ubuntu/files
+
+cd /tmp/
+
+verify_root() {
+    # Verify running as root:
+    if [ "$(id -u)" != "0" ]; then
+        if [ $# -ne 0 ]; then
+            echo "Failed running with sudo. Exiting." 1>&2
+            exit 1
+        fi
+        echo "This script must be run as root. Trying to run with sudo."
+        sudo bash "$0" --with-sudo
+        exit 0
+    fi
+}
+
+create_redash_user() {
+    adduser --system --no-create-home --disabled-login --gecos "" redash
+}
+
+install_system_packages() {
+    apt-get -y update
+    # Base packages
+    apt install -y python-pip python-dev nginx curl build-essential pwgen
+    # Data sources dependencies:
+    apt install -y libffi-dev libssl-dev libmysqlclient-dev libpq-dev freetds-dev libsasl2-dev
+    # SAML dependency
+    apt install -y xmlsec1
+    # Storage servers
+    apt install -y postgresql redis-server
+    apt install -y supervisor
+}
+
+create_directories() {
+    mkdir /opt/redash
+    chown redash /opt/redash
+
+    # Default config file
+    if [ ! -f "/opt/redash/.env" ]; then
+        sudo -u redash wget "$FILES_BASE_URL/env" -O /opt/redash/.env
+    fi
+
+    COOKIE_SECRET=$(pwgen -1s 32)
+    echo "export REDASH_COOKIE_SECRET=$COOKIE_SECRET" >> /opt/redash/.env
+}
+
+extract_redash_sources() {
+    sudo -u redash wget "$LATEST_URL" -O "$REDASH_TARBALL"
+    sudo -u redash mkdir "$VERSION_DIR"
+    sudo -u redash tar -C "$VERSION_DIR" -xvf "$REDASH_TARBALL"
+    ln -nfs "$VERSION_DIR" /opt/redash/current
+    ln -nfs /opt/redash/.env /opt/redash/current/.env
+}
+
+install_python_packages() {
+    pip install --upgrade pip
+    # TODO: venv?
+    pip install setproctitle # setproctitle is used by Celery for "pretty" process titles
+    pip install -r /opt/redash/current/requirements.txt
+    pip install -r /opt/redash/current/requirements_all_ds.txt
+}
+
+create_database() {
+    # Create user and database
+    sudo -u postgres createuser redash --no-superuser --no-createdb --no-createrole
+    sudo -u postgres createdb redash --owner=redash
+
+    cd /opt/redash/current
+    sudo -u redash bin/run ./manage.py database create_tables
+}
+
+setup_supervisor() {
+    wget -O /etc/supervisor/conf.d/redash.conf "$FILES_BASE_URL/supervisord.conf"
+    service supervisor restart
+}
+
+setup_nginx() {
+    rm /etc/nginx/sites-enabled/default
+    wget -O /etc/nginx/sites-available/redash "$FILES_BASE_URL/nginx_redash_site"
+    ln -nfs /etc/nginx/sites-available/redash /etc/nginx/sites-enabled/redash
+    service nginx restart
+}
+
+verify_root
+install_system_packages
+create_redash_user
+create_directories
+extract_redash_sources
+install_python_packages
+create_database
+setup_supervisor
+setup_nginx


@@ -1,4 +1,3 @@
 export REDASH_LOG_LEVEL="INFO"
 export REDASH_REDIS_URL=redis://localhost:6379/0
-export REDASH_DATABASE_URL="postgresql://redash"
+export REDASH_DATABASE_URL="postgresql:///redash"
-export REDASH_COOKIE_SECRET=veryverysecret
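The env-file fix above is a one-character change with real meaning: in `postgresql://redash` the word "redash" sits in the authority (host) slot of the URL, while `postgresql:///redash` leaves the host empty (local Unix socket) and puts "redash" in the path, i.e. the database name. The difference can be seen with the standard library's URL parser:

```python
from urllib.parse import urlsplit

# Two slashes: "redash" parses as the host, with no database in the path.
broken = urlsplit("postgresql://redash")

# Three slashes: empty host (local socket), database name in the path.
fixed = urlsplit("postgresql:///redash")

print(broken.hostname, broken.path)  # "redash" parsed as a host
print(fixed.hostname, fixed.path)    # no host, "/redash" as the path
```

SQLAlchemy and libpq interpret these URLs the same way, which is why the two-slash form tried to reach a server named "redash" instead of the local `redash` database.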


@@ -1,162 +0,0 @@
#!/bin/sh
# script to add apt.postgresql.org to sources.list
# from command line
CODENAME="$1"
# lsb_release is the best interface, but not always available
if [ -z "$CODENAME" ]; then
CODENAME=$(lsb_release -cs 2>/dev/null)
fi
# parse os-release (unreliable, does not work on Ubuntu)
if [ -z "$CODENAME" -a -f /etc/os-release ]; then
. /etc/os-release
# Debian: VERSION="7.0 (wheezy)"
# Ubuntu: VERSION="13.04, Raring Ringtail"
CODENAME=$(echo $VERSION | sed -ne 's/.*(\(.*\)).*/\1/')
fi
# guess from sources.list
if [ -z "$CODENAME" ]; then
CODENAME=$(grep '^deb ' /etc/apt/sources.list | head -n1 | awk '{ print $3 }')
fi
# complain if no result yet
if [ -z "$CODENAME" ]; then
cat <<EOF
Could not determine the distribution codename. Please report this as a bug to
pgsql-pkg-debian@postgresql.org. As a workaround, you can call this script with
the proper codename as parameter, e.g. "$0 squeeze".
EOF
exit 1
fi
# errors are non-fatal above
set -e
cat <<EOF
This script will enable the PostgreSQL APT repository on apt.postgresql.org on
your system. The distribution codename used will be $CODENAME-pgdg.
EOF
case $CODENAME in
# known distributions
sid|wheezy|squeeze|lenny|etch) ;;
precise|lucid) ;;
*) # unknown distribution, verify on the web
DISTURL="http://apt.postgresql.org/pub/repos/apt/dists/"
if [ -x /usr/bin/curl ]; then
DISTHTML=$(curl -s $DISTURL)
elif [ -x /usr/bin/wget ]; then
DISTHTML=$(wget --quiet -O - $DISTURL)
fi
if [ "$DISTHTML" ]; then
if ! echo "$DISTHTML" | grep -q "$CODENAME-pgdg"; then
cat <<EOF
Your system is using the distribution codename $CODENAME, but $CODENAME-pgdg
does not seem to be a valid distribution on
$DISTURL
We abort the installation here. If you want to use a distribution different
from your system, you can call this script with an explicit codename, e.g.
"$0 precise".
Specifically, if you are using a non-LTS Ubuntu release, refer to
https://wiki.postgresql.org/wiki/Apt/FAQ#I_am_using_a_non-LTS_release_of_Ubuntu
For more information, refer to https://wiki.postgresql.org/wiki/Apt
or ask on the mailing list for assistance: pgsql-pkg-debian@postgresql.org
EOF
exit 1
fi
fi
;;
esac
echo "Writing /etc/apt/sources.list.d/pgdg.list ..."
cat > /etc/apt/sources.list.d/pgdg.list <<EOF
deb http://apt.postgresql.org/pub/repos/apt/ $CODENAME-pgdg main
#deb-src http://apt.postgresql.org/pub/repos/apt/ $CODENAME-pgdg main
EOF
echo "Importing repository signing key ..."
KEYRING="/etc/apt/trusted.gpg.d/apt.postgresql.org.gpg"
test -e $KEYRING || touch $KEYRING
apt-key --keyring $KEYRING add - <<EOF
-----BEGIN PGP PUBLIC KEY BLOCK-----
Version: GnuPG v1
mQINBE6XR8IBEACVdDKT2HEH1IyHzXkb4nIWAY7echjRxo7MTcj4vbXAyBKOfjja
UrBEJWHN6fjKJXOYWXHLIYg0hOGeW9qcSiaa1/rYIbOzjfGfhE4x0Y+NJHS1db0V
G6GUj3qXaeyqIJGS2z7m0Thy4Lgr/LpZlZ78Nf1fliSzBlMo1sV7PpP/7zUO+aA4
bKa8Rio3weMXQOZgclzgeSdqtwKnyKTQdXY5MkH1QXyFIk1nTfWwyqpJjHlgtwMi
c2cxjqG5nnV9rIYlTTjYG6RBglq0SmzF/raBnF4Lwjxq4qRqvRllBXdFu5+2pMfC
IZ10HPRdqDCTN60DUix+BTzBUT30NzaLhZbOMT5RvQtvTVgWpeIn20i2NrPWNCUh
hj490dKDLpK/v+A5/i8zPvN4c6MkDHi1FZfaoz3863dylUBR3Ip26oM0hHXf4/2U
A/oA4pCl2W0hc4aNtozjKHkVjRx5Q8/hVYu+39csFWxo6YSB/KgIEw+0W8DiTII3
RQj/OlD68ZDmGLyQPiJvaEtY9fDrcSpI0Esm0i4sjkNbuuh0Cvwwwqo5EF1zfkVj
Tqz2REYQGMJGc5LUbIpk5sMHo1HWV038TWxlDRwtOdzw08zQA6BeWe9FOokRPeR2
AqhyaJJwOZJodKZ76S+LDwFkTLzEKnYPCzkoRwLrEdNt1M7wQBThnC5z6wARAQAB
tBxQb3N0Z3JlU1FMIERlYmlhbiBSZXBvc2l0b3J5iQI9BBMBCAAnAhsDBQsJCAcD
BRUKCQgLBRYCAwEAAh4BAheABQJS6RUZBQkOhCctAAoJEH/MfUaszEz4zmQP/2ad
HtuaXL5Xu3C3NGLha/aQb9iSJC8z5vN55HMCpsWlmslCBuEr+qR+oZvPkvwh0Io/
8hQl/qN54DMNifRwVL2n2eG52yNERie9BrAMK2kNFZZCH4OxlMN0876BmDuNq2U6
7vUtCv+pxT+g9R1LvlPgLCTjS3m+qMqUICJ310BMT2cpYlJx3YqXouFkdWBVurI0
pGU/+QtydcJALz5eZbzlbYSPWbOm2ZSS2cLrCsVNFDOAbYLtUn955yXB5s4rIscE
vTzBxPgID1iBknnPzdu2tCpk07yJleiupxI1yXstCtvhGCbiAbGFDaKzhgcAxSIX
0ZPahpaYLdCkcoLlfgD+ar4K8veSK2LazrhO99O0onRG0p7zuXszXphO4E/WdbTO
yDD35qCqYeAX6TaB+2l4kIdVqPgoXT/doWVLUK2NjZtd3JpMWI0OGYDFn2DAvgwP
xqKEoGTOYuoWKssnwLlA/ZMETegak27gFAKfoQlmHjeA/PLC2KRYd6Wg2DSifhn+
2MouoE4XFfeekVBQx98rOQ5NLwy/TYlsHXm1n0RW86ETN3chj/PPWjsi80t5oepx
82azRoVu95LJUkHpPLYyqwfueoVzp2+B2hJU2Rg7w+cJq64TfeJG8hrc93MnSKIb
zTvXfdPtvYdHhhA2LYu4+5mh5ASlAMJXD7zIOZt2iEYEEBEIAAYFAk6XSO4ACgkQ
xa93SlhRC1qmjwCg9U7U+XN7Gc/dhY/eymJqmzUGT/gAn0guvoX75Y+BsZlI6dWn
qaFU6N8HiQIcBBABCAAGBQJOl0kLAAoJEExaa6sS0qeuBfEP/3AnLrcKx+dFKERX
o4NBCGWr+i1CnowupKS3rm2xLbmiB969szG5TxnOIvnjECqPz6skK3HkV3jTZaju
v3sR6M2ItpnrncWuiLnYcCSDp9TEMpCWzTEgtrBlKdVuTNTeRGILeIcvqoZX5w+u
i0eBvvbeRbHEyUsvOEnYjrqoAjqUJj5FUZtR1+V9fnZp8zDgpOSxx0LomnFdKnhj
uyXAQlRCA6/roVNR9ruRjxTR5ubteZ9ubTsVYr2/eMYOjQ46LhAgR+3Alblu/WHB
MR/9F9//RuOa43R5Sjx9TiFCYol+Ozk8XRt3QGweEH51YkSYY3oRbHBb2Fkql6N6
YFqlLBL7/aiWnNmRDEs/cdpo9HpFsbjOv4RlsSXQfvvfOayHpT5nO1UQFzoyMVpJ
615zwmQDJT5Qy7uvr2eQYRV9AXt8t/H+xjQsRZCc5YVmeAo91qIzI/tA2gtXik49
6yeziZbfUvcZzuzjjxFExss4DSAwMgorvBeIbiz2k2qXukbqcTjB2XqAlZasd6Ll
nLXpQdqDV3McYkP/MvttWh3w+J/woiBcA7yEI5e3YJk97uS6+ssbqLEd0CcdT+qz
+Waw0z/ZIU99Lfh2Qm77OT6vr//Zulw5ovjZVO2boRIcve7S97gQ4KC+G/+QaRS+
VPZ67j5UMxqtT/Y4+NHcQGgwF/1iiQI9BBMBCAAnAhsDBQsJCAcDBRUKCQgLBRYC
AwEAAh4BAheABQJQeSssBQkDwxbfAAoJEH/MfUaszEz4bgkP/0AI0UgDgkNNqplA
IpE/pkwem2jgGpJGKurh2xDu6j2ZL+BPzPhzyCeMHZwTXkkI373TXGQQP8dIa+RD
HAZ3iijw4+ISdKWpziEUJjUk04UMPTlN+dYJt2EHLQDD0VLtX0yQC/wLmVEH/REp
oclbVjZR/+ehwX2IxOIlXmkZJDSycl975FnSUjMAvyzty8P9DN0fIrQ7Ju+BfMOM
TnUkOdp0kRUYez7pxbURJfkM0NxAP1geACI91aISBpFg3zxQs1d3MmUIhJ4wHvYB
uaR7Fx1FkLAxWddre/OCYJBsjucE9uqc04rgKVjN5P/VfqNxyUoB+YZ+8Lk4t03p
RBcD9XzcyOYlFLWXbcWxTn1jJ2QMqRIWi5lzZIOMw5B+OK9LLPX0dAwIFGr9WtuV
J2zp+D4CBEMtn4Byh8EaQsttHeqAkpZoMlrEeNBDz2L7RquPQNmiuom15nb7xU/k
7PGfqtkpBaaGBV9tJkdp7BdH27dZXx+uT+uHbpMXkRrXliHjWpAw+NGwADh/Pjmq
ExlQSdgAiXy1TTOdzxKH7WrwMFGDK0fddKr8GH3f+Oq4eOoNRa6/UhTCmBPbryCS
IA7EAd0Aae9YaLlOB+eTORg/F1EWLPm34kKSRtae3gfHuY2cdUmoDVnOF8C9hc0P
bL65G4NWPt+fW7lIj+0+kF19s2PviQI9BBMBCAAnAhsDBQsJCAcDBRUKCQgLBRYC
AwEAAh4BAheABQJRKm2VBQkINsBBAAoJEH/MfUaszEz4RTEP/1sQHyjHaUiAPaCA
v8jw/3SaWP/g8qLjpY6ROjLnDMvwKwRAoxUwcIv4/TWDOMpwJN+CJIbjXsXNYvf9
OX+UTOvq4iwi4ADrAAw2xw+Jomc6EsYla+hkN2FzGzhpXfZFfUsuphjY3FKL+4hX
H+R8ucNwIz3yrkfc17MMn8yFNWFzm4omU9/JeeaafwUoLxlULL2zY7H3+QmxCl0u
6t8VvlszdEFhemLHzVYRY0Ro/ISrR78CnANNsMIy3i11U5uvdeWVCoWV1BXNLzOD
4+BIDbMB/Do8PQCWiliSGZi8lvmj/sKbumMFQonMQWOfQswTtqTyQ3yhUM1LaxK5
PYq13rggi3rA8oq8SYb/KNCQL5pzACji4TRVK0kNpvtxJxe84X8+9IB1vhBvF/Ji
/xDd/3VDNPY+k1a47cON0S8Qc8DA3mq4hRfcgvuWy7ZxoMY7AfSJOhleb9+PzRBB
n9agYgMxZg1RUWZazQ5KuoJqbxpwOYVFja/stItNS4xsmi0lh2I4MNlBEDqnFLUx
SvTDc22c3uJlWhzBM/f2jH19uUeqm4jaggob3iJvJmK+Q7Ns3WcfhuWwCnc1+58d
iFAMRUCRBPeFS0qd56QGk1r97B6+3UfLUslCfaaA8IMOFvQSHJwDO87xWGyxeRTY
IIP9up4xwgje9LB7fMxsSkCDTHOk
=s3DI
-----END PGP PUBLIC KEY BLOCK-----
EOF
echo "Running apt-get update ..."
apt-get update
cat <<EOF
You can now start installing packages from apt.postgresql.org.
Have a look at https://wiki.postgresql.org/wiki/Apt for more information;
most notably the FAQ at https://wiki.postgresql.org/wiki/Apt/FAQ
EOF
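The codename-detection fallbacks above (lsb_release, /etc/os-release, sources.list) can be exercised in isolation. The sketch below replicates the intended os-release extraction — note the `/p` print flag, which `sed -n` needs in order to emit the match at all — and shows why the comment flags this path as unreliable on Ubuntu. The `extract_codename` helper name is ours, purely for illustration:

```shell
# Replicates the sed extraction applied to /etc/os-release VERSION strings.
# Debian puts the codename in parentheses; Ubuntu does not, so the
# expression prints nothing and the script falls through to sources.list.
extract_codename() {
    echo "$1" | sed -ne 's/.*(\(.*\)).*/\1/p'
}

extract_codename '7.0 (wheezy)'            # prints "wheezy"
extract_codename '13.04, Raring Ringtail'  # prints nothing
```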


@@ -1,129 +0,0 @@
#!/bin/sh
# /etc/init.d/redash_supervisord
### BEGIN INIT INFO
# Provides: supervisord
# Required-Start: $remote_fs $syslog
# Required-Stop: $remote_fs $syslog
# Default-Start: 2 3 4 5
# Default-Stop: 0 1 6
# Short-Description: process supervisor
### END INIT INFO
# Author: Ron DuPlain <ron.duplain@gmail.com>
# Do NOT "set -e"
# PATH should only include /usr/* if it runs after the mountnfs.sh script
PATH=/sbin:/usr/sbin:/usr/local/sbin:/bin:/usr/bin:/usr/local/bin
NAME=supervisord
DESC="process supervisor"
DAEMON=/usr/local/bin/$NAME
DAEMON_ARGS="--configuration /opt/redash/supervisord/supervisord.conf"
PIDFILE=/opt/redash/supervisord/supervisord.pid
SCRIPTNAME=/etc/init.d/redash_supervisord
USER=redash
# Exit if the package is not installed
[ -x "$DAEMON" ] || exit 0
# Read configuration variable file if it is present
[ -r /etc/default/$NAME ] && . /etc/default/$NAME
# Load the VERBOSE setting and other rcS variables
. /lib/init/vars.sh
# Define LSB log_* functions.
# Depend on lsb-base (>= 3.2-14) to ensure that this file is present
# and status_of_proc is working.
. /lib/lsb/init-functions
#
# Function that starts the daemon/service
#
do_start()
{
    # Return
    #   0 if daemon has been started
    #   1 if daemon was already running
    #   2 if daemon could not be started
    start-stop-daemon --start --quiet --pidfile $PIDFILE --user $USER --chuid $USER --exec $DAEMON --test > /dev/null \
        || return 1
    start-stop-daemon --start --quiet --pidfile $PIDFILE --user $USER --chuid $USER --exec $DAEMON -- \
        $DAEMON_ARGS \
        || return 2
    # Add code here, if necessary, that waits for the process to be ready
    # to handle requests from services started subsequently which depend
    # on this one. As a last resort, sleep for some time.
}
#
# Function that stops the daemon/service
#
do_stop()
{
    # Return
    #   0 if daemon has been stopped
    #   1 if daemon was already stopped
    #   2 if daemon could not be stopped
    #   other if a failure occurred
    start-stop-daemon --stop --quiet --retry=TERM/30/KILL/5 --pidfile $PIDFILE --user $USER --chuid $USER --name $NAME
    RETVAL="$?"
    [ "$RETVAL" = 2 ] && return 2
    # Wait for children to finish too if this is a daemon that forks
    # and if the daemon is only ever run from this initscript.
    # If the above conditions are not satisfied then add some other code
    # that waits for the process to drop all resources that could be
    # needed by services started subsequently. A last resort is to
    # sleep for some time.
    start-stop-daemon --stop --quiet --oknodo --retry=0/30/KILL/5 --user $USER --chuid $USER --exec $DAEMON
    [ "$?" = 2 ] && return 2
    # Many daemons don't delete their pidfiles when they exit.
    rm -f $PIDFILE
    return "$RETVAL"
}
case "$1" in
    start)
        [ "$VERBOSE" != no ] && log_daemon_msg "Starting $DESC" "$NAME"
        do_start
        case "$?" in
            0|1) [ "$VERBOSE" != no ] && log_end_msg 0 ;;
            2)   [ "$VERBOSE" != no ] && log_end_msg 1 ;;
        esac
        ;;
    stop)
        [ "$VERBOSE" != no ] && log_daemon_msg "Stopping $DESC" "$NAME"
        do_stop
        case "$?" in
            0|1) [ "$VERBOSE" != no ] && log_end_msg 0 ;;
            2)   [ "$VERBOSE" != no ] && log_end_msg 1 ;;
        esac
        ;;
    status)
        status_of_proc "$DAEMON" "$NAME" && exit 0 || exit $?
        ;;
    restart)
        log_daemon_msg "Restarting $DESC" "$NAME"
        do_stop
        case "$?" in
            0|1)
                do_start
                case "$?" in
                    0) log_end_msg 0 ;;
                    1) log_end_msg 1 ;; # Old process is still running
                    *) log_end_msg 1 ;; # Failed to start
                esac
                ;;
            *)
                # Failed to stop
                log_end_msg 1
                ;;
        esac
        ;;
    *)
        echo "Usage: $SCRIPTNAME {start|stop|status|restart}" >&2
        exit 3
        ;;
esac
:
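The restart branch of the init script composes the LSB-style return codes of do_stop and do_start (0 = done, 1 = already in that state, 2 = failure). A minimal sketch with stubbed daemon functions traces that dispatch; the STOP_RC/START_RC knobs are ours, purely for illustration:

```shell
# Stub do_stop/do_start so the restart dispatch can be traced without a
# real daemon; return codes follow the LSB convention described above.
do_stop()  { return "${STOP_RC:-0}"; }
do_start() { return "${START_RC:-0}"; }

restart() {
    do_stop
    case "$?" in
        0|1)
            do_start
            case "$?" in
                0) echo "ok" ;;
                1) echo "old process still running" ;;
                *) echo "failed to start" ;;
            esac
            ;;
        *)
            echo "failed to stop"
            ;;
    esac
}

STOP_RC=0; START_RC=0; restart   # prints "ok"
STOP_RC=2; restart               # prints "failed to stop"
```

Note that a stop result of 1 (daemon already stopped) still proceeds to do_start, which is why `0|1)` share a branch.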

Some files were not shown because too many files have changed in this diff.