Compare commits

...

160 Commits

Author SHA1 Message Date
Loïc Mathieu
1bfdae7506 chore(version): upgrade to 0.19.25 2025-06-03 13:52:03 +02:00
YannC.
6a232bf7c6 chore(version): upgrade version to v0.19.24 2025-05-20 13:34:19 +02:00
Loïc Mathieu
f0acc68a48 fix(system)*: reset the trigger into the KafkaScheduler instead of the ExecutorMain 2025-05-19 12:04:54 +02:00
YannC.
18391f535f chore(version): upgrade to version v0.19.23 2025-04-15 15:38:12 +02:00
brian.mulier
c6c2c974ee feat(release): 0.19.22 2025-04-09 18:33:45 +02:00
brian.mulier
c008920845 fix(core)!: prevent failing execution in case of duplicate label upon inheritance 2025-04-09 18:33:24 +02:00
nKwiatkowski
ae2a4db153 feat(Unit Tests): add assertj dependency 2025-04-08 15:58:34 +02:00
brian.mulier
fde37e4b30 feat(release): 0.19.21 2025-04-08 14:21:20 +02:00
nKwiatkowski
705e834c8a fix(test): change EOL centOS docker image 2025-03-19 14:23:10 +01:00
YannC
8f123b1da5 fix(runner-memory): delete MemorySchedulerTriggerState back due to cherry-pick 2025-03-06 15:29:52 +01:00
YannC
259a9036b9 fix(): align to EE 2025-03-06 11:40:08 +01:00
nKwiatkowski
49fd7e411f fix(docker): docker qemu error 2025-02-24 15:57:05 +01:00
nKwiatkowski
3b5527b872 tmp(release): disable sonar 2025-02-24 14:06:18 +01:00
nKwiatkowski
c14d9573d3 feat(release): 0.19.20 2025-02-24 09:59:37 +01:00
Ludovic DEHON
2dff7be851 fix(tasks): remove useless format metrics on return (#7486) 2025-02-21 21:44:08 +01:00
Ludovic DEHON
9ceb633e97 fix(jdbc): batch query expand query and lead to overflow of metrics 2025-02-18 20:15:08 +01:00
Florian Hussonnois
434f0caf9a chore: upgrade to version 0.19.19 2025-01-28 18:30:51 +01:00
Florian Hussonnois
d00a8f549f fix(webserver): ensure queues are not closed in nioEventLoop 2025-01-27 14:48:25 +01:00
Miloš Paunović
c8a829081c fix(ui): save content to proper file using the namespace file editor (#6931) 2025-01-24 17:28:41 +01:00
YannC
86117b63b6 chore: upgrade to version v0.19.18 2025-01-24 10:11:12 +01:00
brian.mulier
f3bb05a6d5 chore(version): version 0.19.17 2025-01-16 15:43:02 +01:00
Loïc Mathieu
317cc02d77 feat(webserver, ui): avoid cancelled SSE connection from following exec
Send a fake "start" event from the Execution following endpoint so that the UI doesn't cancel it.

I'm not sure when the UI would cancel the SSE connection, but it can occur if any of the views that open an SSE connection are left before any event is received.
Sending a fake event immediately lowers the risk of this occurring.
2025-01-16 15:15:46 +01:00
Loïc Mathieu
b0ae3b643c fix(core, ui): send a "start" event to be sure the UI receive the SSE
The UI only stores a reference to the logs SSE when it receives the first event.
If a flow doesn't emit any log, or the logs tab is closed before any log is emitted, the UI will not have any reference to the SSE, so the SSE connection would stay alive forever.
Each SSE connection starts a thread via the logs queue, creating a thread leak.

Sending a first "start" event makes sure the UI has a reference to the SSE.
2025-01-16 15:15:24 +01:00
brian.mulier
338cc8c38c chore(version): version 0.19.16 2024-12-19 11:49:18 +01:00
YannC
9231c28bb5 chore: upgrade to version 0.19.15 2024-12-18 17:00:07 +01:00
YannC
5a722cdb6c fix(core): save flowable's output when flowable is child of another flowable (#6500)
close #6494
2024-12-18 16:52:07 +01:00
Loïc Mathieu
da47b05d61 chore(version): version 0.19.14 2024-12-13 12:28:18 +01:00
Loïc Mathieu
97d1267852 feat(core,jdbc): small trigger / scheduler improvements 2024-12-13 12:28:05 +01:00
YannC
afe41c1600 fix(ui): axios missing content type 2024-12-09 11:38:43 +01:00
YannC
a12fa5b82b chore: upgrade to version 0.19.13 2024-12-04 16:54:51 +01:00
Loïc Mathieu
b3376d2183 fix(core, webserver): properly close the queue on Flux.onFinally
Two fixes:
- close the queue onFinally instead of onComplete and onCancel, to take errors into account.
- close the queue onFinally in the execution creation, as it is currently only done on the success path and not via a Flux lifecycle method.

This may fix or improve some inconsistent behavior reported by users on the webserver.
2024-12-04 16:04:19 +01:00
Loïc Mathieu
dd6d8a7c22 fix(core): Correctly parse Content-Disposition in the Download task
Fixes #6270
2024-12-04 16:03:34 +01:00
YannC
cb3e73c651 fix(): better comparison of triggers (#6018) 2024-12-02 12:05:06 +01:00
Loïc Mathieu
f95f47ae0c feat(core): implements equals() and hashCode() for Property 2024-11-29 15:08:02 +01:00
Loïc Mathieu
e042a0a4b3 fix(core): deserialize duration from a float string 2024-11-29 15:07:54 +01:00
Loïc Mathieu
f92bfe9e98 Revert "feat(core): remove the execution state from the scheduler (#1588)"
This reverts commit f7d3d0bcd4.
2024-11-28 14:41:10 +01:00
brian.mulier
7ef33e35f1 chore: upgrade version to 0.19.12 2024-11-26 17:32:34 +01:00
Loïc Mathieu
bc27733149 fix(core): serialize duration as strings in Property
Fixes #5615
2024-11-25 16:03:56 +01:00
YannC
355543d4f7 chore: upgrade to version 0.19.11 2024-11-19 10:40:43 +01:00
Manoj Balaraj
12474e118a chore(ui): properly refresh gantt view after replaying the execution (#5829) 2024-11-14 14:21:56 +01:00
Loïc Mathieu
7c31e0306c chore: version 0.19.10 2024-11-12 14:16:14 +01:00
Loïc Mathieu
3aace06d31 chore(core,webserver): discard not-used completed part and reads in an IO thread 2024-11-06 17:40:47 +01:00
Loïc Mathieu
dcd697bcac chore(webserver): ensure all input streams are closed 2024-11-06 17:40:30 +01:00
Loïc Mathieu
5fc3f769da fix(core): If in a flowable task
Fixes #5812
2024-11-06 17:25:47 +01:00
YannC
0d52d9a6a9 chore: upgrade to version 0.19.9 2024-11-05 20:38:21 +01:00
YannC
85f11feace fix(core): Namespace file changes for git tasks (#5784) 2024-11-05 20:37:48 +01:00
YannC
523294ce8c fix(ui): encode filename when downloading it (#5760)
close #5589
2024-11-04 19:22:22 +01:00
YannC
b15bc9bacd fix(core): Windows path issues (#5763) 2024-11-04 18:28:39 +01:00
Malay Dewangan
74f28be32e fix(core): return flow outputs in execution API response when wait=true (#5531) 2024-11-04 18:28:33 +01:00
Loïc Mathieu
2a714791a1 fix(core): redo the If fix 2024-11-04 10:48:31 +01:00
Loïc Mathieu
f9a547ed63 fix(core): save the If evaluation condition in the output
This avoids re-evaluation on each child task run, which can be an issue if the taskrun modifies something that is part of the condition.

Fixes #5437
2024-11-04 10:48:11 +01:00
brian.mulier
a817970c1c chore: upgrade to version 0.19.8 2024-10-30 21:09:34 +01:00
brian.mulier
c9e8b7ea06 fix(ui): prevent duplicate input validations + avoid error spamming + send boolean inputs properly
closes #5731
2024-10-30 21:05:58 +01:00
brian.mulier
8f4a9e2dc8 chore: upgrade to version 0.19.7 2024-10-30 13:41:39 +01:00
brian.mulier
911ae32113 fix: better log display and combine temporal & taskruns view log navigation handling 2024-10-30 13:39:51 +01:00
brian.mulier
928214a22b fix: gantt view was no longer fetching logs 2024-10-30 13:39:48 +01:00
brian.mulier
d5eafa69aa chore: upgrade to version 0.19.6 2024-10-30 10:28:44 +01:00
Sachin
ae48352300 fix(ui): log navigation is now working in temporal view (#5685)
closes #5215
2024-10-30 10:23:58 +01:00
Loïc Mathieu
a59f758d28 feat(core,jdbc): add message protection metric 2024-10-25 12:24:23 +02:00
Mradul Vishwakarma
befbefbdd9 feat(ui): only show columns with data on triggers page (#5555) 2024-10-24 13:42:03 +02:00
Miloš Paunović
57c7389f9e fix(ui): prevent validation errors showing if inputs are empty (#5648) 2024-10-24 12:27:36 +02:00
AbdurRahman2004
427da64744 chore(ui): amend table link colors on main dashboard (#5638) 2024-10-23 23:12:49 +02:00
Abhishek Khairnar
430dc8ecee chore(ui): add background color to every namespace on listing (#5603) 2024-10-23 13:34:49 +02:00
YannC
2d9c98b921 chore: upgrade to version 0.19.5 2024-10-22 16:07:21 +02:00
YannC
a49b406f03 fix(core): encode filename (#5593)
close #5589
2024-10-22 16:06:04 +02:00
YannC
2a578fe651 fix(core): missing method 2024-10-22 15:58:48 +02:00
Loïc Mathieu
277bf77fb4 fix(core): serialize default inputs
Fixes https://github.com/kestra-io/kestra-ee/issues/1887
2024-10-22 09:09:07 +02:00
Harsh4902
e6ec59443a fixes #5459: used HashMap instead of Map to accept null values.
Signed-off-by: Harsh4902 <harshparmar4902@gmail.com>
2024-10-18 15:25:31 +02:00
Loïc Mathieu
b68b281ac0 fix(webserver): don't load the flow too early so a user with only EXECUTION permission can access execution files
Fixes https://github.com/kestra-io/kestra/issues/4958
2024-10-18 15:24:10 +02:00
Loïc Mathieu
37bf6ea8f3 fix(core): decrypt additional render variables
Fixes https://github.com/kestra-io/plugin-kubernetes/issues/150
2024-10-18 15:24:04 +02:00
brian.mulier
71a296a814 fix(core): better error message in case of docker socket not found / not accessible
closes #5524
2024-10-18 14:39:15 +02:00
brian.mulier
6d8bc07f5b fix(doc): CardLogos.vue had wrong URL
closes kestra-io/docs#1808
2024-10-18 14:39:11 +02:00
Florian Hussonnois
07974aa145 fix(core): fix inputs for execution resume (#5494)
fix: #5494
2024-10-18 11:54:05 +02:00
YannC
c7288bd325 chore: upgrade to version 0.19.4 2024-10-18 11:08:46 +02:00
YannC
96f553c1ba fix(ui): missing translation 2024-10-18 11:08:46 +02:00
YannC
8389102706 fix(core): correctly cast input to FLOAT for subflows (#5539)
close #5535
2024-10-18 09:29:44 +02:00
MilosPaunovic
16a0096c45 chore(ui): remove the unnecessary file for this version 2024-10-17 09:53:33 +02:00
Varsha U N
6327dcd51b chore(ui): increase the in-app docs scrollbar width (#5473) 2024-10-17 09:50:45 +02:00
Purandhar Adigarla
e91beaa15f chore(ui): add an extra space between icon and label inside a button (#5507)
Co-authored-by: PurandharAdigarla <purandharadigarla.com>
2024-10-17 09:06:10 +02:00
Sachin
833bdb38ee fix(ui): hide pagination when no flows results data available (#5501)
Co-authored-by: Sachin KS <mac@apples-MacBook-Air.local>
2024-10-17 09:05:58 +02:00
GitHub Action
0d64c74a67 chore(translations): auto generate values for languages other than english 2024-10-17 09:05:48 +02:00
Sachin
4740fa3628 chore(ui): update no logs message for flows source search
Co-authored-by: Sachin KS <mac@apples-MacBook-Air.local>
2024-10-17 09:04:38 +02:00
Loïc Mathieu
b29965c239 chore: version 0.19.3 2024-10-15 14:08:05 +02:00
Florian Hussonnois
05d1eeadef refactor(core): move to reactor for handling execution inputs (#5383)
related-to: #5383
2024-10-15 12:05:29 +02:00
Florian Hussonnois
acd2ce9041 fix(core): fix do not upload file when validating inputs (#5399)
Fix: #5399
2024-10-15 12:04:35 +02:00
Malay Dewangan
a3829c3d7e fix(core): OutputValues to support arrays and complex objects (#5440) 2024-10-14 14:06:10 +02:00
Sachin
17c18f94dd chore(ui): use standard graphs across all pages (#5443) 2024-10-14 12:43:35 +02:00
MITHIN DEV
14daa96295 chore(ui): make bar chart more responsive on smaller screens (#5439)
Signed-off-by: mithindev <mithindev1@gmail.com>
2024-10-14 12:43:29 +02:00
yuri
aa9aa80f0a feat(ui): allow searching read-only inputs (#5427) 2024-10-14 12:43:20 +02:00
Sachin
705d17340d fix(ui): prevent tab title change on cancelling unsaved changes (#5435)
Co-authored-by: Sachin KS <mac@apples-MacBook-Air.local>
2024-10-14 12:43:13 +02:00
Sachin
cf70c99e59 fix(ui): resolve issue preventing flow creation (#5444)
Co-authored-by: Sachin KS <mac@apples-MacBook-Air.local>
2024-10-14 12:43:07 +02:00
Miloš Paunović
c5e0cddca5 chore(ui): add filters for flow logs (#5446) 2024-10-14 12:42:59 +02:00
Loïc Mathieu
4d14464191 fix(cli): incorrect JDBC conf 2024-10-11 14:31:21 +02:00
Miloš Paunović
ed12797b46 chore(ui): check if tab exists before setting dirty attribute on it (#5423) 2024-10-11 12:36:02 +02:00
GitHub Action
ec85a748ce chore(translations): auto generate values for languages other than english 2024-10-11 11:50:00 +02:00
Mohammed Viqar Ahmed
3e8a63888a feat(ui): add option to delete multiple kv pairs at once (#5413) 2024-10-11 11:49:52 +02:00
Jonnadula Chaitanya
8d0bcc1da3 fix(ui): take default namespace into account on filters (#5406) 2024-10-11 11:37:04 +02:00
Loïc Mathieu
0b53f1cf25 feat(core): ForEachItem avoids checking each split file for existence by listing them 2024-10-10 18:01:34 +02:00
Loïc Mathieu
3621aad6a1 fix(core): incorrect duration metric computed on the Worker 2024-10-10 18:01:26 +02:00
Miloš Paunović
dbb1cc5007 fix(ui): amend duplication of the labels field on execution run (#5405) 2024-10-10 11:52:38 +02:00
Miloš Paunović
0d6e655b22 chore(ui): filter out system namespace from namespace select filter (#5403) 2024-10-10 10:57:30 +02:00
riya mustare
7a1a180fdb Pre-fill namespace from current filter (#5398) 2024-10-10 09:42:22 +02:00
Miloš Paunović
ce2daf52ff fix(ui): make sure disable toggle for triggers of next executions works every time (#5397) 2024-10-10 09:08:56 +02:00
Mohammed Viqar Ahmed
f086da3a2a chore(docs): add section about javascript memory heap out error in contributing guide (#5392)
* CONTRIBUTING.md: adding node options variable

* chore(ui): improve wording of contribution guide

---------

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2024-10-10 08:44:19 +02:00
Ahmad Midlaj B
1886a443c7 fix(ui): make execution replay dialog description readable in dark theme (#5360)
* fix(ui): make text readable in dark theme

* chore(ui): revert package-lock.json file to initial state

---------

Co-authored-by: MilosPaunovic <paun992@hotmail.com>
2024-10-10 08:42:28 +02:00
riya mustare
5a4e2b791d aligned relative filter (#5388) 2024-10-10 08:42:15 +02:00
Miloš Paunović
a595cecb3d fix(ui): revert icon coloring tweak (#5393) 2024-10-09 19:05:47 +02:00
Miloš Paunović
472b699ca7 fix(ui): amend dark mode icon color on plugins page (#5385) 2024-10-09 14:40:47 +02:00
Harshvardhan Parmar
f55f52b43a fix(ui): prevent overwriting the flow after save from topology (#5361)
Signed-off-by: Harsh4902 <harshparmar4902@gmail.com>
2024-10-09 11:09:42 +02:00
Miloš Paunović
c796308839 chore(ui): make sidebar toggle more prominent (#5357) 2024-10-09 08:40:01 +02:00
MilosPaunovic
37a880164d chore(translations) add missing key/value pair 2024-10-09 08:35:40 +02:00
GitHub Action
5f1408c560 chore(translations): auto generate values for languages other than english 2024-10-09 08:20:12 +02:00
Sai Mounika Peri
4186900fdb feat(ui): add a tooltip over flow triggers tab if empty (#5358)
Co-authored-by: Will Russell <will@wrussell.co.uk>
2024-10-09 08:15:04 +02:00
Florian Hussonnois
4338437a6f chore: update version to v0.19.2 2024-10-08 15:52:48 +02:00
MilosPaunovic
68ee5e4df0 chore(translations): add missing key in english language file 2024-10-08 15:52:48 +02:00
Loïc Mathieu
2def5cf7f8 fix(jdbc): always include deleted in the logs and metrics queries
Even if not needed, to be sure we use the correct index.
2024-10-08 13:11:51 +02:00
Florian Hussonnois
d184858abf feat(core): move service usages 2024-10-08 11:02:11 +02:00
Sachin
dfa5875fa1 feat(ui): add chart visibility toggle to flows and logs page (#5345)
Co-authored-by: Sachin KS <mac@apples-MacBook-Air.local>
2024-10-08 10:08:33 +02:00
Sachin
ac4f7f261d fix(ui): amend translation keys usage (#5346)
Co-authored-by: Sachin KS <mac@apples-MacBook-Air.local>
2024-10-08 09:48:38 +02:00
GitHub Action
ae55685d2e chore(translations): auto generate values for languages other than english 2024-10-08 09:26:20 +02:00
Sai Mounika Peri
dd34317e4f feat(ui): improve page shown when flow has no dependencies (#5340) 2024-10-08 09:26:11 +02:00
riya mustare
f95e3073dd chore(ui): reduced line height on input description (#5344) 2024-10-08 09:18:03 +02:00
Florian Hussonnois
9f20988997 fix(core): use tenant for resolving worker groups 2024-10-07 14:16:00 +02:00
Sachin
5da3ab4f71 fix(ui): add bottom border on debug outputs (#5334)
Co-authored-by: Sachin KS <mac@apples-MacBook-Air.local>
2024-10-07 13:06:30 +02:00
Sachin
243eaab826 fix(ui): prevent removal of empty fields in metadata editor (#5313)
Co-authored-by: Sachin KS <mac@apples-MacBook-Air.local>
2024-10-07 11:25:37 +02:00
Sachin
6d362d688d fix(ui): amend flow disable from low code editor (#5315)
Co-authored-by: Sachin KS <mac@apples-MacBook-Air.local>
2024-10-07 11:20:28 +02:00
brian.mulier
39a01e0e7d fix(core): windows backslashes in paths were leading to wrong URI being created leading to error upon execution deletion 2024-10-07 11:19:35 +02:00
Sachin
a44b2ef7cb fix(ui): persisting flow metadata from low code editor (#5316)
Co-authored-by: Sachin KS <mac@apples-MacBook-Air.local>
2024-10-07 11:15:16 +02:00
Sachin
6bcad13444 feat(ui): added executions tab to single namespace (#5322)
Co-authored-by: Sachin KS <mac@apples-MacBook-Air.local>
2024-10-07 11:05:02 +02:00
Antoine Gauthier
02acf01ea5 chore(ui): update button conditions based on flow states (#5319) 2024-10-07 10:39:06 +02:00
Sai Mounika Peri
55193361b8 chore(ui): improve validation for kv store (#5321)
* Validation error of previous type should be cleared once the KV type is changed

* chore(ui): remove comment as code is self-explanatory

---------

Co-authored-by: MilosPaunovic <paun992@hotmail.com>
2024-10-07 09:28:38 +02:00
brian.mulier
8d509a3ba5 fix(core): path matcher for windows were not working 2024-10-04 19:41:30 +02:00
GitHub Action
500680bcf7 chore(translations): auto generate values for languages other than english 2024-10-04 15:47:10 +02:00
Miloš Paunović
412c27cb12 chore(ui): improve the dashboard ratios calculation (#5311) 2024-10-04 15:46:59 +02:00
Sachin
8d7d9a356f chore(ui): use improved chart for flow executions (#5309)
* Replace the Flows Execution barchart with the barchart used on the main dashboard

* chore(ui): added bottom margin

---------

Co-authored-by: Sachin KS <mac@apples-MacBook-Air.local>
Co-authored-by: MilosPaunovic <paun992@hotmail.com>
2024-10-04 15:01:14 +02:00
Miloš Paunović
d2ab2e97b4 fix(ui): prevent cases where dashboard totals shows nan instead of value (#5308) 2024-10-04 11:01:41 +02:00
Miloš Paunović
6a0f360fc6 fix(ui): amend end date on dashboard refresh (#5303) 2024-10-04 09:14:07 +02:00
Vivek Gangwani
0484fd389a chore(ui): making the color scheme the same for gantt and topology (#5280) 2024-10-04 09:13:14 +02:00
Miloš Paunović
e92aac3b39 chore(ui): re-calculate translation strings for left menu after language change (#5302) 2024-10-04 08:04:02 +02:00
Miloš Paunović
39b8ac8804 chore(ci): add check for translation keys matching (#5301) 2024-10-04 07:37:15 +02:00
Miloš Paunović
f928ed5876 chore(ui): uniform translation keys across languages (#5298) 2024-10-04 07:37:06 +02:00
Miloš Paunović
54856af0a8 fix(ui): amend logs scrolling for the last task (#5294) 2024-10-03 16:28:02 +02:00
MilosPaunovic
8bd79e82ab chore(ci): exit workflow with success if no changes are present 2024-10-03 16:27:53 +02:00
MilosPaunovic
104a491b92 chore(ci): separate direct pull requests and the ones from forked repositories 2024-10-03 16:27:44 +02:00
MilosPaunovic
5f46a0dd16 chore(ci): expose paste to editor function globally for testing 2024-10-03 16:27:35 +02:00
Loïc Mathieu
24c3703418 fix(core): hide secret inputs in logs
Fixes #5259
2024-10-03 10:34:27 +02:00
yuri
e5af245855 fix(ui): enable keyboard shortcut to launch execution (#5288) 2024-10-03 08:19:06 +02:00
Vivek Gangwani
d58e8f98a2 fix (ui): Unable to unselect the currently chosen log level (#5287)
* Update root.scss to Fix Topology View for Light Mode

* Update root-dark.scss to unify Gantt and Topology View Colors

* Added deselect button for Log Levels
2024-10-03 08:18:49 +02:00
MilosPaunovic
ce2f1bfdb3 chore(ui): uniform using import class 2024-10-02 15:15:48 +02:00
Miloš Paunović
b619f88eff chore(ci): generate translation values as a commit to existing pull request (#5278) 2024-10-02 12:48:39 +02:00
Sai Mounika Peri
1f1775752b chore(ui): update parent from metadata editor (#5265) 2024-10-02 11:10:03 +02:00
AbdurRahman2004
b2475e53a2 chore(ui): move the delete logs button to top (#5266)
* Move 'Delete logs' button to top right corner of navigation

---------

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2024-10-02 10:49:59 +02:00
Antoine Gauthier
7e8956a0b7 fix(ui): amend typos in french translations (#5272) 2024-10-02 10:48:14 +02:00
brian.mulier
6537ee984b chore(version): update to version 'v0.19.1'. 2024-10-01 22:32:48 +02:00
brian.mulier
573aa48237 fix(ci): add back datahub plugin to ci build 2024-10-01 22:32:07 +02:00
brian.mulier
66ddeaa219 chore(version): update to version 'v0.19.0'. 2024-10-01 18:15:40 +02:00
brian.mulier
02c5e8a1a2 fix(ci): remove datahub plugin for now as it's not finished 2024-10-01 18:15:40 +02:00
brian.mulier
733c7897b9 fix(ci): restore github release on main workflow in case of skipped e2e 2024-10-01 15:33:36 +02:00
brian.mulier
c051287688 fix(ci): publish maven even if E2E were skipped 2024-10-01 14:26:02 +02:00
brian.mulier
1af8de6bce fix(ci): no more docker build & E2E for tags build 2024-10-01 13:43:35 +02:00
160 changed files with 2522 additions and 830 deletions


@@ -52,14 +52,17 @@ The backend is made with [Micronaut](https://micronaut.io).
Open the cloned repository in your favorite IDE. In most decent IDEs, the Gradle build will be detected and all dependencies will be downloaded.
You can also build it from a terminal using `./gradlew build`; the Gradle wrapper will download the right Gradle version to use.
- You may need to enable java annotation processors since we are using it a lot.
- The main class is `io.kestra.cli.App` from module `kestra.cli.main`
- Pass as program arguments the server you want to develop, for example `server local` will start the [standalone local](https://kestra.io/docs/administrator-guide/server-cli#kestra-local-development-server-with-no-dependencies)
- ![Intellij Idea Configuration ](https://user-images.githubusercontent.com/2064609/161399626-1b681add-cfa8-4e0e-a843-2631cc59758d.png) Intellij Idea configuration can be found in screenshot below.
- `MICRONAUT_ENVIRONMENTS`: can be set any string and will load a custom configuration file in `cli/src/main/resources/application-{env}.yml`
- `KESTRA_PLUGINS_PATH`: is the path where you will save plugins as Jar and will be load on the startup.
- You can also use the gradle task `./gradlew runLocal` that will run a standalone server with `MICRONAUT_ENVIRONMENTS=override` and plugins path `local/plugins`
- The server start by default on port 8080 and is reachable on `http://localhost:8080`
- You may need to enable java annotation processors since we are using them.
- On IntelliJ IDEA, click on **Run -> Edit Configurations -> + Add new Configuration** to create a run configuration to start Kestra.
- The main class is `io.kestra.cli.App` from module `kestra.cli.main`.
- Pass as program arguments the server you want to work with, for example `server local` will start the [standalone local](https://kestra.io/docs/administrator-guide/server-cli#kestra-local-development-server-with-no-dependencies). You can also use `server standalone` and use the provided `docker-compose-ci.yml` Docker compose file to start a standalone server with a real database as a backend that would need to be configured properly.
- Configure the following environment variables:
- `MICRONAUT_ENVIRONMENTS`: can be set to any string and will load a custom configuration file from `cli/src/main/resources/application-{env}.yml`.
- `KESTRA_PLUGINS_PATH`: the path where you save plugins as JARs; they will be loaded on startup.
- See the screenshot below for an example: ![Intellij IDEA Configuration ](run-app.png)
- If you encounter a **JavaScript heap out of memory** error during startup, set the `NODE_OPTIONS` environment variable to a larger value.
- For example, `NODE_OPTIONS: --max-old-space-size=4096` or `NODE_OPTIONS: --max-old-space-size=8192` ![Intellij IDEA Configuration ](node_option_env_var.png)
- The server starts by default on port 8080 and is reachable at `http://localhost:8080`
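The environment-variable setup described in the contributing guide can be sketched as a shell session. This is only a sketch: the values shown (`override`, `local/plugins`, the heap size) are the examples the guide mentions, and the paths assume a standard clone of the repository:

```shell
# Sketch of a local dev environment for running Kestra from source.
export MICRONAUT_ENVIRONMENTS=override            # loads cli/src/main/resources/application-override.yml
export KESTRA_PLUGINS_PATH="$PWD/local/plugins"   # plugin JARs picked up at startup
export NODE_OPTIONS="--max-old-space-size=4096"   # avoids the JavaScript heap out-of-memory error
# ./gradlew runLocal                              # would start the server on http://localhost:8080
```

With these variables exported, `./gradlew runLocal` uses the `override` configuration and the `local/plugins` directory without any extra IDE setup.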
If you want to launch all tests, you need Python and some packages installed on your machine, on Ubuntu you can install them with:

BIN .github/node_option_env_var.png — new binary file, 130 KiB (not shown)


@@ -77,6 +77,11 @@ jobs:
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Docker - Fix Qemu
shell: bash
run: |
docker run --rm --privileged multiarch/qemu-user-static --reset -p yes -c yes
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3


@@ -1,45 +1,111 @@
name: Generate Translations
on:
pull_request:
types: [opened, synchronize]
paths:
- "ui/src/translations/en.json"
push:
branches:
- develop
paths:
- 'ui/src/translations/en.json'
env:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
jobs:
generate-translations:
name: Generate Translations and Create PR
commit:
name: Commit directly to PR
runs-on: ubuntu-latest
if: ${{ github.event.pull_request.head.repo.fork == false }}
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 10 # Ensures that at least 10 commits are fetched for comparison
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 10
ref: ${{ github.head_ref }}
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: '3.x'
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.x"
- name: Install dependencies
run: pip install gitpython openai
- name: Install Python dependencies
run: pip install gitpython openai
- name: Generate translations
run: python ui/src/translations/generate_translations.py
- name: Generate translations
run: python ui/src/translations/generate_translations.py
- name: Commit, push changes, and create PR
env:
GH_TOKEN: ${{ github.token }}
run: |
git config --global user.name "GitHub Action"
git config --global user.email "actions@github.com"
BRANCH_NAME="translations/update-translations-$(date +%s)"
git checkout -b $BRANCH_NAME
git add ui/src/translations/*.json
git commit -m "Auto-generate translations from en.json"
git push --set-upstream origin $BRANCH_NAME
gh pr create --title "Auto-generate translations from en.json" --body "This PR was created automatically by a GitHub Action." --base develop --head $BRANCH_NAME --assignee anna-geller --reviewer anna-geller
- name: Set up Node
uses: actions/setup-node@v4
with:
node-version: "20.x"
- name: Check keys matching
run: node ui/src/translations/check.js
- name: Set up Git
run: |
git config --global user.name "GitHub Action"
git config --global user.email "actions@github.com"
- name: Check for changes and commit
env:
GH_TOKEN: ${{ github.token }}
run: |
git add ui/src/translations/*.json
if git diff --cached --quiet; then
echo "No changes to commit. Exiting with success."
exit 0
fi
git commit -m "chore(translations): auto generate values for languages other than english"
git push origin ${{ github.head_ref }}
pull_request:
name: Open PR for a forked repository
runs-on: ubuntu-latest
if: ${{ github.event.pull_request.head.repo.fork == true }}
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 10
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.x"
- name: Install Python dependencies
run: pip install gitpython openai
- name: Generate translations
run: python ui/src/translations/generate_translations.py
- name: Set up Node
uses: actions/setup-node@v4
with:
node-version: "20.x"
- name: Check keys matching
run: node ui/src/translations/check.js
- name: Set up Git
run: |
git config --global user.name "GitHub Action"
git config --global user.email "actions@github.com"
- name: Create and push a new branch
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
BRANCH_NAME="generated-translations-${{ github.event.pull_request.head.repo.name }}"
git checkout -b $BRANCH_NAME
git add ui/src/translations/*.json
if git diff --cached --quiet; then
echo "No changes to commit. Exiting with success."
exit 0
fi
git commit -m "chore(translations): auto generate values for languages other than english"
git push origin $BRANCH_NAME
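Both jobs rely on the same "commit only if there are staged changes" guard: `git diff --cached --quiet` exits 0 when the index matches `HEAD`, so the step can bail out successfully instead of failing on an empty commit. A minimal sketch of that guard in a throwaway repository (file name and identity are illustrative, not taken from the workflow):

```shell
# Demonstrates the `git diff --cached --quiet` guard used in the workflow.
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo && cd repo
git config user.email "ci@example.com"   # illustrative identity
git config user.name  "CI"
echo '{"hello": "world"}' > en.json      # illustrative translation file
git add en.json
if git diff --cached --quiet; then
  echo "No changes to commit. Exiting with success."
else
  git commit -q -m "chore(translations): auto generate values for languages other than english"
fi
```

Because a file was staged, the guard falls through to the commit; run it again with nothing staged and the first branch prints the message and exits cleanly.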


@@ -68,6 +68,7 @@ jobs:
# Get Plugins List
- name: Get Plugins List
uses: ./.github/actions/plugins-list
if: "!startsWith(github.ref, 'refs/tags/v')"
id: plugins-list
with:
plugin-version: ${{ env.PLUGIN_VERSION }}
@@ -75,6 +76,7 @@ jobs:
# Set Plugins List
- name: Set Plugin List
id: plugins
if: "!startsWith(github.ref, 'refs/tags/v')"
run: |
PLUGINS="${{ steps.plugins-list.outputs.plugins }}"
TAG=${GITHUB_REF#refs/*/}
@@ -122,6 +124,7 @@ jobs:
# Docker Build
- name: Build & Export Docker Image
uses: docker/build-push-action@v6
if: "!startsWith(github.ref, 'refs/tags/v')"
with:
context: .
push: false
@@ -149,6 +152,7 @@ jobs:
- name: Upload Docker
uses: actions/upload-artifact@v4
if: "!startsWith(github.ref, 'refs/tags/v')"
with:
name: ${{ steps.vars.outputs.artifact }}
path: /tmp/${{ steps.vars.outputs.artifact }}.tar
@@ -156,7 +160,7 @@ jobs:
check-e2e:
name: Check E2E Tests
needs: build-artifacts
if: ${{ github.event.inputs.skip-test == 'false' || github.event.inputs.skip-test == '' }}
if: ${{ (github.event.inputs.skip-test == 'false' || github.event.inputs.skip-test == '') && !startsWith(github.ref, 'refs/tags/v') }}
uses: ./.github/workflows/e2e.yml
strategy:
fail-fast: false
@@ -214,13 +218,13 @@ jobs:
export GOOGLE_APPLICATION_CREDENTIALS=$HOME/.gcp-service-account.json
./gradlew check javadoc --parallel
# Sonar
- name: Analyze with Sonar
if: ${{ env.SONAR_TOKEN != 0 && (github.event.inputs.skip-test == 'false' || github.event.inputs.skip-test == '') }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
run: ./gradlew sonar --info
# # Sonar
# - name: Analyze with Sonar
# if: ${{ env.SONAR_TOKEN != 0 && (github.event.inputs.skip-test == 'false' || github.event.inputs.skip-test == '') }}
# env:
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
# SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
# run: ./gradlew sonar --info
# Allure check
- name: Auth to Google Cloud
@@ -276,7 +280,11 @@ jobs:
name: Github Release
runs-on: ubuntu-latest
needs: [ check, check-e2e ]
if: startsWith(github.ref, 'refs/tags/v')
if: |
always() &&
startsWith(github.ref, 'refs/tags/v') &&
needs.check.result == 'success' &&
(needs.check-e2e.result == 'skipped' || needs.check-e2e.result == 'success')
steps:
# Download Exec
- name: Download executable
@@ -368,7 +376,11 @@ jobs:
name: Publish to Maven
runs-on: ubuntu-latest
needs: [check, check-e2e]
if: github.ref == 'refs/heads/develop' || startsWith(github.ref, 'refs/tags/v')
if: |
always() &&
github.ref == 'refs/heads/develop' || startsWith(github.ref, 'refs/tags/v') &&
needs.check.result == 'success' &&
(needs.check-e2e.result == 'skipped' || needs.check-e2e.result == 'success')
steps:
- uses: actions/checkout@v4
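The release-gating conditions combine `always()` with explicit checks on `needs.*.result`, so a skipped E2E job no longer blocks a tag release. Note that in GitHub Actions expressions `&&` binds tighter than `||`, so an unparenthesized `A || B && C` evaluates as `A || (B && C)`. The intended gating logic can be sketched in shell with illustrative values standing in for `needs.check.result` and `needs.check-e2e.result`:

```shell
# Illustrative values; in the workflow these come from needs.*.result.
check_result="success"
check_e2e_result="skipped"   # E2E skipped, e.g. for a tag build
if [ "$check_result" = "success" ] \
   && { [ "$check_e2e_result" = "skipped" ] || [ "$check_e2e_result" = "success" ]; }; then
  gate="release allowed"
else
  gate="release blocked"
fi
echo "$gate"   # → release allowed
```

The braces group the `skipped`/`success` alternatives before the `&&`, mirroring the parentheses in the workflow's `if:` expression.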


@@ -179,6 +179,8 @@ subprojects {
testImplementation 'org.hamcrest:hamcrest'
testImplementation 'org.hamcrest:hamcrest-library'
testImplementation 'org.exparity:hamcrest-date'
testImplementation 'org.assertj:assertj-core'
}
test {


@@ -124,6 +124,7 @@ kestra:
delay: 1s
maxDelay: ""
jdbc:
queues:
min-poll-interval: 25ms
max-poll-interval: 1000ms


@@ -19,58 +19,60 @@ import org.apache.commons.lang3.ArrayUtils;
@Singleton
@Slf4j
public class MetricRegistry {
public final static String METRIC_WORKER_JOB_PENDING_COUNT = "worker.job.pending";
public final static String METRIC_WORKER_JOB_RUNNING_COUNT = "worker.job.running";
public final static String METRIC_WORKER_JOB_THREAD_COUNT = "worker.job.thread";
public final static String METRIC_WORKER_RUNNING_COUNT = "worker.running.count";
public final static String METRIC_WORKER_QUEUED_DURATION = "worker.queued.duration";
public final static String METRIC_WORKER_STARTED_COUNT = "worker.started.count";
public final static String METRIC_WORKER_TIMEOUT_COUNT = "worker.timeout.count";
public final static String METRIC_WORKER_ENDED_COUNT = "worker.ended.count";
public final static String METRIC_WORKER_ENDED_DURATION = "worker.ended.duration";
public final static String METRIC_WORKER_TRIGGER_DURATION = "worker.trigger.duration";
public final static String METRIC_WORKER_TRIGGER_RUNNING_COUNT = "worker.trigger.running.count";
public final static String METRIC_WORKER_TRIGGER_STARTED_COUNT = "worker.trigger.started.count";
public final static String METRIC_WORKER_TRIGGER_ENDED_COUNT = "worker.trigger.ended.count";
public final static String METRIC_WORKER_TRIGGER_ERROR_COUNT = "worker.trigger.error.count";
public final static String METRIC_WORKER_TRIGGER_EXECUTION_COUNT = "worker.trigger.execution.count";
public static final String METRIC_WORKER_JOB_PENDING_COUNT = "worker.job.pending";
public static final String METRIC_WORKER_JOB_RUNNING_COUNT = "worker.job.running";
public static final String METRIC_WORKER_JOB_THREAD_COUNT = "worker.job.thread";
public static final String METRIC_WORKER_RUNNING_COUNT = "worker.running.count";
public static final String METRIC_WORKER_QUEUED_DURATION = "worker.queued.duration";
public static final String METRIC_WORKER_STARTED_COUNT = "worker.started.count";
public static final String METRIC_WORKER_TIMEOUT_COUNT = "worker.timeout.count";
public static final String METRIC_WORKER_ENDED_COUNT = "worker.ended.count";
public static final String METRIC_WORKER_ENDED_DURATION = "worker.ended.duration";
public static final String METRIC_WORKER_TRIGGER_DURATION = "worker.trigger.duration";
public static final String METRIC_WORKER_TRIGGER_RUNNING_COUNT = "worker.trigger.running.count";
public static final String METRIC_WORKER_TRIGGER_STARTED_COUNT = "worker.trigger.started.count";
public static final String METRIC_WORKER_TRIGGER_ENDED_COUNT = "worker.trigger.ended.count";
public static final String METRIC_WORKER_TRIGGER_ERROR_COUNT = "worker.trigger.error.count";
public static final String METRIC_WORKER_TRIGGER_EXECUTION_COUNT = "worker.trigger.execution.count";
public final static String EXECUTOR_TASKRUN_NEXT_COUNT = "executor.taskrun.next.count";
public final static String EXECUTOR_TASKRUN_ENDED_COUNT = "executor.taskrun.ended.count";
public final static String EXECUTOR_TASKRUN_ENDED_DURATION = "executor.taskrun.ended.duration";
public final static String EXECUTOR_WORKERTASKRESULT_COUNT = "executor.workertaskresult.count";
public final static String EXECUTOR_EXECUTION_STARTED_COUNT = "executor.execution.started.count";
public final static String EXECUTOR_EXECUTION_END_COUNT = "executor.execution.end.count";
public final static String EXECUTOR_EXECUTION_DURATION = "executor.execution.duration";
public static final String EXECUTOR_TASKRUN_NEXT_COUNT = "executor.taskrun.next.count";
public static final String EXECUTOR_TASKRUN_ENDED_COUNT = "executor.taskrun.ended.count";
public static final String EXECUTOR_TASKRUN_ENDED_DURATION = "executor.taskrun.ended.duration";
public static final String EXECUTOR_WORKERTASKRESULT_COUNT = "executor.workertaskresult.count";
public static final String EXECUTOR_EXECUTION_STARTED_COUNT = "executor.execution.started.count";
public static final String EXECUTOR_EXECUTION_END_COUNT = "executor.execution.end.count";
public static final String EXECUTOR_EXECUTION_DURATION = "executor.execution.duration";
public final static String METRIC_INDEXER_REQUEST_COUNT = "indexer.request.count";
public final static String METRIC_INDEXER_REQUEST_DURATION = "indexer.request.duration";
public final static String METRIC_INDEXER_REQUEST_RETRY_COUNT = "indexer.request.retry.count";
public final static String METRIC_INDEXER_SERVER_DURATION = "indexer.server.duration";
public final static String METRIC_INDEXER_MESSAGE_FAILED_COUNT = "indexer.message.failed.count";
public final static String METRIC_INDEXER_MESSAGE_IN_COUNT = "indexer.message.in.count";
public final static String METRIC_INDEXER_MESSAGE_OUT_COUNT = "indexer.message.out.count";
public static final String METRIC_INDEXER_REQUEST_COUNT = "indexer.request.count";
public static final String METRIC_INDEXER_REQUEST_DURATION = "indexer.request.duration";
public static final String METRIC_INDEXER_REQUEST_RETRY_COUNT = "indexer.request.retry.count";
public static final String METRIC_INDEXER_SERVER_DURATION = "indexer.server.duration";
public static final String METRIC_INDEXER_MESSAGE_FAILED_COUNT = "indexer.message.failed.count";
public static final String METRIC_INDEXER_MESSAGE_IN_COUNT = "indexer.message.in.count";
public static final String METRIC_INDEXER_MESSAGE_OUT_COUNT = "indexer.message.out.count";
public final static String SCHEDULER_LOOP_COUNT = "scheduler.loop.count";
public final static String SCHEDULER_TRIGGER_COUNT = "scheduler.trigger.count";
public final static String SCHEDULER_TRIGGER_DELAY_DURATION = "scheduler.trigger.delay.duration";
public final static String SCHEDULER_EVALUATE_COUNT = "scheduler.evaluate.count";
public final static String SCHEDULER_EXECUTION_RUNNING_DURATION = "scheduler.execution.running.duration";
public final static String SCHEDULER_EXECUTION_MISSING_DURATION = "scheduler.execution.missing.duration";
public static final String SCHEDULER_LOOP_COUNT = "scheduler.loop.count";
public static final String SCHEDULER_TRIGGER_COUNT = "scheduler.trigger.count";
public static final String SCHEDULER_TRIGGER_DELAY_DURATION = "scheduler.trigger.delay.duration";
public static final String SCHEDULER_EVALUATE_COUNT = "scheduler.evaluate.count";
public static final String SCHEDULER_EXECUTION_RUNNING_DURATION = "scheduler.execution.running.duration";
public static final String SCHEDULER_EXECUTION_MISSING_DURATION = "scheduler.execution.missing.duration";
public final static String STREAMS_STATE_COUNT = "stream.state.count";
public static final String STREAMS_STATE_COUNT = "stream.state.count";
public static final String JDBC_QUERY_DURATION = "jdbc.query.duration";
public final static String JDBC_QUERY_DURATION = "jdbc.query.duration";
public static final String QUEUE_BIG_MESSAGE_COUNT = "queue.big_message.count";
public final static String TAG_TASK_TYPE = "task_type";
public final static String TAG_TRIGGER_TYPE = "trigger_type";
public final static String TAG_FLOW_ID = "flow_id";
public final static String TAG_NAMESPACE_ID = "namespace_id";
public final static String TAG_STATE = "state";
public final static String TAG_ATTEMPT_COUNT = "attempt_count";
public final static String TAG_WORKER_GROUP = "worker_group";
public final static String TAG_TENANT_ID = "tenant_id";
public static final String TAG_TASK_TYPE = "task_type";
public static final String TAG_TRIGGER_TYPE = "trigger_type";
public static final String TAG_FLOW_ID = "flow_id";
public static final String TAG_NAMESPACE_ID = "namespace_id";
public static final String TAG_STATE = "state";
public static final String TAG_ATTEMPT_COUNT = "attempt_count";
public static final String TAG_WORKER_GROUP = "worker_group";
public static final String TAG_TENANT_ID = "tenant_id";
public static final String TAG_CLASS_NAME = "class_name";
@Inject
private MeterRegistry meterRegistry;

View File

@@ -0,0 +1,150 @@
package io.kestra.core.models.collectors;
import com.google.common.annotations.VisibleForTesting;
import io.kestra.core.repositories.ServiceInstanceRepositoryInterface;
import io.kestra.core.server.Service;
import io.kestra.core.server.ServiceInstance;
import java.math.BigDecimal;
import java.math.RoundingMode;
import java.time.Duration;
import java.time.Instant;
import java.time.LocalDate;
import java.time.ZoneId;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.LongSummaryStatistics;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.Stream;
/**
* Statistics about the number of running services over a given period.
*/
public record ServiceUsage(
List<DailyServiceStatistics> dailyStatistics
) {
/**
* Daily statistics for a specific service type.
*
* @param type The service type.
* @param values The statistic values.
*/
public record DailyServiceStatistics(
String type,
List<DailyStatistics> values
) {
}
/**
* Statistics about the number of services running at any given time interval (e.g., 15 minutes) over a day.
*
* @param date The {@link LocalDate}.
* @param min The minimum number of services.
* @param max The maximum number of services.
* @param avg The average number of services.
*/
public record DailyStatistics(
LocalDate date,
long min,
long max,
long avg
) {
}
public static ServiceUsage of(final Instant from,
final Instant to,
final ServiceInstanceRepositoryInterface repository,
final Duration interval) {
List<DailyServiceStatistics> statistics = Arrays
.stream(Service.ServiceType.values())
.map(type -> of(from, to, repository, type, interval))
.toList();
return new ServiceUsage(statistics);
}
private static DailyServiceStatistics of(final Instant from,
final Instant to,
final ServiceInstanceRepositoryInterface repository,
final Service.ServiceType serviceType,
final Duration interval) {
return of(serviceType, interval, repository.findAllInstancesBetween(serviceType, from, to));
}
@VisibleForTesting
static DailyServiceStatistics of(final Service.ServiceType serviceType,
final Duration interval,
final List<ServiceInstance> instances) {
// Compute the number of running service per time-interval.
final long timeIntervalInMillis = interval.toMillis();
final Map<Long, Long> aggregatePerTimeIntervals = instances
.stream()
.flatMap(instance -> {
List<ServiceInstance.TimestampedEvent> events = instance.events();
long start = 0;
long end = 0;
for (ServiceInstance.TimestampedEvent event : events) {
long epochMilli = event.ts().toEpochMilli();
if (event.state().equals(Service.ServiceState.RUNNING)) {
start = epochMilli;
}
else if (event.state().equals(Service.ServiceState.NOT_RUNNING) && end == 0) {
end = epochMilli;
}
else if (event.state().equals(Service.ServiceState.TERMINATED_GRACEFULLY)) {
end = epochMilli; // more precise than NOT_RUNNING
}
else if (event.state().equals(Service.ServiceState.TERMINATED_FORCED)) {
end = epochMilli; // more precise than NOT_RUNNING
}
}
if (instance.state().equals(Service.ServiceState.RUNNING)) {
end = Instant.now().toEpochMilli();
}
if (start != 0 && end != 0) {
// align to epoch-time by removing precision.
start = (start / timeIntervalInMillis) * timeIntervalInMillis;
// approximate the number of time interval for the current service
int intervals = (int) ((end - start) / timeIntervalInMillis);
// compute all time intervals
List<Long> keys = new ArrayList<>(intervals);
while (start < end) {
keys.add(start);
start = start + timeIntervalInMillis; // Next window
}
return keys.stream();
}
return Stream.empty(); // invalid service
})
.collect(Collectors.groupingBy(Function.identity(), Collectors.counting()));
// Aggregate per day
List<DailyStatistics> dailyStatistics = aggregatePerTimeIntervals.entrySet()
.stream()
.collect(Collectors.groupingBy(entry -> {
Long epochTimeMilli = entry.getKey();
return Instant.ofEpochMilli(epochTimeMilli).atZone(ZoneId.systemDefault()).toLocalDate();
}, Collectors.toList()))
.entrySet()
.stream()
.map(entry -> {
LongSummaryStatistics statistics = entry.getValue().stream().collect(Collectors.summarizingLong(Map.Entry::getValue));
return new DailyStatistics(
entry.getKey(),
statistics.getMin(),
statistics.getMax(),
BigDecimal.valueOf(statistics.getAverage()).setScale(2, RoundingMode.HALF_EVEN).longValue()
);
})
.toList();
return new DailyServiceStatistics(serviceType.name(), dailyStatistics);
}
}

View File
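The per-interval aggregation in `ServiceUsage.of()` above can be sketched in isolation: each service contributes one "hit" to every time window it was running in (its start aligned to a window boundary), and hits are then counted per window. This is a minimal, hypothetical sketch — the `(start, end)` epoch-millis pairs stand in for the `RUNNING`/`TERMINATED` event pairs extracted from `ServiceInstance.events()`, and `IntervalBucketing` is not a class from the codebase.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class IntervalBucketing {
    // Count, per time window, how many services were running in that window.
    static Map<Long, Long> countPerWindow(List<long[]> services, long windowMillis) {
        Map<Long, Long> counts = new HashMap<>();
        for (long[] svc : services) {
            // Align the start to the window boundary, as the original code does
            // with (start / timeIntervalInMillis) * timeIntervalInMillis.
            long start = (svc[0] / windowMillis) * windowMillis;
            for (long w = start; w < svc[1]; w += windowMillis) {
                counts.merge(w, 1L, Long::sum); // one hit per window the service spans
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        long window = 15 * 60 * 1000L; // 15-minute windows, as in the example interval
        List<long[]> services = new ArrayList<>();
        services.add(new long[]{0, 2 * window});      // runs during windows 0 and 1
        services.add(new long[]{window, 3 * window}); // runs during windows 1 and 2
        Map<Long, Long> counts = countPerWindow(services, window);
        System.out.println(counts.get(0L));         // 1
        System.out.println(counts.get(window));     // 2 (both services overlap here)
        System.out.println(counts.get(2 * window)); // 1
    }
}
```

The daily min/max/avg in `DailyStatistics` then follow from grouping these window counts by day, as the `Collectors.groupingBy` / `summarizingLong` pipeline in the hunk does.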

@@ -62,4 +62,8 @@ public class Usage {
@Valid
private final ExecutionUsage executions;
@Valid
@Nullable
private ServiceUsage services;
}

View File

@@ -358,4 +358,8 @@ public class Flow extends AbstractFlow {
.deleted(true)
.build();
}
public FlowWithSource withSource(String source) {
return FlowWithSource.of(this, source);
}
}

View File

@@ -1,6 +1,5 @@
package io.kestra.core.models.flows;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonSetter;
import com.fasterxml.jackson.annotation.JsonSubTypes;
import com.fasterxml.jackson.annotation.JsonTypeInfo;
@@ -43,7 +42,6 @@ import lombok.experimental.SuperBuilder;
@JsonSubTypes.Type(value = MultiselectInput.class, name = "MULTISELECT"),
@JsonSubTypes.Type(value = YamlInput.class, name = "YAML")
})
@JsonInclude(JsonInclude.Include.NON_DEFAULT)
public abstract class Input<T> implements Data {
@Schema(
title = "The ID of the input."

View File

@@ -4,10 +4,7 @@ import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JavaType;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.*;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import com.fasterxml.jackson.databind.annotation.JsonSerialize;
import com.fasterxml.jackson.databind.deser.std.StdDeserializer;
@@ -25,6 +22,7 @@ import java.io.IOException;
import java.io.Serial;
import java.util.List;
import java.util.Map;
import java.util.Objects;
/**
* Define a plugin properties that will be rendered and converted to a target type at use time.
@@ -37,7 +35,12 @@ import java.util.Map;
@NoArgsConstructor
@AllArgsConstructor(access = AccessLevel.PACKAGE)
public class Property<T> {
private static final ObjectMapper MAPPER = JacksonMapper.ofJson();
// By default, durations are stored as numbers.
// We cannot change that globally, as in JDBC/Elastic 'execution.state.duration' must be a number to be able to aggregate them.
// So we only change it here.
private static final ObjectMapper MAPPER = JacksonMapper.ofJson()
.copy()
.configure(SerializationFeature.WRITE_DURATIONS_AS_TIMESTAMPS, false);
private String expression;
private T value;
@@ -185,6 +188,18 @@ public class Property<T> {
return value != null ? value.toString() : expression;
}
@Override
public boolean equals(Object o) {
if (o == null || getClass() != o.getClass()) return false;
Property<?> property = (Property<?>) o;
return Objects.equals(expression, property.expression);
}
@Override
public int hashCode() {
return Objects.hash(expression);
}
// used only by the serializer
String getExpression() {
return this.expression;

View File
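The `equals`/`hashCode` pair added to `Property<T>` above bases equality on the raw `expression` only, deliberately ignoring the rendered `value`. A self-contained sketch of that contract (this `PropertySketch` class is a hypothetical stand-in, not the real `Property<T>`):

```java
import java.util.Objects;

// Equality on the expression only: a rendered and an un-rendered property
// with the same expression compare equal, and hash to the same bucket.
class PropertySketch<T> {
    private final String expression;
    private final T value; // rendered value, deliberately excluded from equality

    PropertySketch(String expression, T value) {
        this.expression = expression;
        this.value = value;
    }

    @Override
    public boolean equals(Object o) {
        if (o == null || getClass() != o.getClass()) return false;
        return Objects.equals(expression, ((PropertySketch<?>) o).expression);
    }

    @Override
    public int hashCode() {
        return Objects.hash(expression);
    }

    public static void main(String[] args) {
        PropertySketch<Integer> unrendered = new PropertySketch<>("{{ inputs.count }}", null);
        PropertySketch<Integer> rendered = new PropertySketch<>("{{ inputs.count }}", 42);
        System.out.println(unrendered.equals(rendered));                 // true
        System.out.println(unrendered.hashCode() == rendered.hashCode()); // true
    }
}
```

This matches the hunk's choice of `Objects.equals(expression, property.expression)` and `Objects.hash(expression)`, keeping the equals/hashCode contract consistent regardless of rendering state.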

@@ -23,9 +23,6 @@ public class Trigger extends TriggerContext {
@Nullable
private String executionId;
@Nullable
private State.Type executionCurrentState;
@Nullable
private Instant updatedDate;
@@ -39,7 +36,6 @@ public class Trigger extends TriggerContext {
protected Trigger(TriggerBuilder<?, ?> b) {
super(b);
this.executionId = b.executionId;
this.executionCurrentState = b.executionCurrentState;
this.updatedDate = b.updatedDate;
this.evaluateRunningDate = b.evaluateRunningDate;
}
@@ -141,7 +137,6 @@ public class Trigger extends TriggerContext {
.date(trigger.getDate())
.nextExecutionDate(trigger.getNextExecutionDate())
.executionId(execution.getId())
.executionCurrentState(execution.getState().getCurrent())
.updatedDate(Instant.now())
.backfill(trigger.getBackfill())
.stopAfter(trigger.getStopAfter())

View File

@@ -11,6 +11,7 @@ import io.kestra.core.services.KVStoreService;
import io.kestra.core.storages.Storage;
import io.kestra.core.storages.StorageInterface;
import io.kestra.core.storages.kv.KVStore;
import io.kestra.core.utils.ListUtils;
import io.kestra.core.utils.VersionProvider;
import io.micronaut.context.ApplicationContext;
import io.micronaut.context.annotation.Value;
@@ -30,7 +31,6 @@ import java.nio.file.Path;
import java.security.GeneralSecurityException;
import java.util.*;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.function.Supplier;
import java.util.stream.Collectors;
import static io.kestra.core.utils.MapUtils.mergeWithNullableValues;
@@ -67,6 +67,7 @@ public class DefaultRunContext extends RunContext {
private String triggerExecutionId;
private Storage storage;
private Map<String, Object> pluginConfiguration;
private List<String> secretInputs;
private final AtomicBoolean isInitialized = new AtomicBoolean(false);
@@ -98,6 +99,15 @@ public class DefaultRunContext extends RunContext {
return variables;
}
/**
* {@inheritDoc}
*/
@Override
@JsonInclude
public List<String> getSecretInputs() {
return secretInputs;
}
@JsonIgnore
public ApplicationContext getApplicationContext() {
return applicationContext;
@@ -123,6 +133,17 @@ public class DefaultRunContext extends RunContext {
void setLogger(final RunContextLogger logger) {
this.logger = logger;
// this is used when a run context is re-hydrated so we need to add again the secrets from the inputs
if (!ListUtils.isEmpty(secretInputs) && getVariables().containsKey("inputs")) {
Map<String, Object> inputs = (Map<String, Object>) getVariables().get("inputs");
for (String secretInput : secretInputs) {
String secret = (String) inputs.get(secretInput);
if (secret != null) {
logger.usedSecret(secret);
}
}
}
}
void setPluginConfiguration(final Map<String, Object> pluginConfiguration) {
@@ -179,7 +200,7 @@ public class DefaultRunContext extends RunContext {
@Override
@SuppressWarnings("unchecked")
public String render(String inline, Map<String, Object> variables) throws IllegalVariableEvaluationException {
return variableRenderer.render(inline, mergeWithNullableValues(this.variables, variables));
return variableRenderer.render(inline, mergeWithNullableValues(this.variables, decryptVariables(variables)));
}
/**
@@ -196,7 +217,7 @@ public class DefaultRunContext extends RunContext {
@Override
@SuppressWarnings("unchecked")
public List<String> render(List<String> inline, Map<String, Object> variables) throws IllegalVariableEvaluationException {
return variableRenderer.render(inline, mergeWithNullableValues(this.variables, variables));
return variableRenderer.render(inline, mergeWithNullableValues(this.variables, decryptVariables(variables)));
}
/**
@@ -213,7 +234,7 @@ public class DefaultRunContext extends RunContext {
@Override
@SuppressWarnings("unchecked")
public Set<String> render(Set<String> inline, Map<String, Object> variables) throws IllegalVariableEvaluationException {
return variableRenderer.render(inline, mergeWithNullableValues(this.variables, variables));
return variableRenderer.render(inline, mergeWithNullableValues(this.variables, decryptVariables(variables)));
}
@Override
@@ -224,7 +245,7 @@ public class DefaultRunContext extends RunContext {
@Override
@SuppressWarnings("unchecked")
public Map<String, Object> render(Map<String, Object> inline, Map<String, Object> variables) throws IllegalVariableEvaluationException {
return variableRenderer.render(inline, mergeWithNullableValues(this.variables, variables));
return variableRenderer.render(inline, mergeWithNullableValues(this.variables, decryptVariables(variables)));
}
@Override
@@ -239,7 +260,7 @@ public class DefaultRunContext extends RunContext {
return null;
}
Map<String, Object> allVariables = mergeWithNullableValues(this.variables, variables);
Map<String, Object> allVariables = mergeWithNullableValues(this.variables, decryptVariables(variables));
return inline
.entrySet()
.stream()
@@ -350,6 +371,14 @@ public class DefaultRunContext extends RunContext {
return this;
}
private Map<String, Object> decryptVariables(Map<String, Object> variables) {
if (secretKey.isPresent()) {
final Secret secret = new Secret(secretKey, logger);
return secret.decrypt(variables);
}
return variables;
}
@SuppressWarnings("unchecked")
private Map<String, String> metricsTags() {
ImmutableMap.Builder<String, String> builder = ImmutableMap.builder();
@@ -488,6 +517,7 @@ public class DefaultRunContext extends RunContext {
private String triggerExecutionId;
private RunContextLogger logger;
private KVStoreService kvStoreService;
private List<String> secretInputs;
/**
* Builds the new {@link DefaultRunContext} object.
@@ -507,6 +537,7 @@ public class DefaultRunContext extends RunContext {
context.storage = storage;
context.triggerExecutionId = triggerExecutionId;
context.kvStoreService = kvStoreService;
context.secretInputs = secretInputs;
return context;
}
}

View File
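The re-hydration logic added to `DefaultRunContext.setLogger()` above looks each declared secret input up in the `inputs` variables and registers its value on the logger so it is masked in subsequent log output. A minimal sketch of that flow — `MaskingLogger` here is a hypothetical stand-in for `RunContextLogger.usedSecret()`, whose real masking implementation is not shown in the diff:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class SecretMasking {
    static final class MaskingLogger {
        private final List<String> secrets = new ArrayList<>();

        // Register a secret value to be masked; null values are ignored,
        // mirroring the null check in the patch.
        void usedSecret(String secret) {
            if (secret != null) secrets.add(secret);
        }

        // Replace every registered secret occurrence before emitting a message.
        String mask(String message) {
            for (String s : secrets) {
                message = message.replace(s, "******");
            }
            return message;
        }
    }

    // Same shape as the setLogger() hunk: for each input declared as SECRET,
    // look its resolved value up in the inputs map and register it.
    static MaskingLogger rehydrate(Map<String, Object> inputs, List<String> secretInputs) {
        MaskingLogger logger = new MaskingLogger();
        for (String secretInput : secretInputs) {
            logger.usedSecret((String) inputs.get(secretInput));
        }
        return logger;
    }

    public static void main(String[] args) {
        Map<String, Object> inputs = Map.of("token", "s3cr3t", "name", "demo");
        MaskingLogger logger = rehydrate(inputs, List.of("token"));
        System.out.println(logger.mask("calling API with token s3cr3t")); // secret masked
    }
}
```

This is why the patch re-runs the registration in `setLogger()`: a re-hydrated run context gets a fresh logger that would otherwise have no record of which input values are secrets.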

@@ -325,9 +325,7 @@ public class ExecutorService {
);
if (!nexts.isEmpty()) {
return nexts.stream()
.map(throwFunction(NextTaskRun::getTaskRun))
.toList();
return saveFlowableOutput(nexts, executor);
}
} catch (Exception e) {
log.warn("Unable to resolve the next tasks to run", e);
@@ -437,7 +435,6 @@ public class ExecutorService {
}
return executor.withTaskRun(
// TODO - saveFlowableOutput seems to be only useful for Template
this.saveFlowableOutput(nextTaskRuns, executor),
"handleNext"
);
@@ -748,7 +745,8 @@ public class ExecutorService {
.map(WorkerGroup::getKey)
.orElse(null);
// Check if the worker group exist
if (workerGroupExecutorInterface.isWorkerGroupExistForKey(workerGroup)) {
String tenantId = executor.getFlow().getTenantId();
if (workerGroupExecutorInterface.isWorkerGroupExistForKey(workerGroup, tenantId)) {
// Check whether at-least one worker is available
if (workerGroupExecutorInterface.isWorkerGroupAvailableForKey(workerGroup)) {
return workerTask;

View File

@@ -5,6 +5,7 @@ import com.google.common.annotations.VisibleForTesting;
import com.google.common.collect.ImmutableMap;
import io.kestra.core.encryption.EncryptionService;
import io.kestra.core.exceptions.IllegalVariableEvaluationException;
import io.kestra.core.exceptions.KestraRuntimeException;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.Data;
import io.kestra.core.models.flows.DependsOn;
@@ -18,6 +19,7 @@ import io.kestra.core.models.flows.input.ItemTypeInterface;
import io.kestra.core.models.tasks.common.EncryptedString;
import io.kestra.core.models.validations.ManualConstraintViolation;
import io.kestra.core.serializers.JacksonMapper;
import io.kestra.core.storages.StorageContext;
import io.kestra.core.storages.StorageInterface;
import io.kestra.core.utils.ListUtils;
import io.kestra.core.utils.MapUtils;
@@ -33,6 +35,7 @@ import org.reactivestreams.Publisher;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import reactor.core.scheduler.Schedulers;
import java.io.File;
@@ -90,31 +93,14 @@ public class FlowInputOutput {
* @param inputs The Flow's inputs.
* @param execution The Execution.
* @param data The Execution's inputs data.
* @param deleteInputsFromStorage Specifies whether inputs stored on internal storage should be deleted before returning.
* @return The list of {@link InputAndValue}.
*/
public List<InputAndValue> validateExecutionInputs(final List<Input<?>> inputs,
public Mono<List<InputAndValue>> validateExecutionInputs(final List<Input<?>> inputs,
final Execution execution,
final Publisher<CompletedPart> data,
final boolean deleteInputsFromStorage) throws IOException {
if (ListUtils.isEmpty(inputs)) return Collections.emptyList();
final Publisher<CompletedPart> data) {
if (ListUtils.isEmpty(inputs)) return Mono.just(Collections.emptyList());
Map<String, ?> dataByInputId = readData(inputs, execution, data);
List<InputAndValue> values = this.resolveInputs(inputs, execution, dataByInputId);
if (deleteInputsFromStorage) {
values.stream()
.filter(it -> it.input() instanceof FileInput && Objects.nonNull(it.value()))
.forEach(it -> {
try {
URI uri = URI.create(it.value().toString());
storageInterface.delete(execution.getTenantId(), uri);
} catch (IllegalArgumentException | IOException e) {
log.debug("Failed to remove execution input after validation [{}]", it.value(), e);
}
});
}
return values;
return readData(inputs, execution, data, false).map(inputData -> resolveInputs(inputs, execution, inputData));
}
/**
@@ -125,9 +111,9 @@ public class FlowInputOutput {
* @param data The Execution's inputs data.
* @return The Map of typed inputs.
*/
public Map<String, Object> readExecutionInputs(final Flow flow,
public Mono<Map<String, Object>> readExecutionInputs(final Flow flow,
final Execution execution,
final Publisher<CompletedPart> data) throws IOException {
final Publisher<CompletedPart> data) {
return this.readExecutionInputs(flow.getInputs(), execution, data);
}
@@ -139,39 +125,58 @@ public class FlowInputOutput {
* @param data The Execution's inputs data.
* @return The Map of typed inputs.
*/
public Map<String, Object> readExecutionInputs(final List<Input<?>> inputs,
final Execution execution,
final Publisher<CompletedPart> data) throws IOException {
return this.readExecutionInputs(inputs, execution, readData(inputs, execution, data));
public Mono<Map<String, Object>> readExecutionInputs(final List<Input<?>> inputs,
final Execution execution,
final Publisher<CompletedPart> data) {
return readData(inputs, execution, data, true).map(inputData -> this.readExecutionInputs(inputs, execution, inputData));
}
private Map<String, ?> readData(List<Input<?>> inputs, Execution execution, Publisher<CompletedPart> data) throws IOException {
private Mono<Map<String, Object>> readData(List<Input<?>> inputs, Execution execution, Publisher<CompletedPart> data, boolean uploadFiles) {
return Flux.from(data)
.subscribeOn(Schedulers.boundedElastic())
.map(throwFunction(input -> {
.publishOn(Schedulers.boundedElastic())
.<AbstractMap.SimpleEntry<String, String>>handle((input, sink) -> {
if (input instanceof CompletedFileUpload fileUpload) {
final String fileExtension = FileInput.findFileInputExtension(inputs, fileUpload.getFilename());
File tempFile = File.createTempFile(fileUpload.getFilename() + "_", fileExtension);
try (var inputStream = fileUpload.getInputStream();
var outputStream = new FileOutputStream(tempFile)) {
long transferredBytes = inputStream.transferTo(outputStream);
if (transferredBytes == 0) {
throw new RuntimeException("Can't upload file: " + fileUpload.getFilename());
}
if (!uploadFiles) {
final String fileExtension = FileInput.findFileInputExtension(inputs, fileUpload.getFilename());
URI from = URI.create("kestra://" + StorageContext
.forInput(execution, fileUpload.getFilename(), fileUpload.getFilename() + fileExtension)
.getContextStorageURI()
);
fileUpload.discard();
sink.next(new AbstractMap.SimpleEntry<>(fileUpload.getFilename(), from.toString()));
} else {
try {
final String fileExtension = FileInput.findFileInputExtension(inputs, fileUpload.getFilename());
URI from = storageInterface.from(execution, fileUpload.getFilename(), tempFile);
return new AbstractMap.SimpleEntry<>(fileUpload.getFilename(), from.toString());
} finally {
if (!tempFile.delete()) {
tempFile.deleteOnExit();
File tempFile = File.createTempFile(fileUpload.getFilename() + "_", fileExtension);
try (var inputStream = fileUpload.getInputStream();
var outputStream = new FileOutputStream(tempFile)) {
long transferredBytes = inputStream.transferTo(outputStream);
if (transferredBytes == 0) {
sink.error(new KestraRuntimeException("Can't upload file: " + fileUpload.getFilename()));
return;
}
URI from = storageInterface.from(execution, fileUpload.getFilename(), tempFile);
sink.next(new AbstractMap.SimpleEntry<>(fileUpload.getFilename(), from.toString()));
} finally {
if (!tempFile.delete()) {
tempFile.deleteOnExit();
}
}
} catch (IOException e) {
fileUpload.discard();
sink.error(e);
}
}
} else {
return new AbstractMap.SimpleEntry<>(input.getName(), new String(input.getBytes()));
try {
sink.next(new AbstractMap.SimpleEntry<>(input.getName(), new String(input.getBytes())));
} catch (IOException e) {
sink.error(e);
}
}
}))
.collectMap(AbstractMap.SimpleEntry::getKey, AbstractMap.SimpleEntry::getValue)
.block();
})
.collectMap(AbstractMap.SimpleEntry::getKey, AbstractMap.SimpleEntry::getValue);
}
/**
@@ -404,7 +409,8 @@ public class FlowInputOutput {
yield EncryptionService.encrypt(secretKey.get(), (String) current);
}
case INT -> current instanceof Integer ? current : Integer.valueOf((String) current);
case FLOAT -> current instanceof Float ? current : Float.valueOf((String) current);
// Assuming that after the render we must have a double/int, so we can safely use its toString representation
case FLOAT -> current instanceof Float ? current : Float.valueOf(current.toString());
case BOOLEAN -> current instanceof Boolean ? current : Boolean.valueOf((String) current);
case DATETIME -> Instant.parse(((String) current));
case DATE -> LocalDate.parse(((String) current));

View File
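One small but easy-to-miss change in the hunk above is the `FLOAT` coercion switching from `Float.valueOf((String) current)` to `Float.valueOf(current.toString())`. As the added comment notes, after rendering the value may already be an `Integer` or a `Double`, in which case the old `String` cast would throw a `ClassCastException`. A minimal sketch of the difference (`FloatCoercion` is a hypothetical stand-in for the switch branch, not a class from the codebase):

```java
public class FloatCoercion {
    // Mirror of the patched branch: pass Floats through, otherwise rely on
    // toString(), which is safe for String, Integer, and Double alike.
    static Float coerce(Object current) {
        return current instanceof Float f ? f : Float.valueOf(current.toString());
    }

    public static void main(String[] args) {
        System.out.println(coerce("3.14")); // 3.14 (String input, as before)
        System.out.println(coerce(3));      // 3.0  (Integer: the old String cast would have thrown)
        System.out.println(coerce(2.5d));   // 2.5  (Double: same)
    }
}
```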

@@ -47,6 +47,12 @@ public abstract class RunContext {
@JsonInclude
public abstract Map<String, Object> getVariables();
/**
* Returns the list of inputs of type SECRET.
*/
@JsonInclude
public abstract List<String> getSecretInputs();
public abstract String render(String inline) throws IllegalVariableEvaluationException;
public abstract Object renderTyped(String inline) throws IllegalVariableEvaluationException;

View File

@@ -5,6 +5,7 @@ import io.kestra.core.metrics.MetricRegistry;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.executions.TaskRun;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.Type;
import io.kestra.core.models.tasks.Task;
import io.kestra.core.models.triggers.AbstractTrigger;
import io.kestra.core.plugins.PluginConfigurations;
@@ -15,12 +16,12 @@ import io.kestra.core.storages.StorageContext;
import io.kestra.core.storages.StorageInterface;
import io.micronaut.context.ApplicationContext;
import io.micronaut.context.annotation.Value;
import jakarta.annotation.Nullable;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;
import jakarta.validation.constraints.NotNull;
import java.net.URI;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.function.Function;
@@ -83,8 +84,10 @@ public class RunContextFactory {
.withFlow(flow)
.withExecution(execution)
.withDecryptVariables(true)
.withSecretInputs(secretInputsFromFlow(flow))
)
.build(runContextLogger))
.withSecretInputs(secretInputsFromFlow(flow))
.build();
}
@@ -107,8 +110,10 @@ public class RunContextFactory {
.withExecution(execution)
.withTaskRun(taskRun)
.withDecryptVariables(decryptVariables)
.withSecretInputs(secretInputsFromFlow(flow))
.build(runContextLogger))
.withKvStoreService(kvStoreService)
.withSecretInputs(secretInputsFromFlow(flow))
.build();
}
@@ -122,8 +127,10 @@ public class RunContextFactory {
.withVariables(newRunVariablesBuilder()
.withFlow(flow)
.withTrigger(trigger)
.withSecretInputs(secretInputsFromFlow(flow))
.build(runContextLogger)
)
.withSecretInputs(secretInputsFromFlow(flow))
.build();
}
@@ -135,6 +142,7 @@ public class RunContextFactory {
.withLogger(runContextLogger)
.withStorage(new InternalStorage(runContextLogger.logger(), StorageContext.forFlow(flow), storageInterface, flowService))
.withVariables(variables)
.withSecretInputs(secretInputsFromFlow(flow))
.build();
}
@@ -177,6 +185,16 @@ public class RunContextFactory {
return of(Map.of());
}
private List<String> secretInputsFromFlow(Flow flow) {
if (flow == null || flow.getInputs() == null) {
return Collections.emptyList();
}
return flow.getInputs().stream()
.filter(input -> input.getType() == Type.SECRET)
.map(input -> input.getId()).toList();
}
private DefaultRunContext.Builder newBuilder() {
return new DefaultRunContext.Builder()
// inject mandatory services and config

View File

@@ -9,6 +9,7 @@ import io.kestra.core.models.flows.State;
import io.kestra.core.models.flows.input.SecretInput;
import io.kestra.core.models.tasks.Task;
import io.kestra.core.models.triggers.AbstractTrigger;
import io.kestra.core.utils.ListUtils;
import lombok.AllArgsConstructor;
import lombok.With;
@@ -125,6 +126,8 @@ public final class RunVariables {
Builder withGlobals(Map<?, ?> globals);
Builder withSecretInputs(List<String> secretInputs);
/**
* Builds the immutable map of run variables.
*
@@ -152,6 +155,7 @@ public final class RunVariables {
protected Map<String, ?> envs;
protected Map<?, ?> globals;
private final Optional<String> secretKey;
private List<String> secretInputs;
public DefaultBuilder() {
this(Optional.empty());
@@ -252,6 +256,16 @@ public final class RunVariables {
if (!inputs.isEmpty()) {
builder.put("inputs", inputs);
// if a secret input is used, add it to the list of secrets to mask on the logger
if (logger != null && !ListUtils.isEmpty(secretInputs)) {
for (String secretInput : secretInputs) {
String secret = (String) inputs.get(secretInput);
if (secret != null) {
logger.usedSecret(secret);
}
}
}
}
if (execution.getTrigger() != null && execution.getTrigger().getVariables() != null) {

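The hunk above registers each secret input's value with the logger via `logger.usedSecret(...)` so it can be masked later. A minimal standalone sketch of that value-based masking (class and method names here are illustrative, not Kestra's actual API):

```java
import java.util.HashSet;
import java.util.Set;

// Minimal sketch of value-based secret masking. Names are illustrative.
public class SecretMasker {
    private final Set<String> secrets = new HashSet<>();

    // Register a secret value so it is masked in any later log line.
    public void usedSecret(String secret) {
        if (secret != null && !secret.isEmpty()) {
            secrets.add(secret);
        }
    }

    // Replace every registered secret value before the message is emitted.
    public String mask(String message) {
        String masked = message;
        for (String secret : secrets) {
            masked = masked.replace(secret, "******");
        }
        return masked;
    }
}
```

The point of the diff is to feed the resolved values of `SECRET`-typed inputs into such a registry before any task log line is written.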
View File

@@ -530,10 +530,6 @@ public class Worker implements Service, Runnable, AutoCloseable {
.increment();
}
private static ZonedDateTime now() {
return ZonedDateTime.now().truncatedTo(ChronoUnit.SECONDS);
}
private WorkerTask cleanUpTransient(WorkerTask workerTask) {
try {
return MAPPER.readValue(MAPPER.writeValueAsString(workerTask), WorkerTask.class);
@@ -553,7 +549,7 @@ public class Worker implements Service, Runnable, AutoCloseable {
metricRegistry
.timer(MetricRegistry.METRIC_WORKER_QUEUED_DURATION, metricRegistry.tags(workerTask, workerGroup))
.record(Duration.between(
workerTask.getTaskRun().getState().getStartDate(), now()
workerTask.getTaskRun().getState().getStartDate(), Instant.now()
));
}
@@ -704,8 +700,7 @@ public class Worker implements Service, Runnable, AutoCloseable {
}
private WorkerTask runAttempt(WorkerTask workerTask) throws QueueException {
DefaultRunContext runContext = (DefaultRunContext) workerTask.getRunContext();
runContextInitializer.forWorker(runContext, workerTask);
DefaultRunContext runContext = runContextInitializer.forWorker((DefaultRunContext) workerTask.getRunContext(), workerTask);
Logger logger = runContext.logger();

View File

@@ -13,12 +13,13 @@ import java.util.Set;
public interface WorkerGroupExecutorInterface {
/**
* Checks whether a Worker Group exists for the given key.
* Checks whether a Worker Group exists for the given key and tenant.
*
* @param key The Worker Group's key - can be {@code null}.
* @param tenant The tenant's ID - can be {@code null}.
* @return {@code true} if the worker group exists or the key is {@code null}; {@code false} otherwise.
*/
boolean isWorkerGroupExistForKey(String key);
boolean isWorkerGroupExistForKey(String key, String tenant);
/**
* Checks whether the Worker Group is available.
@@ -46,7 +47,7 @@ public interface WorkerGroupExecutorInterface {
class DefaultWorkerGroupExecutorInterface implements WorkerGroupExecutorInterface {
@Override
public boolean isWorkerGroupExistForKey(String key) {
public boolean isWorkerGroupExistForKey(String key, String tenant) {
return true;
}

View File

@@ -27,7 +27,10 @@ public class CurrentEachOutputFunction implements Function {
if (parents != null && !parents.isEmpty()) {
Collections.reverse(parents);
for (Map<?, ?> parent : parents) {
outputs = (Map<?, ?>) outputs.get(((Map<?, ?>) parent.get("taskrun")).get("value"));
Map<?, ?> taskrun = (Map<?, ?>) parent.get("taskrun");
if (taskrun != null) {
outputs = (Map<?, ?>) outputs.get(taskrun.get("value"));
}
}
}
Map<?, ?> taskrun = (Map<?, ?>) context.getVariable("taskrun");

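The fix above guards the nested-outputs walk against parents without a `taskrun`. A self-contained sketch of the same traversal (assuming `parents` is already ordered as the reversed list in the diff; names illustrative):

```java
import java.util.List;
import java.util.Map;

// Null-safe nested outputs walk, mirroring the fixed loop in
// CurrentEachOutputFunction: each parent's taskrun "value" selects the next
// level of the outputs map; parents without a taskrun are skipped.
public class NestedOutputs {
    public static Map<?, ?> resolve(Map<?, ?> outputs, List<? extends Map<?, ?>> parents) {
        for (Map<?, ?> parent : parents) {
            Map<?, ?> taskrun = (Map<?, ?>) parent.get("taskrun");
            if (taskrun != null) {
                outputs = (Map<?, ?>) outputs.get(taskrun.get("value"));
            }
        }
        return outputs;
    }
}
```

Before the fix, a parent entry with a `null` `taskrun` would throw a `NullPointerException` on the inner `get("value")` call.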
View File

@@ -73,12 +73,14 @@ public abstract class AbstractScheduler implements Scheduler, Service {
private final PluginDefaultService pluginDefaultService;
private final WorkerGroupService workerGroupService;
private final LogService logService;
protected SchedulerExecutionStateInterface executionState;
// must be volatile as it's updated by the flow listener thread and read by the scheduleExecutor thread
private volatile Boolean isReady = false;
private final ScheduledExecutorService scheduleExecutor = Executors.newSingleThreadScheduledExecutor();
@Getter
protected SchedulerTriggerStateInterface triggerState;
// schedulable and schedulableNextDate must be volatile and their access synchronized as they are updated and read by different threads.
@@ -357,7 +359,7 @@ public abstract class AbstractScheduler implements Scheduler, Service {
logError(conditionContext, flow, abstractTrigger, e);
return null;
}
this.triggerState.save(triggerContext, scheduleContext);
this.triggerState.save(triggerContext, scheduleContext, "/kestra/services/scheduler/compute-schedulable/save/lastTrigger-nextDate-null");
} else {
triggerContext = lastTrigger;
}
@@ -446,11 +448,6 @@ public abstract class AbstractScheduler implements Scheduler, Service {
)
.build()
)
.peek(f -> {
if (f.getTriggerContext().getEvaluateRunningDate() != null || !isExecutionNotRunning(f)) {
this.triggerState.unlock(f.getTriggerContext());
}
})
.filter(f -> f.getTriggerContext().getEvaluateRunningDate() == null)
.filter(this::isExecutionNotRunning)
.map(FlowWithWorkerTriggerNextDate::of)
@@ -486,7 +483,7 @@ public abstract class AbstractScheduler implements Scheduler, Service {
Trigger triggerRunning = Trigger.of(f.getTriggerContext(), now);
var flowWithTrigger = f.toBuilder().triggerContext(triggerRunning).build();
try {
this.triggerState.save(triggerRunning, scheduleContext);
this.triggerState.save(triggerRunning, scheduleContext, "/kestra/services/scheduler/handle/save/on-eval-true/polling");
this.sendWorkerTriggerToWorker(flowWithTrigger);
} catch (InternalException e) {
logService.logTrigger(
@@ -511,7 +508,7 @@ public abstract class AbstractScheduler implements Scheduler, Service {
schedule.nextEvaluationDate(f.getConditionContext(), Optional.of(f.getTriggerContext()))
);
trigger = trigger.checkBackfill();
this.triggerState.save(trigger, scheduleContext);
this.triggerState.save(trigger, scheduleContext, "/kestra/services/scheduler/handle/save/on-eval-true/schedule");
}
} else {
logService.logTrigger(
@@ -529,7 +526,7 @@ public abstract class AbstractScheduler implements Scheduler, Service {
logError(f, e);
}
var trigger = f.getTriggerContext().toBuilder().nextExecutionDate(nextExecutionDate).build().checkBackfill();
this.triggerState.save(trigger, scheduleContext);
this.triggerState.save(trigger, scheduleContext, "/kestra/services/scheduler/handle/save/on-eval-false");
}
} catch (Exception ie) {
// validate schedule condition can fail to render variables
@@ -546,7 +543,7 @@ public abstract class AbstractScheduler implements Scheduler, Service {
.build();
ZonedDateTime nextExecutionDate = this.nextEvaluationDate(f.getAbstractTrigger());
var trigger = f.getTriggerContext().resetExecution(State.Type.FAILED, nextExecutionDate);
this.saveLastTriggerAndEmitExecution(execution, trigger, triggerToSave -> this.triggerState.save(triggerToSave, scheduleContext));
this.saveLastTriggerAndEmitExecution(execution, trigger, triggerToSave -> this.triggerState.save(triggerToSave, scheduleContext, "/kestra/services/scheduler/handle/save/on-error"));
}
});
});
@@ -586,7 +583,7 @@ public abstract class AbstractScheduler implements Scheduler, Service {
// Schedule triggers are being executed directly from the handle method within the context where triggers are locked.
// So we must save them by passing the scheduleContext.
this.saveLastTriggerAndEmitExecution(result.getExecution(), trigger, triggerToSave -> this.triggerState.save(triggerToSave, scheduleContext));
this.saveLastTriggerAndEmitExecution(result.getExecution(), trigger, triggerToSave -> this.triggerState.save(triggerToSave, scheduleContext, "/kestra/services/scheduler/handleEvaluateSchedulingTriggerResult/save"));
}
protected void saveLastTriggerAndEmitExecution(Execution execution, Trigger trigger, Consumer<Trigger> saveAction) {
@@ -615,8 +612,10 @@ public abstract class AbstractScheduler implements Scheduler, Service {
return true;
}
// The execution is not yet started, we skip
if (lastTrigger.getExecutionCurrentState() == null) {
Optional<Execution> execution = executionState.findById(lastTrigger.getTenantId(), lastTrigger.getExecutionId());
// executionState hasn't received the execution yet, so we skip
if (execution.isEmpty()) {
if (lastTrigger.getUpdatedDate() != null) {
metricRegistry
.timer(MetricRegistry.SCHEDULER_EXECUTION_MISSING_DURATION, metricRegistry.tags(lastTrigger))
@@ -650,7 +649,7 @@ public abstract class AbstractScheduler implements Scheduler, Service {
Level.DEBUG,
"Execution '{}' is still '{}', updated at '{}'",
lastTrigger.getExecutionId(),
lastTrigger.getExecutionCurrentState(),
execution.get().getState().getCurrent(),
lastTrigger.getUpdatedDate()
);
}

View File

@@ -1,4 +1,14 @@
package io.kestra.core.schedulers;
import java.util.function.Consumer;
/**
* This context is used by the Scheduler to allow evaluating and updating triggers in a transaction from the main evaluation loop.
* See AbstractScheduler.handle().
*/
public interface ScheduleContextInterface {
/**
* Do trigger retrieval and updating in a single transaction.
*/
void doInTransaction(Consumer<ScheduleContextInterface> consumer);
}

View File

@@ -0,0 +1,19 @@
package io.kestra.core.schedulers;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.repositories.ExecutionRepositoryInterface;
import java.util.Optional;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;
@Singleton
public class SchedulerExecutionState implements SchedulerExecutionStateInterface {
@Inject
private ExecutionRepositoryInterface executionRepository;
@Override
public Optional<Execution> findById(String tenantId, String id) {
return executionRepository.findById(tenantId, id);
}
}

View File

@@ -0,0 +1,9 @@
package io.kestra.core.schedulers;
import io.kestra.core.models.executions.Execution;
import java.util.Optional;
public interface SchedulerExecutionStateInterface {
Optional<Execution> findById(String tenantId, String id);
}

View File

@@ -20,19 +20,22 @@ public interface SchedulerTriggerStateInterface {
Trigger create(Trigger trigger) throws ConstraintViolationException;
Trigger save(Trigger trigger, ScheduleContextInterface scheduleContext, String headerContent) throws ConstraintViolationException;
Trigger create(Trigger trigger, String headerContent) throws ConstraintViolationException;
Trigger update(Trigger trigger);
Trigger update(Flow flow, AbstractTrigger abstractTrigger, ConditionContext conditionContext) throws Exception;
/**
* Used by the JDBC implementation: find triggers in all tenants.
*/
List<Trigger> findByNextExecutionDateReadyForAllTenants(ZonedDateTime now, ScheduleContextInterface scheduleContext);
/**
* Required for Kafka
* Used by the Kafka implementation: find triggers for the flows assigned to this scheduler (via Kafka partition assignment).
*/
List<Trigger> findByNextExecutionDateReadyForGivenFlows(List<Flow> flows, ZonedDateTime now, ScheduleContextInterface scheduleContext);
/**
* Required for Kafka
*/
void unlock(Trigger trigger);
}

View File

@@ -0,0 +1,77 @@
package io.kestra.core.serializers;
import com.fasterxml.jackson.core.*;
import com.fasterxml.jackson.core.io.NumberInput;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.datatype.jsr310.DecimalUtils;
import java.io.IOException;
import java.math.BigDecimal;
import java.time.DateTimeException;
import java.time.Duration;
public class DurationDeserializer extends com.fasterxml.jackson.datatype.jsr310.deser.DurationDeserializer {
// durations can arrive as a string containing a plain number, which the default deserializer does not handle
// so we specialize the Duration deserialization from string to support that
@Override
protected Duration _fromString(JsonParser parser, DeserializationContext ctxt, String value0) throws IOException {
String value = value0.trim();
if (value.isEmpty()) {
// 22-Oct-2020, tatu: not sure if we should pass original (to distinguish
// b/w empty and blank); for now don't which will allow blanks to be
// handled like "regular" empty (same as pre-2.12)
return _fromEmptyString(parser, ctxt, value);
}
// 30-Sep-2020: Should allow use of "Timestamp as String" for
// some textual formats
if (ctxt.isEnabled(StreamReadCapability.UNTYPED_SCALARS)
&& _isValidTimestampString(value)) {
return _fromTimestamp(ctxt, NumberInput.parseLong(value));
}
// These are the only lines we changed from the default impl: we check for a float as string and parse it
if (_isFloat(value)) {
double d = Double.parseDouble(value);
BigDecimal bigDecimal = BigDecimal.valueOf(d);
return DecimalUtils.extractSecondsAndNanos(bigDecimal, Duration::ofSeconds);
}
try {
return Duration.parse(value);
} catch (DateTimeException e) {
return _handleDateTimeException(ctxt, e, value);
}
}
// this method is inspired by _isIntNumber but allows the decimal separator '.'
private boolean _isFloat(String text) {
final int len = text.length();
if (len > 0) {
char c = text.charAt(0);
// skip leading sign (plus not allowed for strict JSON numbers but...)
int i;
if (c == '-' || c == '+') {
if (len == 1) {
return false;
}
i = 1;
} else {
i = 0;
}
// We will allow leading
for (; i < len; ++i) {
int ch = text.charAt(i);
if (ch == '.') {
continue;
}
if (ch > '9' || ch < '0') {
return false;
}
}
return true;
}
return false;
}
}
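The float branch of the custom deserializer splits a fractional-seconds string into whole seconds plus nanoseconds. A stdlib-only sketch of that conversion (a hypothetical helper, not the Jackson code path itself):

```java
import java.math.BigDecimal;
import java.time.Duration;

// Sketch of the float-as-string handling added to DurationDeserializer:
// "10.5" becomes 10 whole seconds plus 500,000,000 nanoseconds instead of
// failing Duration.parse (which only accepts ISO-8601 such as "PT10.5S").
public class FloatDuration {
    public static Duration fromFloatString(String value) {
        BigDecimal decimal = BigDecimal.valueOf(Double.parseDouble(value));
        long seconds = decimal.longValue();
        int nanos = decimal.subtract(BigDecimal.valueOf(seconds))
            .movePointRight(9)
            .intValue();
        return Duration.ofSeconds(seconds, nanos);
    }
}
```

In the actual diff this split is delegated to Jackson's `DecimalUtils.extractSecondsAndNanos`, which handles the same arithmetic.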

View File

@@ -10,6 +10,7 @@ import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.fasterxml.jackson.databind.node.ObjectNode;
import com.fasterxml.jackson.dataformat.ion.IonObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;
@@ -29,6 +30,7 @@ import org.apache.commons.lang3.tuple.Pair;
import org.yaml.snakeyaml.LoaderOptions;
import java.io.IOException;
import java.time.Duration;
import java.time.ZoneId;
import java.util.List;
import java.util.Map;
@@ -119,6 +121,9 @@ public final class JacksonMapper {
}
private static ObjectMapper configure(ObjectMapper mapper) {
SimpleModule durationDeserialization = new SimpleModule();
durationDeserialization.addDeserializer(Duration.class, new DurationDeserializer());
return mapper
.configure(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS, false)
.setSerializationInclusion(JsonInclude.Include.NON_NULL)
@@ -128,6 +133,7 @@ public final class JacksonMapper {
.registerModules(new GuavaModule())
.registerModule(new PluginModule())
.registerModule(new RunContextModule())
.registerModule(durationDeserialization)
.setTimeZone(TimeZone.getDefault());
}

View File

@@ -5,6 +5,7 @@ import io.kestra.core.models.collectors.*;
import io.kestra.core.plugins.PluginRegistry;
import io.kestra.core.repositories.ExecutionRepositoryInterface;
import io.kestra.core.repositories.FlowRepositoryInterface;
import io.kestra.core.repositories.ServiceInstanceRepositoryInterface;
import io.kestra.core.serializers.JacksonMapper;
import io.kestra.core.utils.IdUtils;
import io.kestra.core.utils.VersionProvider;
@@ -24,6 +25,7 @@ import lombok.extern.slf4j.Slf4j;
import java.lang.management.ManagementFactory;
import java.net.URI;
import java.time.Duration;
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;
@@ -66,6 +68,9 @@ public class CollectorService {
@Value("${kestra.anonymous-usage-report.uri}")
protected URI url;
@Inject
private ServiceInstanceRepositoryInterface serviceRepository;
private transient Usage defaultUsage;
protected synchronized Usage defaultUsage() {
@@ -109,7 +114,8 @@ public class CollectorService {
if (details) {
builder = builder
.flows(FlowUsage.of(flowRepository))
.executions(ExecutionUsage.of(executionRepository, from, to));
.executions(ExecutionUsage.of(executionRepository, from, to))
.services(ServiceUsage.of(from.toInstant(), to.toInstant(), serviceRepository, Duration.ofMinutes(5)));
}
return builder.build();
}

View File

@@ -3,7 +3,11 @@ package io.kestra.core.services;
import io.kestra.core.events.CrudEvent;
import io.kestra.core.events.CrudEventType;
import io.kestra.core.exceptions.InternalException;
import io.kestra.core.models.executions.*;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.executions.ExecutionKilled;
import io.kestra.core.models.executions.ExecutionKilledExecution;
import io.kestra.core.models.executions.TaskRun;
import io.kestra.core.models.executions.TaskRunAttempt;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.State;
import io.kestra.core.models.flows.input.InputAndValue;
@@ -26,7 +30,6 @@ import io.kestra.plugin.core.flow.Pause;
import io.kestra.plugin.core.flow.WorkingDirectory;
import io.micronaut.context.event.ApplicationEventPublisher;
import io.micronaut.core.annotation.Nullable;
import io.micronaut.http.HttpResponse;
import io.micronaut.http.multipart.CompletedPart;
import jakarta.inject.Inject;
import jakarta.inject.Named;
@@ -38,12 +41,21 @@ import lombok.experimental.SuperBuilder;
import lombok.extern.slf4j.Slf4j;
import org.reactivestreams.Publisher;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import java.io.IOException;
import java.net.URI;
import java.time.Instant;
import java.time.ZonedDateTime;
import java.util.*;
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.NoSuchElementException;
import java.util.Objects;
import java.util.Optional;
import java.util.Set;
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Predicate;
import java.util.stream.Collectors;
@@ -447,19 +459,16 @@ public class ExecutionService {
* @param flow the flow of the execution
* @param inputs the onResume inputs
* @return the execution in the new state.
* @throws Exception if the state of the execution cannot be updated
*/
public List<InputAndValue> validateForResume(final Execution execution, Flow flow, @Nullable Publisher<CompletedPart> inputs) throws Exception {
Task task = getFirstPausedTaskOrThrow(execution, flow);
if (task instanceof Pause pauseTask) {
return flowInputOutput.validateExecutionInputs(
pauseTask.getOnResume(),
execution,
inputs,
true
);
}
return Collections.emptyList();
public Mono<List<InputAndValue>> validateForResume(final Execution execution, Flow flow, @Nullable Publisher<CompletedPart> inputs) {
return getFirstPausedTaskOrThrow(execution, flow)
.flatMap(task -> {
if (task instanceof Pause pauseTask) {
return flowInputOutput.validateExecutionInputs(pauseTask.getOnResume(), execution, inputs);
} else {
return Mono.just(Collections.emptyList());
}
});
}
/**
@@ -471,27 +480,36 @@ public class ExecutionService {
* @param flow the flow of the execution
* @param inputs the onResume inputs
* @return the execution in the new state.
* @throws Exception if the state of the execution cannot be updated
*/
public Execution resume(final Execution execution, Flow flow, State.Type newState, @Nullable Publisher<CompletedPart> inputs) throws Exception {
var task = getFirstPausedTaskOrThrow(execution, flow);
Map<String, Object> pauseOutputs = Collections.emptyMap();
if (task instanceof Pause pauseTask) {
pauseOutputs = flowInputOutput.readExecutionInputs(
pauseTask.getOnResume(),
execution,
inputs
);
}
return resume(execution, flow, newState, pauseOutputs);
public Mono<Execution> resume(final Execution execution, Flow flow, State.Type newState, @Nullable Publisher<CompletedPart> inputs) {
return getFirstPausedTaskOrThrow(execution, flow)
.flatMap(task -> {
if (task instanceof Pause pauseTask) {
return flowInputOutput.readExecutionInputs(pauseTask.getOnResume(), execution, inputs);
} else {
return Mono.just(Collections.<String, Object>emptyMap());
}
})
.handle((resumeInputs, sink) -> {
try {
sink.next(resume(execution, flow, newState, resumeInputs));
} catch (Exception e) {
sink.error(e);
}
});
}
private static Task getFirstPausedTaskOrThrow(Execution execution, Flow flow) throws InternalException {
var runningTaskRun = execution
.findFirstByState(State.Type.PAUSED)
.orElseThrow(() -> new IllegalArgumentException("No paused task found on execution " + execution.getId()));
return flow.findTaskByTaskId(runningTaskRun.getTaskId());
private static Mono<Task> getFirstPausedTaskOrThrow(Execution execution, Flow flow){
return Mono.create(sink -> {
try {
var runningTaskRun = execution
.findFirstByState(State.Type.PAUSED)
.orElseThrow(() -> new IllegalArgumentException("No paused task found on execution " + execution.getId()));
sink.success(flow.findTaskByTaskId(runningTaskRun.getTaskId()));
} catch (InternalException e) {
sink.error(e);
}
});
}
/**

View File

@@ -19,21 +19,12 @@ import jakarta.inject.Singleton;
import lombok.SneakyThrows;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang3.ClassUtils;
import org.apache.commons.lang3.builder.EqualsBuilder;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.lang.reflect.Modifier;
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.NoSuchElementException;
import java.util.Objects;
import java.util.Optional;
import java.util.*;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import java.util.stream.IntStream;
@@ -308,7 +299,7 @@ public class FlowService {
.stream()
.filter(oldTrigger -> ListUtils.emptyOnNull(previous.getTriggers())
.stream()
.anyMatch(trigger -> trigger.getId().equals(oldTrigger.getId()) && !trigger.equals(oldTrigger))
.anyMatch(trigger -> trigger.getId().equals(oldTrigger.getId()) && !EqualsBuilder.reflectionEquals(trigger, oldTrigger))
)
.toList();
}

View File

@@ -201,6 +201,6 @@ public class InternalNamespace implements Namespace {
**/
@Override
public boolean delete(Path path) throws IOException {
return storage.delete(tenant, NamespaceFile.of(namespace, path).storagePath().toUri());
return storage.delete(tenant, URI.create(path.toString().replace("\\","/")));
}
}

View File

@@ -1,6 +1,8 @@
package io.kestra.core.storages;
import io.kestra.core.utils.WindowsUtils;
import jakarta.annotation.Nullable;
import org.apache.commons.io.FilenameUtils;
import java.net.URI;
import java.nio.file.Path;
@@ -47,8 +49,7 @@ public record NamespaceFile(
return of(namespace, (Path) null);
}
Path path = Path.of(uri.getPath());
Path path = Path.of(WindowsUtils.windowsToUnixPath(uri.getPath()));
final NamespaceFile namespaceFile;
if (uri.getScheme() != null) {
if (!uri.getScheme().equalsIgnoreCase("kestra")) {

View File

@@ -99,7 +99,7 @@ public final class PathMatcherPredicate implements Predicate<Path> {
} else {
pattern = mayAddRecursiveMatch(p);
}
syntaxAndPattern = SYNTAX_GLOB + pattern;
syntaxAndPattern = SYNTAX_GLOB + pattern.replace("\\", "/");
}
return syntaxAndPattern;
})
@@ -125,7 +125,7 @@ public final class PathMatcherPredicate implements Predicate<Path> {
}
private static String mayAddLeadingSlash(final String path) {
return path.startsWith("/") ? path : "/" + path;
return (path.startsWith("/") || path.startsWith("\\")) ? path : "/" + path;
}
public static boolean isPrefixWithSyntax(final String pattern) {

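The `pattern.replace("\\", "/")` fix matters because `java.nio` glob matchers treat a backslash as an escape character, so a Windows-style pattern would never compile to the intended matcher. A small sketch (hypothetical helper):

```java
import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.PathMatcher;

// Normalize backslashes before compiling a glob, as the diff does, so that
// Windows-style patterns match the same paths as their Unix equivalents.
public class GlobDemo {
    public static boolean matches(String pattern, String path) {
        PathMatcher matcher = FileSystems.getDefault()
            .getPathMatcher("glob:" + pattern.replace("\\", "/"));
        return matcher.matches(Path.of(path));
    }
}
```

Without the normalization, each `\x` pair in the pattern is read as an escaped literal rather than a path separator.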
View File

@@ -1,19 +1,30 @@
package io.kestra.core.utils;
import java.net.URI;
import java.util.regex.Matcher;
public class WindowsUtils {
public static String windowsToUnixPath(String path){
public static String windowsToUnixPath(String path, boolean startWithSlash) {
Matcher matcher = java.util.regex.Pattern.compile("([A-Za-z]:)").matcher(path);
String unixPath = matcher.replaceAll(m -> m.group().toLowerCase());
unixPath = unixPath
.replace("\\", "/")
.replace(":", "");
if (!unixPath.startsWith("/")) {
if (!unixPath.startsWith("/") && startWithSlash) {
unixPath = "/" + unixPath;
}
return unixPath;
}
public static String windowsToUnixPath(String path) {
return windowsToUnixPath(path, true);
}
public static URI windowsToUnixURI(URI uri) {
return URI.create(windowsToUnixPath(uri.toString(), false));
}
}
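For reference, the new two-argument `windowsToUnixPath` can be exercised standalone; this copy reproduces the logic from the diff so its behavior can be shown:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Standalone copy of the new two-argument windowsToUnixPath: lower-case the
// drive letter, drop the colon, flip backslashes, and optionally add a
// leading slash (the single-argument overload always adds it).
public class WindowsPaths {
    public static String windowsToUnixPath(String path, boolean startWithSlash) {
        Matcher matcher = Pattern.compile("([A-Za-z]:)").matcher(path);
        String unixPath = matcher.replaceAll(m -> m.group().toLowerCase());
        unixPath = unixPath.replace("\\", "/").replace(":", "");
        if (!unixPath.startsWith("/") && startWithSlash) {
            unixPath = "/" + unixPath;
        }
        return unixPath;
    }
}
```

The `startWithSlash = false` variant exists so `windowsToUnixURI` can normalize a full URI string without forcing a leading slash onto it.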

View File

@@ -66,8 +66,8 @@ public class Return extends Task implements RunnableTask<Return.Output> {
long end = System.nanoTime();
runContext
.metric(Counter.of("length", Optional.ofNullable(render).map(String::length).orElse(0), "format", render))
.metric(Timer.of("duration", Duration.ofNanos(end - start), "format", render));
.metric(Counter.of("length", Optional.ofNullable(render).map(String::length).orElse(0)))
.metric(Timer.of("duration", Duration.ofNanos(end - start)));
return Output.builder()
.value(render)

View File

@@ -19,6 +19,9 @@ import io.kestra.core.models.tasks.*;
import io.kestra.core.runners.*;
import io.kestra.core.serializers.FileSerde;
import io.kestra.core.services.StorageService;
import io.kestra.core.storages.FileAttributes;
import io.kestra.core.storages.StorageContext;
import io.kestra.core.storages.StorageInterface;
import io.kestra.core.storages.StorageSplitInterface;
import io.kestra.core.utils.GraphUtils;
import io.swagger.v3.oas.annotations.media.Schema;
@@ -580,23 +583,25 @@ public class ForEachItem extends Task implements FlowableTask<VoidOutput>, Child
return null;
}
Integer iterations = (Integer) taskOutput.get(ExecutableUtils.TASK_VARIABLE_NUMBER_OF_BATCHES);
String subflowOutputsBaseUri = (String) taskOutput.get(ExecutableUtils.TASK_VARIABLE_SUBFLOW_OUTPUTS_BASE_URI);
String subflowOutputsBase = (String) taskOutput.get(ExecutableUtils.TASK_VARIABLE_SUBFLOW_OUTPUTS_BASE_URI);
URI subflowOutputsBaseUri = URI.create(StorageContext.KESTRA_PROTOCOL + subflowOutputsBase + "/");
List<URI> outputsURIs = IntStream.rangeClosed(1, iterations)
.mapToObj(it -> "kestra://" + subflowOutputsBaseUri + "/" + it + "/outputs.ion")
.map(throwFunction(URI::create))
.filter(runContext.storage()::isFileExist)
.toList();
StorageInterface storage = ((DefaultRunContext) runContext).getApplicationContext().getBean(StorageInterface.class);
if (storage.exists(runContext.tenantId(), subflowOutputsBaseUri)) {
List<FileAttributes> list = storage.list(runContext.tenantId(), subflowOutputsBaseUri);
if (!outputsURIs.isEmpty()) {
// Merge outputs from each sub-flow into a single file stored in the internal storage.
List<InputStream> streams = outputsURIs.stream()
.map(throwFunction(runContext.storage()::getFile))
.toList();
try (InputStream is = new SequenceInputStream(Collections.enumeration(streams))) {
URI uri = runContext.storage().putFile(is, "outputs.ion");
return ForEachItemMergeOutputs.Output.builder().subflowOutputs(uri).build();
if (!list.isEmpty()) {
// Merge outputs from each sub-flow into a single file stored in the internal storage.
List<InputStream> streams = list.stream()
.map(throwFunction(attr -> {
URI file = subflowOutputsBaseUri.resolve(attr.getFileName() + "/outputs.ion");
return runContext.storage().getFile(file);
}))
.toList();
try (InputStream is = new SequenceInputStream(Collections.enumeration(streams))) {
URI uri = runContext.storage().putFile(is, "outputs.ion");
return ForEachItemMergeOutputs.Output.builder().subflowOutputs(uri).build();
}
}
}
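The merge step above concatenates each sub-flow's `outputs.ion` through a `SequenceInputStream` before writing one merged file. A minimal sketch of that concatenation (illustrative helper; the storage read/write calls are omitted):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.SequenceInputStream;
import java.io.UncheckedIOException;
import java.util.Collections;
import java.util.List;

// Concatenate every sub-flow's outputs stream into one logical stream, as
// the diff does before storing a single merged outputs file.
public class MergeOutputs {
    public static byte[] merge(List<? extends InputStream> streams) {
        try (InputStream is = new SequenceInputStream(Collections.enumeration(streams))) {
            return is.readAllBytes();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

`SequenceInputStream` reads each underlying stream to exhaustion in order, which is why the merged file is simply the sub-flow outputs back to back.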

View File

@@ -14,17 +14,13 @@ import io.kestra.core.models.hierarchies.RelationType;
import io.kestra.core.models.tasks.FlowableTask;
import io.kestra.core.models.tasks.ResolvedTask;
import io.kestra.core.models.tasks.Task;
import io.kestra.core.models.tasks.VoidOutput;
import io.kestra.core.runners.FlowableUtils;
import io.kestra.core.runners.RunContext;
import io.kestra.core.utils.GraphUtils;
import io.kestra.core.utils.ListUtils;
import io.kestra.core.utils.TruthUtils;
import io.swagger.v3.oas.annotations.media.Schema;
import lombok.EqualsAndHashCode;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.ToString;
import lombok.*;
import lombok.experimental.SuperBuilder;
import jakarta.validation.Valid;
@@ -50,12 +46,12 @@ import java.util.stream.Stream;
code = """
id: if
namespace: company.team
inputs:
- id: string
type: STRING
required: true
tasks:
- id: if
type: io.kestra.plugin.core.flow.If
@@ -73,7 +69,7 @@ import java.util.stream.Stream;
},
aliases = "io.kestra.core.tasks.flows.If"
)
public class If extends Task implements FlowableTask<VoidOutput> {
public class If extends Task implements FlowableTask<If.Output> {
@PluginProperty(dynamic = true)
@Schema(
title = "The `If` condition which can be any expression that evaluates to a boolean value.",
@@ -139,8 +135,18 @@ public class If extends Task implements FlowableTask<VoidOutput> {
@Override
public List<ResolvedTask> childTasks(RunContext runContext, TaskRun parentTaskRun) throws IllegalVariableEvaluationException {
String rendered = runContext.render(condition);
if (TruthUtils.isTruthy(rendered)) {
// We need to evaluate the condition only once, so that if the condition changes during the processing of a branch, the same branch is still taken.
// This can happen, for example, if the condition is based on a KV pair that is changed inside the branch.
// For this, we evaluate the condition in the outputs() method and read it back from the outputs.
// But unfortunately, the output may not have been computed yet in some cases, e.g. if the task is inside a flowable; in that case we compute the result anyway.
Boolean evaluationResult;
if (parentTaskRun.getOutputs() == null || parentTaskRun.getOutputs().get("evaluationResult") == null) {
evaluationResult = isTrue(runContext);
} else {
evaluationResult = (Boolean) parentTaskRun.getOutputs().get("evaluationResult");
}
if (Boolean.TRUE.equals(evaluationResult)) {
return FlowableUtils.resolveTasks(then, parentTaskRun);
}
return FlowableUtils.resolveTasks(_else, parentTaskRun);
@@ -173,4 +179,22 @@ public class If extends Task implements FlowableTask<VoidOutput> {
this.isAllowFailure()
);
}
@Override
public If.Output outputs(RunContext runContext) throws Exception {
Boolean evaluationResult = isTrue(runContext);
return If.Output.builder().evaluationResult(evaluationResult).build();
}
private Boolean isTrue(RunContext runContext) throws IllegalVariableEvaluationException {
String rendered = runContext.render(condition);
return TruthUtils.isTruthy(rendered);
}
@Builder
@Getter
public static class Output implements io.kestra.core.models.tasks.Output {
@Schema(title = "Condition evaluation result.")
public Boolean evaluationResult;
}
}
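The comments above describe the "evaluate once" contract: prefer the result cached in the task run's outputs and only re-evaluate when it is missing. A reduced sketch of that decision (names illustrative; the fresh evaluation stands in for `isTrue(runContext)`):

```java
import java.util.Map;

// Prefer the evaluationResult cached in the task run's outputs, and fall
// back to a fresh evaluation only when it is absent.
public class IfEvaluation {
    public static boolean branch(Map<String, ?> outputs, boolean freshEvaluation) {
        Object cached = outputs == null ? null : outputs.get("evaluationResult");
        return cached instanceof Boolean b ? b : freshEvaluation;
    }
}
```

Caching the boolean in the outputs pins the branch choice even if, for example, a KV value referenced by the condition is mutated inside the branch.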

View File

@@ -6,39 +6,22 @@ import io.kestra.core.models.annotations.Example;
import io.kestra.core.models.annotations.Plugin;
import io.kestra.core.models.annotations.PluginProperty;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.property.Property;
import io.kestra.core.models.executions.TaskRun;
import io.kestra.core.models.executions.TaskRunAttempt;
import io.kestra.core.models.flows.State;
import io.kestra.core.models.property.Property;
import io.kestra.core.models.tasks.ExecutableTask;
import io.kestra.core.models.tasks.Task;
import io.kestra.core.runners.ExecutableUtils;
import io.kestra.core.runners.FlowExecutorInterface;
import io.kestra.core.runners.FlowInputOutput;
import io.kestra.core.runners.DefaultRunContext;
import io.kestra.core.runners.RunContext;
import io.kestra.core.runners.SubflowExecution;
import io.kestra.core.runners.SubflowExecutionResult;
import io.kestra.core.runners.*;
import io.swagger.v3.oas.annotations.media.Schema;
import jakarta.validation.constraints.Min;
import lombok.experimental.SuperBuilder;
import lombok.Builder;
import lombok.EqualsAndHashCode;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.ToString;
import jakarta.validation.constraints.NotEmpty;
import jakarta.validation.constraints.NotNull;
import lombok.*;
import lombok.experimental.SuperBuilder;
import java.time.ZonedDateTime;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.*;
import java.util.stream.Collectors;
@SuperBuilder
@@ -147,7 +130,7 @@ public class Subflow extends Task implements ExecutableTask<Subflow.Output>, Chi
@Schema(
title = "Don't trigger the subflow now but schedule it on a specific date."
)
)
private Property<ZonedDateTime> scheduleDate;
@Override
@@ -168,6 +151,7 @@ public class Subflow extends Task implements ExecutableTask<Subflow.Output>, Chi
if (this.labels != null) {
for (Map.Entry<String, String> entry : this.labels.entrySet()) {
labels.removeIf(label -> label.key().equals(entry.getKey()));
labels.add(new Label(entry.getKey(), runContext.render(entry.getValue())));
}
}

View File

@@ -21,6 +21,10 @@ import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.net.URI;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.Map;
@@ -128,6 +132,9 @@ public class Download extends AbstractHttp implements RunnableTask<Download.Outp
String contentDisposition = builder.headers.get("Content-Disposition").getFirst();
filename = filenameFromHeader(runContext, contentDisposition);
}
if (filename != null) {
filename = URLEncoder.encode(filename, StandardCharsets.UTF_8);
}
builder.uri(runContext.storage().putFile(tempFile, filename));
@@ -137,21 +144,31 @@ public class Download extends AbstractHttp implements RunnableTask<Download.Outp
}
}
// Note: this is a naive basic implementation that may not cover all possible use cases.
// Note: this is a basic implementation that should cover all possible use cases.
// If this is not enough, we should find some helper method somewhere to cover all possible rules of the Content-Disposition header.
private String filenameFromHeader(RunContext runContext, String contentDisposition) {
try {
String[] parts = contentDisposition.split(" ");
// Content-Disposition parts are separated by ';'
String[] parts = contentDisposition.split(";");
String filename = null;
for (String part : parts) {
if (part.startsWith("filename")) {
filename = part.substring(part.lastIndexOf('=') + 2, part.length() - 1);
String stripped = part.strip();
if (stripped.startsWith("filename")) {
filename = stripped.substring(stripped.lastIndexOf('=') + 1);
}
if (part.startsWith("filename*")) {
if (stripped.startsWith("filename*")) {
// following https://datatracker.ietf.org/doc/html/rfc5987 the filename* should be <ENCODING>'(lang)'<filename>
filename = part.substring(part.lastIndexOf('\'') + 2, part.length() - 1);
filename = stripped.substring(stripped.lastIndexOf('\'') + 2, stripped.length() - 1);
}
}
// filename may be in double-quotes
if (filename != null && filename.charAt(0) == '"') {
filename = filename.substring(1, filename.length() - 1);
}
// if filename contains a path: use only the last part to avoid security issues due to host file overwriting
if (filename != null && filename.contains(File.separator)) {
filename = filename.substring(filename.lastIndexOf(File.separator) + 1);
}
return filename;
} catch (Exception e) {
// if we cannot parse the Content-Disposition header, we return null
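The parsing rules in this patch (split on `;`, prefer the RFC 5987 `filename*` form, strip surrounding quotes, keep only the last path segment) can be sketched in isolation. This is an illustrative standalone class, not Kestra's actual `Download` code; the class and method names are hypothetical:

```java
public class ContentDispositionSketch {
    // Extracts a filename from a Content-Disposition header value, following
    // the same rules as the patch: parts are ';'-separated, filename* (RFC 5987,
    // <encoding>'<lang>'<value>) wins over filename, quotes are stripped, and
    // any path prefix is dropped to avoid overwriting host files.
    public static String filenameFrom(String contentDisposition) {
        String filename = null;
        for (String part : contentDisposition.split(";")) {
            String stripped = part.strip();
            if (stripped.startsWith("filename*")) {
                // RFC 5987 ext-value: take everything after the last single quote
                filename = stripped.substring(stripped.lastIndexOf('\'') + 1);
            } else if (stripped.startsWith("filename")) {
                filename = stripped.substring(stripped.indexOf('=') + 1);
            }
        }
        // the filename may be wrapped in double quotes
        if (filename != null && filename.startsWith("\"") && filename.endsWith("\"")) {
            filename = filename.substring(1, filename.length() - 1);
        }
        // keep only the last path segment to prevent path-traversal filenames
        if (filename != null && filename.contains("/")) {
            filename = filename.substring(filename.lastIndexOf('/') + 1);
        }
        return filename;
    }
}
```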

View File

@@ -16,6 +16,7 @@ import lombok.experimental.SuperBuilder;
import org.slf4j.Logger;
import java.time.Duration;
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;
@@ -67,7 +68,7 @@ import java.util.Optional;
- id: log_response
type: io.kestra.plugin.core.log.Log
message: '{{ trigger.body }}'
triggers:
- id: http
type: io.kestra.plugin.core.http.Trigger
@@ -154,12 +155,12 @@ public class Trigger extends AbstractTrigger implements PollingTriggerInterface,
Object body = this.encryptBody
? runContext.decrypt(output.getEncryptedBody().getValue())
: output.getBody();
Map<String, Object> responseVariables = Map.of("response", Map.of(
"statusCode", output.getCode(),
"body", body,
"headers", output.getHeaders()
)
);
Map<String, Object> response = new HashMap<>();
response.put("statusCode", output.getCode());
response.put("body", body); // body can be null so we need a null-friendly map
response.put("headers", output.getHeaders());
Map<String, Object> responseVariables = Map.of("response", response);
var renderedCondition = runContext.render(this.responseCondition, responseVariables);
if (TruthUtils.isTruthy(renderedCondition)) {
Execution execution = TriggerService.generateExecution(this, conditionContext, context, output);
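The switch from `Map.of` to `HashMap` matters because `Map.of` rejects `null` values outright, while a response body can legitimately be `null`. A minimal demonstration (class and method names are illustrative, not part of Kestra):

```java
import java.util.HashMap;
import java.util.Map;

public class NullFriendlyMapDemo {
    // Map.of throws NullPointerException for null keys or values.
    public static boolean mapOfAcceptsNull() {
        try {
            Map.of("body", (Object) null);
            return true;
        } catch (NullPointerException e) {
            return false;
        }
    }

    // HashMap stores null values without complaint, which is why the trigger
    // builds its response variables this way.
    public static Map<String, Object> buildResponse(Integer statusCode, Object body) {
        Map<String, Object> response = new HashMap<>();
        response.put("statusCode", statusCode);
        response.put("body", body); // body may be null here
        return response;
    }
}
```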

View File

@@ -20,6 +20,7 @@ import lombok.experimental.SuperBuilder;
import org.slf4j.Logger;
import java.net.URI;
import java.nio.file.Path;
import java.util.List;
import java.util.Map;
@@ -103,7 +104,7 @@ public class DeleteFiles extends Task implements RunnableTask<DeleteFiles.Output
long count = matched
.stream()
.map(Rethrow.throwFunction(file -> {
if (namespace.delete(file)) {
if (namespace.delete(NamespaceFile.of(renderedNamespace, Path.of(file.path().replace("\\","/"))).storagePath())) {
logger.debug(String.format("Deleted %s", (file.path())));
return true;
}

View File

@@ -23,6 +23,7 @@ import java.util.Map;
You can use this task to return some outputs and pass them to downstream tasks.
It's helpful for parsing and returning values from a task. You can then access these outputs in your downstream tasks
using the expression `{{ outputs.mytask_id.values.my_output_name }}` and you can see them in the Outputs tab.
The values can be strings, numbers, arrays, or any valid JSON object.
"""
)
@Plugin(
@@ -39,6 +40,11 @@ tasks:
values:
taskrun_data: "{{ task.id }} > {{ taskrun.startDate }}"
execution_data: "{{ flow.id }} > {{ execution.startDate }}"
number_value: 42
array_value: ["{{ task.id }}", "{{ flow.id }}", "static value"]
nested_object:
key1: "value1"
key2: "{{ execution.id }}"
- id: log_values
type: io.kestra.plugin.core.log.Log
@@ -51,15 +57,16 @@ tasks:
)
public class OutputValues extends Task implements RunnableTask<OutputValues.Output> {
@Schema(
title = "The templated strings to render."
title = "The templated strings to render.",
description = "These values can be strings, numbers, arrays, or objects. Templated strings (enclosed in {{ }}) will be rendered using the current context."
)
private HashMap<String, String> values;
private HashMap<String, Object> values;
@Override
public OutputValues.Output run(RunContext runContext) throws Exception {
return OutputValues.Output.builder()
.values(runContext.renderMap(values))
.values(runContext.render(values))
.build();
}
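Widening `values` from `HashMap<String, String>` to `HashMap<String, Object>` means rendering must now walk nested maps and lists, rendering only the string leaves and passing numbers and booleans through unchanged. A rough sketch of that behavior, assuming nothing about Kestra's actual `RunContext.render` implementation:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.UnaryOperator;

public class RenderSketch {
    // Recursively renders templated String leaves of an arbitrary value tree;
    // non-string leaves (numbers, booleans, null) are returned as-is.
    public static Object render(Object value, UnaryOperator<String> renderString) {
        if (value instanceof String s) {
            return renderString.apply(s);
        }
        if (value instanceof Map<?, ?> m) {
            Map<Object, Object> out = new LinkedHashMap<>();
            m.forEach((k, v) -> out.put(k, render(v, renderString)));
            return out;
        }
        if (value instanceof List<?> l) {
            return l.stream().map(v -> render(v, renderString)).toList();
        }
        return value;
    }
}
```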
@@ -69,6 +76,6 @@ public class OutputValues extends Task implements RunnableTask<OutputValues.Outp
@Schema(
title = "The generated values."
)
private Map<String, String> values;
private Map<String, Object> values;
}
}

View File

@@ -0,0 +1,61 @@
package io.kestra.core.models.collectors;
import io.kestra.core.server.Service;
import io.kestra.core.server.ServiceInstance;
import io.kestra.core.utils.IdUtils;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import java.time.Duration;
import java.time.Instant;
import java.time.LocalDate;
import java.time.ZoneId;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Set;
class ServiceUsageTest {
@Test
void shouldGetDailyUsage() {
// Given
LocalDate now = LocalDate.now();
LocalDate start = now.withDayOfMonth(1);
LocalDate end = start.withDayOfMonth(start.getMonth().length(start.isLeapYear()));
List<ServiceInstance> instances = new ArrayList<>();
while (start.toEpochDay() < end.toEpochDay()) {
Instant createAt = start.atStartOfDay(ZoneId.systemDefault()).toInstant();
Instant updatedAt = start.atStartOfDay(ZoneId.systemDefault()).plus(Duration.ofHours(10)).toInstant();
ServiceInstance instance = new ServiceInstance(
IdUtils.create(),
Service.ServiceType.WORKER,
Service.ServiceState.EMPTY,
null,
createAt,
updatedAt,
List.of(),
null,
Map.of(),
Set.of()
);
instance = instance
.state(Service.ServiceState.RUNNING, createAt)
.state(Service.ServiceState.NOT_RUNNING, updatedAt);
instances.add(instance);
start = start.plusDays(1);
}
// When
ServiceUsage.DailyServiceStatistics statistics = ServiceUsage.of(
Service.ServiceType.WORKER,
Duration.ofMinutes(15),
instances
);
// Then
Assertions.assertEquals(instances.size(), statistics.values().size());
}
}

View File

@@ -1,16 +1,32 @@
package io.kestra.core.runners;
import io.kestra.core.encryption.EncryptionService;
import io.kestra.core.exceptions.IllegalVariableEvaluationException;
import io.kestra.core.models.tasks.common.EncryptedString;
import io.micronaut.context.ApplicationContext;
import io.micronaut.context.annotation.Value;
import io.micronaut.test.extensions.junit5.annotation.MicronautTest;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import java.security.GeneralSecurityException;
import java.util.Map;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.is;
@MicronautTest
class DefaultRunContextTest {
@Inject
ApplicationContext applicationContext;
private ApplicationContext applicationContext;
@Value("${kestra.encryption.secret-key}")
private String secretKey;
@Inject
private RunContextFactory runContextFactory;
@Test
void shouldGetKestraVersion() {
@@ -18,4 +34,16 @@ class DefaultRunContextTest {
runContext.init(applicationContext);
Assertions.assertNotNull(runContext.version());
}
@Test
void shouldDecryptVariables() throws GeneralSecurityException, IllegalVariableEvaluationException {
RunContext runContext = runContextFactory.of();
String encryptedSecret = EncryptionService.encrypt(secretKey, "It's a secret");
Map<String, Object> variables = Map.of("test", "test",
"secret", Map.of("type", EncryptedString.TYPE, "value", encryptedSecret));
String render = runContext.render("What ? {{secret}}", variables);
assertThat(render, is("What ? It's a secret"));
}
}

View File

@@ -4,6 +4,7 @@ import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.DependsOn;
import io.kestra.core.models.flows.Input;
import io.kestra.core.models.flows.Type;
import io.kestra.core.models.flows.input.FileInput;
import io.kestra.core.models.flows.input.InputAndValue;
import io.kestra.core.models.flows.input.StringInput;
@@ -198,35 +199,22 @@ class FlowInputOutputTest {
}
@Test
void shouldDeleteFileInputAfterValidationGivenDeleteTrue() throws IOException {
void shouldNotUploadFileInputAfterValidation() throws IOException {
// Given
FileInput input = FileInput.builder()
FileInput input = FileInput
.builder()
.id("input")
.type(Type.FILE)
.build();
Publisher<CompletedPart> data = Mono.just(new MemoryCompletedFileUpload("input", "input", "???".getBytes(StandardCharsets.UTF_8)));
// When
List<InputAndValue> values = flowInputOutput.validateExecutionInputs(List.of(input), DEFAULT_TEST_EXECUTION, data, true);
List<InputAndValue> values = flowInputOutput.validateExecutionInputs(List.of(input), DEFAULT_TEST_EXECUTION, data).block();
// Then
Assertions.assertFalse(storageInterface.exists(null, URI.create(values.get(0).value().toString())));
}
@Test
void shouldNotDeleteFileInputAfterValidationGivenDeleteFalse() throws IOException {
// Given
FileInput input = FileInput.builder()
.id("input")
.build();
Publisher<CompletedPart> data = Mono.just(new MemoryCompletedFileUpload("input", "input", "???".getBytes(StandardCharsets.UTF_8)));
// When
List<InputAndValue> values = flowInputOutput.validateExecutionInputs(List.of(input), DEFAULT_TEST_EXECUTION, data, false);
// Then
Assertions.assertTrue(storageInterface.exists(null, URI.create(values.get(0).value().toString())));
Assertions.assertNull(values.getFirst().exception());
Assertions.assertFalse(storageInterface.exists(null, URI.create(values.getFirst().value().toString())));
}
private static final class MemoryCompletedFileUpload implements CompletedFileUpload {
@@ -285,5 +273,9 @@ class FlowInputOutputTest {
public boolean isComplete() {
return true;
}
@Override
public void discard() {
}
}
}

View File

@@ -3,16 +3,23 @@ package io.kestra.core.runners;
import com.google.common.collect.ImmutableMap;
import com.google.common.io.CharStreams;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.executions.LogEntry;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.State;
import io.kestra.core.queues.QueueException;
import io.kestra.core.queues.QueueFactoryInterface;
import io.kestra.core.queues.QueueInterface;
import io.kestra.core.repositories.FlowRepositoryInterface;
import io.kestra.core.storages.StorageInterface;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import org.jcodings.util.Hash;
import org.junit.jupiter.api.Test;
import jakarta.validation.ConstraintViolationException;
import reactor.core.publisher.Flux;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
@@ -22,17 +29,19 @@ import java.time.Duration;
import java.time.Instant;
import java.time.LocalDate;
import java.time.LocalTime;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.*;
import java.util.concurrent.TimeoutException;
import java.util.function.Consumer;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.*;
import static org.junit.jupiter.api.Assertions.assertThrows;
public class InputsTest extends AbstractMemoryRunnerTest {
@Inject
@Named(QueueFactoryInterface.WORKERTASKLOG_NAMED)
private QueueInterface<LogEntry> logQueue;
public static Map<String, Object> inputs = ImmutableMap.<String, Object>builder()
.put("string", "myString")
.put("enum", "ENUM_VALUE")
@@ -351,4 +360,22 @@ public class InputsTest extends AbstractMemoryRunnerTest {
assertThat(((Map<?, ?>) execution.getInputs().get("json")).size(), is(0));
assertThat((String) execution.findTaskRunsByTaskId("jsonOutput").getFirst().getOutputs().get("value"), is("{}"));
}
@Test
void shouldNotLogSecretInput() throws TimeoutException, QueueException {
Flux<LogEntry> receive = TestsUtils.receive(logQueue, l -> {});
Execution execution = runnerUtils.runOne(
null,
"io.kestra.tests",
"input-log-secret"
);
assertThat(execution.getTaskRunList(), hasSize(1));
assertThat(execution.getState().getCurrent(), is(State.Type.SUCCESS));
var logEntry = receive.blockLast();
assertThat(logEntry, notNullValue());
assertThat(logEntry.getMessage(), is("This is my secret: ********"));
}
}

View File

@@ -18,14 +18,14 @@ import java.time.ZonedDateTime;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import static io.kestra.core.utils.Rethrow.throwConsumer;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.is;
import static org.mockito.Mockito.doReturn;
import static org.mockito.Mockito.spy;
import static org.mockito.Mockito.*;
class SchedulerConditionTest extends AbstractSchedulerTest {
@Inject
@@ -34,6 +34,9 @@ class SchedulerConditionTest extends AbstractSchedulerTest {
@Inject
protected SchedulerTriggerStateInterface triggerState;
@Inject
protected SchedulerExecutionStateInterface executionState;
private static Flow createScheduleFlow() {
Schedule schedule = Schedule.builder()
.id("hourly")
@@ -59,6 +62,7 @@ class SchedulerConditionTest extends AbstractSchedulerTest {
void schedule() throws Exception {
// mock flow listeners
FlowListeners flowListenersServiceSpy = spy(this.flowListenersService);
SchedulerExecutionStateInterface executionRepositorySpy = spy(this.executionState);
CountDownLatch queueCount = new CountDownLatch(4);
Flow flow = createScheduleFlow();
@@ -75,6 +79,11 @@ class SchedulerConditionTest extends AbstractSchedulerTest {
.when(flowListenersServiceSpy)
.flows();
// mock the backfill execution is ended
doAnswer(invocation -> Optional.of(Execution.builder().state(new State().withState(State.Type.SUCCESS)).build()))
.when(executionRepositorySpy)
.findById(any(), any());
// scheduler
try (AbstractScheduler scheduler = new JdbcScheduler(
applicationContext,
@@ -95,7 +104,7 @@ class SchedulerConditionTest extends AbstractSchedulerTest {
}));
scheduler.run();
queueCount.await(30, TimeUnit.SECONDS);
queueCount.await(15, TimeUnit.SECONDS);
receive.blockLast();

View File

@@ -40,6 +40,9 @@ public class SchedulerPollingTriggerTest extends AbstractSchedulerTest {
@Inject
private SchedulerTriggerStateInterface triggerState;
@Inject
private SchedulerExecutionState schedulerExecutionState;
@Inject
private FlowListeners flowListenersService;

View File

@@ -40,6 +40,9 @@ public class SchedulerScheduleTest extends AbstractSchedulerTest {
@Inject
protected SchedulerTriggerStateInterface triggerState;
@Inject
protected SchedulerExecutionStateInterface executionState;
@Inject
@Named(QueueFactoryInterface.WORKERTASKLOG_NAMED)
protected QueueInterface<LogEntry> logQueue;
@@ -67,7 +70,7 @@ public class SchedulerScheduleTest extends AbstractSchedulerTest {
.truncatedTo(ChronoUnit.HOURS);
}
protected AbstractScheduler scheduler(FlowListeners flowListenersServiceSpy) {
protected AbstractScheduler scheduler(FlowListeners flowListenersServiceSpy, SchedulerExecutionStateInterface executionStateSpy) {
return new JdbcScheduler(
applicationContext,
flowListenersServiceSpy
@@ -79,6 +82,7 @@ public class SchedulerScheduleTest extends AbstractSchedulerTest {
void schedule() throws Exception {
// mock flow listeners
FlowListeners flowListenersServiceSpy = spy(this.flowListenersService);
SchedulerExecutionStateInterface executionStateSpy = spy(this.executionState);
CountDownLatch queueCount = new CountDownLatch(6);
CountDownLatch invalidLogCount = new CountDownLatch(1);
Set<String> date = new HashSet<>();
@@ -113,7 +117,7 @@ public class SchedulerScheduleTest extends AbstractSchedulerTest {
triggerState.create(trigger.toBuilder().triggerId("schedule-invalid").flowId(invalid.getId()).build());
// scheduler
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy)) {
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy, executionStateSpy)) {
// wait for execution
Flux<Execution> receiveExecutions = TestsUtils.receive(executionQueue, throwConsumer(either -> {
Execution execution = either.getLeft();
@@ -173,7 +177,7 @@ public class SchedulerScheduleTest extends AbstractSchedulerTest {
triggerState.create(trigger);
// scheduler
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy)) {
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy, executionState)) {
scheduler.run();
Await.until(() -> {
@@ -207,7 +211,7 @@ public class SchedulerScheduleTest extends AbstractSchedulerTest {
CountDownLatch queueCount = new CountDownLatch(1);
// scheduler
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy)) {
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy, executionState)) {
// wait for execution
Flux<Execution> receive = TestsUtils.receive(executionQueue, either -> {
Execution execution = either.getLeft();
@@ -252,7 +256,7 @@ public class SchedulerScheduleTest extends AbstractSchedulerTest {
CountDownLatch queueCount = new CountDownLatch(1);
// scheduler
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy)) {
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy, executionState)) {
// wait for execution
Flux<Execution> receive = TestsUtils.receive(executionQueue, either -> {
Execution execution = either.getLeft();
@@ -296,7 +300,7 @@ public class SchedulerScheduleTest extends AbstractSchedulerTest {
triggerState.create(lastTrigger);
// scheduler
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy)) {
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy, executionState)) {
scheduler.run();
Await.until(() -> scheduler.isReady(), Duration.ofMillis(100), Duration.ofSeconds(5));
@@ -327,7 +331,7 @@ public class SchedulerScheduleTest extends AbstractSchedulerTest {
.build();
// scheduler
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy)) {
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy, executionState)) {
scheduler.run();
Await.until(() -> {
@@ -392,7 +396,7 @@ public class SchedulerScheduleTest extends AbstractSchedulerTest {
triggerState.create(trigger);
// scheduler
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy)) {
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy, executionState)) {
scheduler.run();
// Wait 3s to see if things happen
@@ -430,7 +434,7 @@ public class SchedulerScheduleTest extends AbstractSchedulerTest {
CountDownLatch queueCount = new CountDownLatch(2);
// scheduler
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy)) {
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy, executionState)) {
// wait for execution
Flux<Execution> receive = TestsUtils.receive(executionQueue, throwConsumer(either -> {
Execution execution = either.getLeft();
@@ -490,7 +494,7 @@ public class SchedulerScheduleTest extends AbstractSchedulerTest {
CountDownLatch queueCount = new CountDownLatch(1);
// scheduler
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy)) {
try (AbstractScheduler scheduler = scheduler(flowListenersServiceSpy, executionState)) {
// wait for execution
Flux<Execution> receive = TestsUtils.receive(executionQueue, either -> {
Execution execution = either.getLeft();

View File

@@ -19,21 +19,21 @@ import reactor.core.publisher.Flux;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import static io.kestra.core.utils.Rethrow.throwConsumer;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.*;
import static org.mockito.Mockito.doReturn;
import static org.mockito.Mockito.spy;
import static org.mockito.Mockito.*;
public class SchedulerThreadTest extends AbstractSchedulerTest {
@Inject
protected FlowListeners flowListenersService;
@Inject
protected SchedulerTriggerStateInterface triggerState;
protected SchedulerExecutionStateInterface executionState;
@Test
void thread() throws Exception {
@@ -54,12 +54,17 @@ public class SchedulerThreadTest extends AbstractSchedulerTest {
// mock flow listeners
FlowListeners flowListenersServiceSpy = spy(this.flowListenersService);
SchedulerExecutionStateInterface schedulerExecutionStateSpy = spy(this.executionState);
doReturn(Collections.singletonList(flow))
.when(flowListenersServiceSpy)
.flows();
// mock the backfill execution is ended
doAnswer(invocation -> Optional.of(Execution.builder().state(new State().withState(State.Type.SUCCESS)).build()))
.when(schedulerExecutionStateSpy)
.findById(any(), any());
// scheduler
try (
AbstractScheduler scheduler = new JdbcScheduler(

View File

@@ -123,10 +123,10 @@ class YamlFlowParserTest {
void inputs() {
Flow flow = this.parse("flows/valids/inputs.yaml");
assertThat(flow.getInputs().size(), is(28));
assertThat(flow.getInputs().stream().filter(Input::getRequired).count(), is(10L));
assertThat(flow.getInputs().size(), is(29));
assertThat(flow.getInputs().stream().filter(Input::getRequired).count(), is(11L));
assertThat(flow.getInputs().stream().filter(r -> !r.getRequired()).count(), is(18L));
assertThat(flow.getInputs().stream().filter(r -> r.getDefaults() != null).count(), is(2L));
assertThat(flow.getInputs().stream().filter(r -> r.getDefaults() != null).count(), is(3L));
assertThat(flow.getInputs().stream().filter(r -> r instanceof StringInput && ((StringInput)r).getValidator() != null).count(), is(1L));
}

View File

@@ -26,7 +26,7 @@ public class OutputValuesTest extends AbstractMemoryRunnerTest {
assertThat(execution.getState().getCurrent(), is(State.Type.SUCCESS));
assertThat(execution.getTaskRunList(), hasSize(1));
TaskRun outputValues = execution.getTaskRunList().getFirst();
Map<String, String> values = (Map<String, String>) outputValues.getOutputs().get("values");
Map<String, Object> values = (Map<String, Object>) outputValues.getOutputs().get("values");
assertThat(values.get("output1"), is("xyz"));
assertThat(values.get("output2"), is("abc"));
}

View File

@@ -86,4 +86,14 @@ class IfTest extends AbstractMemoryRunnerTest {
assertThat(execution.findTaskRunsByTaskId("when-true").isEmpty(), is(true));
assertThat(execution.getState().getCurrent(), is(State.Type.SUCCESS));
}
@Test
void ifInFlowable() throws TimeoutException, QueueException {
Execution execution = runnerUtils.runOne(null, "io.kestra.tests", "if-in-flowable", null,
(f, e) -> Map.of("param", true), Duration.ofSeconds(120));
assertThat(execution.getTaskRunList(), hasSize(8));
assertThat(execution.findTaskRunsByTaskId("after_if").getFirst().getState().getCurrent(), is(State.Type.SUCCESS));
assertThat(execution.getState().getCurrent(), is(State.Type.SUCCESS));
}
}

View File

@@ -216,7 +216,7 @@ public class PauseTest extends AbstractMemoryRunnerTest {
flow,
State.Type.RUNNING,
Flux.just(part1, part2)
);
).block();
execution = runnerUtils.awaitExecution(
e -> e.getId().equals(executionId) && e.getState().getCurrent() == State.Type.SUCCESS,
@@ -243,7 +243,7 @@ public class PauseTest extends AbstractMemoryRunnerTest {
ConstraintViolationException e = assertThrows(
ConstraintViolationException.class,
() -> executionService.resume(execution, flow, State.Type.RUNNING, Mono.empty())
() -> executionService.resume(execution, flow, State.Type.RUNNING, Mono.empty()).block()
);
assertThat(e.getMessage(), containsString("Invalid input for `asked`, missing required input, but received `null`"));

View File

@@ -1,6 +1,7 @@
package io.kestra.plugin.core.http;
import com.google.common.collect.ImmutableMap;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.runners.RunContext;
import io.kestra.core.runners.RunContextFactory;
import io.kestra.core.storages.StorageInterface;
@@ -12,19 +13,16 @@ import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Get;
import io.micronaut.http.client.exceptions.HttpClientResponseException;
import io.micronaut.runtime.server.EmbeddedServer;
import io.kestra.core.junit.annotations.KestraTest;
import jakarta.inject.Inject;
import org.apache.commons.io.IOUtils;
import org.junit.jupiter.api.Test;
import java.io.IOException;
import java.net.URI;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.endsWith;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.*;
import static org.junit.jupiter.api.Assertions.assertDoesNotThrow;
import static org.junit.jupiter.api.Assertions.assertThrows;
@@ -138,6 +136,25 @@ class DownloadTest {
assertThat(output.getUri().toString(), endsWith("filename.jpg"));
}
@Test
void contentDispositionWithPath() throws Exception {
EmbeddedServer embeddedServer = applicationContext.getBean(EmbeddedServer.class);
embeddedServer.start();
Download task = Download.builder()
.id(DownloadTest.class.getSimpleName())
.type(DownloadTest.class.getName())
.uri(embeddedServer.getURI() + "/content-disposition")
.build();
RunContext runContext = TestsUtils.mockRunContext(this.runContextFactory, task, ImmutableMap.of());
Download.Output output = task.run(runContext);
assertThat(output.getUri().toString(), not(containsString("/secure-path/")));
assertThat(output.getUri().toString(), endsWith("filename.jpg"));
}
@Controller()
public static class SlackWebController {
@Get("500")
@@ -155,5 +172,11 @@ class DownloadTest {
return HttpResponse.ok("Hello World".getBytes())
.header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"filename.jpg\"");
}
@Get("content-disposition-path")
public HttpResponse<byte[]> contentDispositionWithPath() {
return HttpResponse.ok("Hello World".getBytes())
.header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"/secure-path/filename.jpg\"");
}
}
}

View File

@@ -0,0 +1,22 @@
id: if-in-flowable
namespace: io.kestra.tests
inputs:
- id: param
type: STRING
tasks:
- id: for_each
type: io.kestra.plugin.core.flow.ForEach
values: ["value 1", "value 2", "value 3"]
tasks:
- id: before_if
type: io.kestra.plugin.core.debug.Return
format: "Before if: {{ taskrun.value }}"
- id: if
type: io.kestra.plugin.core.flow.If
condition: "{{ taskrun.value equals 'value 2' }}"
then:
- id: after_if
type: io.kestra.plugin.core.debug.Return
format: "After if: {{ parent.taskrun.value }}"

View File

@@ -0,0 +1,12 @@
id: input-log-secret
namespace: io.kestra.tests
inputs:
- id: secret
type: SECRET
defaults: password
tasks:
- id: log-secret
type: io.kestra.plugin.core.log.Log
message: "This is my secret: {{inputs.secret}}"

View File

@@ -104,6 +104,11 @@ inputs:
value: value1
- key: key2
value: value2
# A required input with an empty default value only works if default values are correctly serialized, which is what this input tests.
- name: empty
type: STRING
defaults: ''
required: true
tasks:
- id: string

View File

@@ -1,5 +1,5 @@
version=0.19.0-SNAPSHOT
version=0.19.25
org.gradle.parallel=true
org.gradle.caching=true
org.gradle.priority=low
org.gradle.priority=low

View File

@@ -2,12 +2,13 @@ package io.kestra.schedulers.h2;
import io.kestra.core.runners.FlowListeners;
import io.kestra.core.schedulers.AbstractScheduler;
import io.kestra.core.schedulers.SchedulerExecutionStateInterface;
import io.kestra.core.schedulers.SchedulerScheduleTest;
import io.kestra.jdbc.runner.JdbcScheduler;
class H2SchedulerScheduleTest extends SchedulerScheduleTest {
@Override
protected AbstractScheduler scheduler(FlowListeners flowListenersServiceSpy) {
protected AbstractScheduler scheduler(FlowListeners flowListenersServiceSpy, SchedulerExecutionStateInterface executionStateSpy) {
return new JdbcScheduler(
applicationContext,
flowListenersServiceSpy

View File

@@ -2,12 +2,13 @@ package io.kestra.schedulers.mysql;
import io.kestra.core.runners.FlowListeners;
import io.kestra.core.schedulers.AbstractScheduler;
import io.kestra.core.schedulers.SchedulerExecutionStateInterface;
import io.kestra.core.schedulers.SchedulerScheduleTest;
import io.kestra.jdbc.runner.JdbcScheduler;
class MysqlSchedulerScheduleTest extends SchedulerScheduleTest {
@Override
protected AbstractScheduler scheduler(FlowListeners flowListenersServiceSpy) {
protected AbstractScheduler scheduler(FlowListeners flowListenersServiceSpy, SchedulerExecutionStateInterface executionStateSpy) {
return new JdbcScheduler(
applicationContext,
flowListenersServiceSpy

View File

@@ -2,12 +2,13 @@ package io.kestra.schedulers.postgres;
import io.kestra.core.runners.FlowListeners;
import io.kestra.core.schedulers.AbstractScheduler;
import io.kestra.core.schedulers.SchedulerExecutionStateInterface;
import io.kestra.core.schedulers.SchedulerScheduleTest;
import io.kestra.jdbc.runner.JdbcScheduler;
class PostgresSchedulerScheduleTest extends SchedulerScheduleTest {
@Override
protected AbstractScheduler scheduler(FlowListeners flowListenersServiceSpy) {
protected AbstractScheduler scheduler(FlowListeners flowListenersServiceSpy, SchedulerExecutionStateInterface executionStateSpy) {
return new JdbcScheduler(
applicationContext,
flowListenersServiceSpy

View File

@@ -8,6 +8,8 @@ import org.jooq.ExecuteContext;
import org.jooq.ExecuteListener;
import java.time.Duration;
import java.util.ArrayList;
import java.util.List;
import javax.sql.DataSource;
import jakarta.validation.constraints.NotNull;
@@ -31,7 +33,17 @@ public class JooqExecuteListenerFactory {
public void executeEnd(ExecuteContext ctx) {
Duration duration = Duration.ofMillis(System.currentTimeMillis() - startTime);
metricRegistry.timer(MetricRegistry.JDBC_QUERY_DURATION, "sql", ctx.sql())
List<String> tags = new ArrayList<>();
tags.add("batch");
tags.add(ctx.batchMode().name());
// in batch query, the query will be expanded without parameters, and will lead to overflow of metrics
if (ctx.batchMode() != ExecuteContext.BatchMode.MULTIPLE) {
tags.add("sql");
tags.add(ctx.sql());
}
metricRegistry.timer(MetricRegistry.JDBC_QUERY_DURATION, tags.toArray(new String[0]))
.record(duration);
if (log.isTraceEnabled()) {
@@ -44,5 +56,4 @@ public class JooqExecuteListenerFactory {
}
};
}
}

View File

@@ -350,7 +350,10 @@ public abstract class AbstractJdbcLogRepository extends AbstractJdbcRepository i
DSLContext context = DSL.using(configuration);
return context.delete(this.jdbcRepository.getTable())
.where(field("execution_id", String.class).eq(execution.getId()))
// The deleted field is not used, so it will always be false.
// We add it here to be sure to use the correct index.
.where(field("deleted", Boolean.class).eq(false))
.and(field("execution_id", String.class).eq(execution.getId()))
.execute();
});
}
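The extra `deleted = false` predicate above never changes which rows are deleted; its only purpose is to match the leading column of a composite index so the database uses it instead of a scan. A sketch of the resulting statement shape (table name is an assumption for illustration):

```java
public class PurgeQuery {
    // builds the same WHERE shape as the repository delete above;
    // "deleted" is always false for live rows, but naming it lets the
    // planner pick a composite (deleted, execution_id) index
    static String deleteByExecutionId(String table) {
        return "DELETE FROM " + table
            + " WHERE deleted = FALSE"
            + " AND execution_id = ?";
    }
}
```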


@@ -150,7 +150,10 @@ public abstract class AbstractJdbcMetricRepository extends AbstractJdbcRepositor
DSLContext context = DSL.using(configuration);
return context.delete(this.jdbcRepository.getTable())
.where(field("execution_id", String.class).eq(execution.getId()))
// The deleted field is not used, so it will always be false.
// We add it here to make sure the correct index is used.
.where(field("deleted", Boolean.class).eq(false))
.and(field("execution_id", String.class).eq(execution.getId()))
.execute();
});
}
@@ -168,8 +171,7 @@ public abstract class AbstractJdbcMetricRepository extends AbstractJdbcRepositor
.getDslContextWrapper()
.transactionResult(configuration -> {
DSLContext context = DSL.using(configuration);
SelectConditionStep<Record1<Object>> select = DSL
.using(configuration)
SelectConditionStep<Record1<Object>> select = context
.selectDistinct(field(field))
.from(this.jdbcRepository.getTable())
.where(this.defaultFilter(tenantId));
@@ -185,8 +187,7 @@ public abstract class AbstractJdbcMetricRepository extends AbstractJdbcRepositor
.getDslContextWrapper()
.transactionResult(configuration -> {
DSLContext context = DSL.using(configuration);
SelectConditionStep<Record1<Object>> select = DSL
.using(configuration)
SelectConditionStep<Record1<Object>> select = context
.select(field("value"))
.from(this.jdbcRepository.getTable())
.where(this.defaultFilter(tenantId));


@@ -214,7 +214,6 @@ public abstract class AbstractJdbcTriggerRepository extends AbstractJdbcReposito
Trigger current = optionalTrigger.get();
current = current.toBuilder()
.executionId(trigger.getExecutionId())
.executionCurrentState(trigger.getExecutionCurrentState())
.updatedDate(trigger.getUpdatedDate())
.build();
this.save(context, current);


@@ -4,6 +4,7 @@ import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.common.base.CaseFormat;
import io.kestra.core.exceptions.DeserializationException;
import io.kestra.core.metrics.MetricRegistry;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.queues.QueueException;
import io.kestra.core.queues.QueueInterface;
@@ -64,6 +65,8 @@ public abstract class JdbcQueue<T> implements QueueInterface<T> {
protected final MessageProtectionConfiguration messageProtectionConfiguration;
private final MetricRegistry metricRegistry;
protected final Table<Record> table;
protected final JdbcQueueIndexer jdbcQueueIndexer;
@@ -80,6 +83,7 @@ public abstract class JdbcQueue<T> implements QueueInterface<T> {
this.dslContextWrapper = applicationContext.getBean(JooqDSLContextWrapper.class);
this.configuration = applicationContext.getBean(Configuration.class);
this.messageProtectionConfiguration = applicationContext.getBean(MessageProtectionConfiguration.class);
this.metricRegistry = applicationContext.getBean(MetricRegistry.class);
JdbcTableConfigs jdbcTableConfigs = applicationContext.getBean(JdbcTableConfigs.class);
@@ -97,6 +101,10 @@ public abstract class JdbcQueue<T> implements QueueInterface<T> {
}
if (messageProtectionConfiguration.enabled && bytes.length >= messageProtectionConfiguration.limit) {
metricRegistry
.counter(MetricRegistry.QUEUE_BIG_MESSAGE_COUNT, MetricRegistry.TAG_CLASS_NAME, cls.getName())
.increment();
// we let terminated execution messages go through anyway
if (!(message instanceof Execution execution) || !execution.getState().isTerminated()) {
throw new MessageTooBigException("Message of size " + bytes.length + " has exceeded the configured limit of " + messageProtectionConfiguration.limit);
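The guard above rejects oversized messages but deliberately lets a terminated execution through, so a flow can still reach its final state even when that last payload exceeds the configured limit. A self-contained sketch of the condition (types and names are illustrative, not the JdbcQueue API):

```java
public class MessageGuard {
    enum State {
        RUNNING, SUCCESS, FAILED;
        boolean isTerminated() { return this != RUNNING; }
    }

    record Execution(State state) {}

    static void check(Object message, int size, int limit) {
        if (size < limit) {
            return;
        }
        // terminated executions bypass the size limit so the flow can finish
        if (message instanceof Execution execution && execution.state().isTerminated()) {
            return;
        }
        throw new IllegalStateException(
            "Message of size " + size + " has exceeded the configured limit of " + limit);
    }
}
```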


@@ -31,10 +31,10 @@ import java.util.function.BiConsumer;
public class JdbcScheduler extends AbstractScheduler {
private final QueueInterface<Execution> executionQueue;
private final TriggerRepositoryInterface triggerRepository;
private final ConditionService conditionService;
private final FlowRepositoryInterface flowRepository;
private final JooqDSLContextWrapper dslContextWrapper;
private final ConditionService conditionService;
@SuppressWarnings("unchecked")
@@ -48,6 +48,7 @@ public class JdbcScheduler extends AbstractScheduler {
executionQueue = applicationContext.getBean(QueueInterface.class, Qualifiers.byName(QueueFactoryInterface.EXECUTION_NAMED));
triggerRepository = applicationContext.getBean(AbstractJdbcTriggerRepository.class);
triggerState = applicationContext.getBean(SchedulerTriggerStateInterface.class);
executionState = applicationContext.getBean(SchedulerExecutionState.class);
conditionService = applicationContext.getBean(ConditionService.class);
flowRepository = applicationContext.getBean(FlowRepositoryInterface.class);
dslContextWrapper = applicationContext.getBean(JooqDSLContextWrapper.class);
@@ -75,14 +76,6 @@ public class JdbcScheduler extends AbstractScheduler {
.ifPresent(trigger -> {
this.triggerState.update(trigger.resetExecution(execution.getState().getCurrent()));
});
} else {
// update execution state on each state change so the scheduler knows the execution is running
triggerRepository
.findByExecution(execution)
.filter(trigger -> execution.getState().getCurrent() != trigger.getExecutionCurrentState())
.ifPresent(trigger -> {
((JdbcSchedulerTriggerState) this.triggerState).updateExecution(Trigger.of(execution, trigger));
});
}
}
}
@@ -105,7 +98,7 @@ public class JdbcScheduler extends AbstractScheduler {
public void handleNext(List<Flow> flows, ZonedDateTime now, BiConsumer<List<Trigger>, ScheduleContextInterface> consumer) {
JdbcSchedulerContext schedulerContext = new JdbcSchedulerContext(this.dslContextWrapper);
schedulerContext.startTransaction(scheduleContextInterface -> {
schedulerContext.doInTransaction(scheduleContextInterface -> {
List<Trigger> triggers = this.triggerState.findByNextExecutionDateReadyForAllTenants(now, scheduleContextInterface);
consumer.accept(triggers, scheduleContextInterface);


@@ -18,17 +18,14 @@ public class JdbcSchedulerContext implements ScheduleContextInterface {
this.dslContextWrapper = dslContextWrapper;
}
public void startTransaction(Consumer<ScheduleContextInterface> consumer) {
@Override
public void doInTransaction(Consumer<ScheduleContextInterface> consumer) {
this.dslContextWrapper.transaction(configuration -> {
this.context = DSL.using(configuration);
consumer.accept(this);
this.commit();
this.context.commit();
});
}
public void commit() {
this.context.commit();
}
}


@@ -54,6 +54,18 @@ public class JdbcSchedulerTriggerState implements SchedulerTriggerStateInterface
return trigger;
}
@Override
public Trigger create(Trigger trigger, String headerContent) {
return this.triggerRepository.create(trigger);
}
@Override
public Trigger save(Trigger trigger, ScheduleContextInterface scheduleContextInterface, String headerContent) {
this.triggerRepository.save(trigger, scheduleContextInterface);
return trigger;
}
@Override
public Trigger create(Trigger trigger) {
@@ -84,7 +96,4 @@ public class JdbcSchedulerTriggerState implements SchedulerTriggerStateInterface
public List<Trigger> findByNextExecutionDateReadyForGivenFlows(List<Flow> flows, ZonedDateTime now, ScheduleContextInterface scheduleContext) {
throw new NotImplementedException();
}
@Override
public void unlock(Trigger trigger) {}
}


@@ -119,6 +119,7 @@ dependencies {
api "org.junit-pioneer:junit-pioneer:2.2.0"
api 'org.hamcrest:hamcrest:3.0'
api 'org.hamcrest:hamcrest-library:3.0'
api 'org.assertj:assertj-core:3.27.3'
api group: 'org.exparity', name: 'hamcrest-date', version: '2.0.8'
api 'com.github.tomakehurst:wiremock-jre8:3.0.1'
api "org.apache.kafka:kafka-streams-test-utils:$kafkaVersion"


@@ -9,6 +9,8 @@ import com.github.dockerjava.api.model.*;
import com.github.dockerjava.core.DefaultDockerClientConfig;
import com.github.dockerjava.core.DockerClientConfig;
import com.github.dockerjava.core.NameParser;
import com.github.dockerjava.transport.DomainSocket;
import com.sun.jna.LastErrorException;
import io.kestra.core.exceptions.IllegalVariableEvaluationException;
import io.kestra.core.models.annotations.Example;
import io.kestra.core.models.annotations.Plugin;
@@ -27,7 +29,6 @@ import jakarta.validation.constraints.NotNull;
import lombok.*;
import lombok.experimental.SuperBuilder;
import org.apache.commons.compress.archivers.ArchiveEntry;
import org.apache.commons.compress.archivers.ArchiveOutputStream;
import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
import org.apache.commons.compress.archivers.tar.TarArchiveInputStream;
import org.apache.commons.compress.archivers.tar.TarArchiveOutputStream;
@@ -39,8 +40,8 @@ import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.Socket;
import java.nio.file.Files;
import java.nio.file.LinkOption;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.time.Duration;
@@ -331,7 +332,8 @@ public class Docker extends TaskRunner {
String image = runContext.render(this.image, additionalVars);
try (DockerClient dockerClient = dockerClient(runContext, image)) {
String resolvedHost = DockerService.findHost(runContext, this.host);
try (DockerClient dockerClient = dockerClient(runContext, image, resolvedHost)) {
// pull image
if (this.getPullPolicy() != PullPolicy.NEVER) {
pullImage(dockerClient, image, this.getPullPolicy(), logger);
@@ -530,6 +532,21 @@ public class Docker extends TaskRunner {
}
}
} catch (RuntimeException e) {
try {
if (e.getCause() instanceof IOException io &&
io.getCause() instanceof LastErrorException socketException &&
socketException.getMessage().contains("No such file or directory") &&
Socket.class.isAssignableFrom(Class.forName(io.getStackTrace()[0].getClassName()))) {
throw new IllegalStateException("Docker socket is not accessible or not found. " +
"Please make sure you properly mounted the Docker socket into your Kestra container (`-v /var/run/docker.sock:/var/run/docker.sock`) and that your user or group has at least the read and write privilege. " +
"Tried socket: " + resolvedHost, e);
}
} catch (ClassNotFoundException ignored) {
// If we can't load the stack-trace class to check whether it is a Socket, skip the check and rethrow the original exception
throw e;
}
throw e;
}
}
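The catch block above recognizes a missing Docker socket by walking the exception's cause chain: the client surfaces it as an `IOException` wrapping a native "No such file or directory" error. A simplified, stdlib-only sketch of that cause-chain inspection (it drops the JNA and stack-trace checks from the real code):

```java
public class SocketErrorHint {
    // returns true when the failure looks like a missing/unreachable unix socket,
    // so a clearer "mount /var/run/docker.sock" hint can be raised instead
    static boolean looksLikeMissingSocket(RuntimeException e) {
        Throwable cause = e.getCause();
        return cause instanceof java.io.IOException io
            && io.getCause() != null
            && io.getCause().getMessage() != null
            && io.getCause().getMessage().contains("No such file or directory");
    }
}
```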
@@ -562,9 +579,9 @@ public class Docker extends TaskRunner {
return vars;
}
private DockerClient dockerClient(RunContext runContext, String image) throws IOException, IllegalVariableEvaluationException {
private DockerClient dockerClient(RunContext runContext, String image, String host) throws IOException, IllegalVariableEvaluationException {
DefaultDockerClientConfig.Builder dockerClientConfigBuilder = DefaultDockerClientConfig.createDefaultConfigBuilder()
.withDockerHost(DockerService.findHost(runContext, this.host));
.withDockerHost(host);
if (this.getConfig() != null || this.getCredentials() != null) {
Path config = DockerService.createConfig(


@@ -7,6 +7,6 @@ import io.kestra.core.models.tasks.runners.TaskRunner;
class DockerTest extends AbstractTaskRunnerTest {
@Override
protected TaskRunner taskRunner() {
return Docker.builder().image("centos").build();
return Docker.builder().image("rockylinux:9.3-minimal").build();
}
}


@@ -161,7 +161,7 @@ public class LocalStorage implements StorageInterface {
}
}
return URI.create("kestra://" + uri.getPath());
return URI.create("kestra://" + uri.getRawPath());
}
@Override
@@ -237,7 +237,7 @@ public class LocalStorage implements StorageInterface {
Path prefix = (tenantId == null) ?
basePath.toAbsolutePath() :
Path.of(basePath.toAbsolutePath().toString(), tenantId);
return URI.create("kestra:///" + prefix.relativize(path));
return URI.create("kestra:///" + prefix.relativize(path).toString().replace("\\", "/"));
}
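The `.replace("\\", "/")` fix above exists because `Path.relativize` uses the platform separator, so on Windows the relative path contains backslashes, which are not valid in a URI path. A minimal sketch of the normalization (the helper name is illustrative; the real code relativizes against the storage base path first):

```java
public class StoragePathUri {
    // converts a platform-relative path into a kestra:// URI;
    // backslashes from Windows Path.relativize become forward slashes
    static java.net.URI toKestraUri(String relativePath) {
        return java.net.URI.create("kestra:///" + relativePath.replace("\\", "/"));
    }
}
```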
private void parentTraversalGuard(URI uri) {


@@ -10,7 +10,8 @@
"preview": "vite preview",
"test:unit": "vitest run",
"test:lint": "eslint . --ext .vue,.js,.jsx,.cjs,.mjs",
"lint": "eslint . --ext .vue,.js,.jsx,.cjs,.mjs --fix"
"lint": "eslint . --ext .vue,.js,.jsx,.cjs,.mjs --fix",
"translations:check": "node ./src/translations/check.js"
},
"dependencies": {
"@js-joda/core": "^5.6.3",


@@ -0,0 +1,53 @@
<svg width="169" height="146" viewBox="0 0 169 146" fill="none" xmlns="http://www.w3.org/2000/svg">
<path opacity="0.4" d="M129.725 83.5475C123.348 107.696 98.6012 122.103 74.4526 115.725C50.3039 109.348 35.8975 84.6014 42.2749 60.4528C48.6524 36.3041 73.3987 21.8977 97.5473 28.2752C121.696 34.6526 136.102 59.3989 129.725 83.5475Z" fill="#1C1E27" stroke="#E93ED1" stroke-linejoin="round"/>
<g filter="url(#filter0_d_3247_30504)">
<path d="M127.096 42.8848C130.859 48.1869 133.626 54.2556 135.113 60.8241L134.684 60.9214C135.393 64.0538 135.809 67.3012 135.9 70.6344C135.991 73.9675 135.754 77.2329 135.217 80.3994L135.651 80.473C134.525 87.1131 132.095 93.324 128.627 98.8241L128.254 98.5893C124.76 104.131 120.203 108.944 114.861 112.737L115.116 113.096C109.814 116.859 103.745 119.626 97.1762 121.113L97.079 120.684C93.9466 121.393 90.6991 121.809 87.366 121.9C84.0328 121.991 80.7675 121.754 77.601 121.217L77.5274 121.651C70.8873 120.525 64.6763 118.095 59.1763 114.627L59.4111 114.254C53.8696 110.76 49.056 106.203 45.2638 100.861L44.9048 101.116C41.1411 95.8135 38.3747 89.7448 36.8874 83.1762L37.3167 83.079C36.6075 79.9466 36.1916 76.6991 36.1004 73.366C36.0091 70.0328 36.2468 66.7675 36.7836 63.6009L36.3496 63.5273C37.4753 56.8873 39.9057 50.6763 43.3737 45.1763L43.7461 45.4111C47.2404 39.8695 51.7975 35.0559 57.1396 31.2638L56.8848 30.9048C62.1869 27.141 68.2556 24.3746 74.8242 22.8873L74.9214 23.3167C78.0538 22.6074 81.3013 22.1916 84.6344 22.1003C87.9676 22.0091 91.2329 22.2467 94.3994 22.7836L94.473 22.3495C101.113 23.4753 107.324 25.9056 112.824 29.3737L112.589 29.7461C118.131 33.2404 122.944 37.7975 126.737 43.1396L127.096 42.8848Z" stroke="#9470FF" stroke-width="0.880475" stroke-linejoin="round" stroke-dasharray="21.13 21.13" shape-rendering="crispEdges"/>
</g>
<line x1="165.701" y1="72.5" x2="141.883" y2="72.5" stroke="#FD7278" stroke-dasharray="2 2"/>
<line x1="42.6736" y1="36.2307" x2="26.1508" y2="19.0765" stroke="#3991FF" stroke-dasharray="2 2"/>
<line y1="-0.5" x2="23.8174" y2="-0.5" transform="matrix(0.73486 -0.678218 -0.678218 -0.73486 130.917 35.3833)" stroke="#3991FF" stroke-dasharray="2 2"/>
<line x1="132.256" y1="118.383" x2="148.779" y2="135.537" stroke="#3991FF" stroke-dasharray="2 2"/>
<line y1="-0.5" x2="23.8174" y2="-0.5" transform="matrix(-0.73486 0.678218 0.678218 0.73486 44.0134 119.23)" stroke="#3991FF" stroke-dasharray="2 2"/>
<g filter="url(#filter1_dii_3247_30504)">
<path d="M74.9999 70.625C80.0599 70.625 84.1666 66.425 84.1666 61.25C84.1666 56.075 80.0599 51.875 74.9999 51.875C69.9399 51.875 65.8333 56.075 65.8333 61.25C65.8333 66.425 69.9399 70.625 74.9999 70.625Z" fill="#ED3ED5"/>
<path d="M102.5 76.25H93.3333C91.3166 76.25 89.6666 77.9375 89.6666 80V89.375C89.6666 91.4375 91.3166 93.125 93.3333 93.125H102.5C104.517 93.125 106.167 91.4375 106.167 89.375V80C106.167 77.9375 104.517 76.25 102.5 76.25Z" fill="#ED3ED5"/>
<path d="M96.4683 64.4375C97.2016 64.7937 97.9899 65 98.8333 65C101.858 65 104.333 62.4687 104.333 59.375C104.333 56.2813 101.858 53.75 98.8333 53.75C95.8083 53.75 93.3333 56.2813 93.3333 59.375C93.3333 60.2375 93.5349 61.0437 93.8833 61.7937L75.5316 80.5625C74.7983 80.2062 74.0099 80 73.1666 80C70.1416 80 67.6666 82.5313 67.6666 85.625C67.6666 88.7187 70.1416 91.25 73.1666 91.25C76.1916 91.25 78.6666 88.7187 78.6666 85.625C78.6666 84.7625 78.4649 83.9562 78.1166 83.2062L96.4683 64.4375Z" fill="#ED3ED5"/>
</g>
<line x1="86.5" y1="126.021" x2="86.5" y2="134.802" stroke="#FD7278" stroke-dasharray="2 2"/>
<line x1="86.5" y1="7.27393" x2="86.5" y2="16.0542" stroke="#FD7278" stroke-dasharray="2 2"/>
<line x1="30.1165" y1="72.5" x2="6.29907" y2="72.5" stroke="#FD7278" stroke-dasharray="2 2"/>
<defs>
<filter id="filter0_d_3247_30504" x="32.9997" y="21.6411" width="106.001" height="106.001" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB">
<feFlood flood-opacity="0" result="BackgroundImageFix"/>
<feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/>
<feOffset dy="2.64143"/>
<feGaussianBlur stdDeviation="1.32071"/>
<feComposite in2="hardAlpha" operator="out"/>
<feColorMatrix type="matrix" values="0 0 0 0 0.432266 0 0 0 0 0.00354165 0 0 0 0 0.846458 0 0 0 1 0"/>
<feBlend mode="screen" in2="BackgroundImageFix" result="effect1_dropShadow_3247_30504"/>
<feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_3247_30504" result="shape"/>
</filter>
<filter id="filter1_dii_3247_30504" x="50.8333" y="36.875" width="70.3333" height="71.25" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB">
<feFlood flood-opacity="0" result="BackgroundImageFix"/>
<feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/>
<feOffset/>
<feGaussianBlur stdDeviation="7.5"/>
<feComposite in2="hardAlpha" operator="out"/>
<feColorMatrix type="matrix" values="0 0 0 0 0.950882 0 0 0 0 0.165557 0 0 0 0 0.859261 0 0 0 0.62 0"/>
<feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_3247_30504"/>
<feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_3247_30504" result="shape"/>
<feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/>
<feOffset dy="4"/>
<feGaussianBlur stdDeviation="4"/>
<feComposite in2="hardAlpha" operator="arithmetic" k2="-1" k3="1"/>
<feColorMatrix type="matrix" values="0 0 0 0 0.108171 0 0 0 0 0.108171 0 0 0 0 0.108171 0 0 0 0.35 0"/>
<feBlend mode="normal" in2="shape" result="effect2_innerShadow_3247_30504"/>
<feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/>
<feOffset dy="2"/>
<feGaussianBlur stdDeviation="3"/>
<feComposite in2="hardAlpha" operator="arithmetic" k2="-1" k3="1"/>
<feColorMatrix type="matrix" values="0 0 0 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0.45 0"/>
<feBlend mode="plus-lighter" in2="effect2_innerShadow_3247_30504" result="effect3_innerShadow_3247_30504"/>
</filter>
</defs>
</svg>


@@ -0,0 +1,53 @@
<svg width="169" height="146" viewBox="0 0 169 146" fill="none" xmlns="http://www.w3.org/2000/svg">
<path opacity="0.4" d="M129.725 83.5475C123.348 107.696 98.6012 122.103 74.4526 115.725C50.3039 109.348 35.8975 84.6014 42.2749 60.4528C48.6524 36.3041 73.3987 21.8977 97.5473 28.2752C121.696 34.6526 136.102 59.3989 129.725 83.5475Z" fill="#D1CFE9" stroke="#E93ED1" stroke-linejoin="round"/>
<g filter="url(#filter0_d_3247_30783)">
<path d="M127.096 42.8848C130.859 48.1869 133.626 54.2556 135.113 60.8241L134.684 60.9214C135.393 64.0538 135.809 67.3012 135.9 70.6344C135.991 73.9675 135.754 77.2329 135.217 80.3994L135.651 80.473C134.525 87.1131 132.095 93.324 128.627 98.8241L128.254 98.5893C124.76 104.131 120.203 108.944 114.861 112.737L115.116 113.096C109.814 116.859 103.745 119.626 97.1762 121.113L97.079 120.684C93.9466 121.393 90.6991 121.809 87.366 121.9C84.0328 121.991 80.7675 121.754 77.601 121.217L77.5274 121.651C70.8873 120.525 64.6763 118.095 59.1763 114.627L59.4111 114.254C53.8696 110.76 49.056 106.203 45.2638 100.861L44.9048 101.116C41.1411 95.8135 38.3747 89.7448 36.8874 83.1762L37.3167 83.079C36.6075 79.9466 36.1916 76.6991 36.1004 73.366C36.0091 70.0328 36.2468 66.7675 36.7836 63.6009L36.3496 63.5273C37.4753 56.8873 39.9057 50.6763 43.3737 45.1763L43.7461 45.4111C47.2404 39.8695 51.7975 35.0559 57.1396 31.2638L56.8848 30.9048C62.1869 27.141 68.2556 24.3746 74.8242 22.8873L74.9214 23.3167C78.0538 22.6074 81.3013 22.1916 84.6344 22.1003C87.9676 22.0091 91.2329 22.2467 94.3994 22.7836L94.473 22.3495C101.113 23.4753 107.324 25.9056 112.824 29.3737L112.589 29.7461C118.131 33.2404 122.944 37.7975 126.737 43.1396L127.096 42.8848Z" stroke="#9470FF" stroke-width="0.880475" stroke-linejoin="round" stroke-dasharray="21.13 21.13" shape-rendering="crispEdges"/>
</g>
<line x1="165.701" y1="72.5" x2="141.883" y2="72.5" stroke="#FD7278" stroke-dasharray="2 2"/>
<line x1="42.6736" y1="36.2307" x2="26.1508" y2="19.0765" stroke="#3991FF" stroke-dasharray="2 2"/>
<line y1="-0.5" x2="23.8174" y2="-0.5" transform="matrix(0.73486 -0.678218 -0.678218 -0.73486 130.917 35.3833)" stroke="#3991FF" stroke-dasharray="2 2"/>
<line x1="132.256" y1="118.383" x2="148.779" y2="135.537" stroke="#3991FF" stroke-dasharray="2 2"/>
<line y1="-0.5" x2="23.8174" y2="-0.5" transform="matrix(-0.73486 0.678218 0.678218 0.73486 44.0134 119.23)" stroke="#3991FF" stroke-dasharray="2 2"/>
<g filter="url(#filter1_dii_3247_30783)">
<path d="M74.9999 70.625C80.0599 70.625 84.1666 66.425 84.1666 61.25C84.1666 56.075 80.0599 51.875 74.9999 51.875C69.9399 51.875 65.8333 56.075 65.8333 61.25C65.8333 66.425 69.9399 70.625 74.9999 70.625Z" fill="#ED3ED5"/>
<path d="M102.5 76.25H93.3333C91.3166 76.25 89.6666 77.9375 89.6666 80V89.375C89.6666 91.4375 91.3166 93.125 93.3333 93.125H102.5C104.517 93.125 106.167 91.4375 106.167 89.375V80C106.167 77.9375 104.517 76.25 102.5 76.25Z" fill="#ED3ED5"/>
<path d="M96.4683 64.4375C97.2016 64.7937 97.9899 65 98.8333 65C101.858 65 104.333 62.4687 104.333 59.375C104.333 56.2813 101.858 53.75 98.8333 53.75C95.8083 53.75 93.3333 56.2813 93.3333 59.375C93.3333 60.2375 93.5349 61.0437 93.8833 61.7937L75.5316 80.5625C74.7983 80.2062 74.0099 80 73.1666 80C70.1416 80 67.6666 82.5313 67.6666 85.625C67.6666 88.7187 70.1416 91.25 73.1666 91.25C76.1916 91.25 78.6666 88.7187 78.6666 85.625C78.6666 84.7625 78.4649 83.9562 78.1166 83.2062L96.4683 64.4375Z" fill="#ED3ED5"/>
</g>
<line x1="86.5" y1="126.021" x2="86.5" y2="134.802" stroke="#FD7278" stroke-dasharray="2 2"/>
<line x1="86.5" y1="7.27393" x2="86.5" y2="16.0542" stroke="#FD7278" stroke-dasharray="2 2"/>
<line x1="30.1165" y1="72.5" x2="6.29907" y2="72.5" stroke="#FD7278" stroke-dasharray="2 2"/>
<defs>
<filter id="filter0_d_3247_30783" x="32.9997" y="21.6411" width="106.001" height="106.001" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB">
<feFlood flood-opacity="0" result="BackgroundImageFix"/>
<feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/>
<feOffset dy="2.64143"/>
<feGaussianBlur stdDeviation="1.32071"/>
<feComposite in2="hardAlpha" operator="out"/>
<feColorMatrix type="matrix" values="0 0 0 0 0.432266 0 0 0 0 0.00354165 0 0 0 0 0.846458 0 0 0 1 0"/>
<feBlend mode="screen" in2="BackgroundImageFix" result="effect1_dropShadow_3247_30783"/>
<feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_3247_30783" result="shape"/>
</filter>
<filter id="filter1_dii_3247_30783" x="50.8333" y="36.875" width="70.3333" height="71.25" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB">
<feFlood flood-opacity="0" result="BackgroundImageFix"/>
<feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/>
<feOffset/>
<feGaussianBlur stdDeviation="7.5"/>
<feComposite in2="hardAlpha" operator="out"/>
<feColorMatrix type="matrix" values="0 0 0 0 0.950882 0 0 0 0 0.165557 0 0 0 0 0.859261 0 0 0 0.05 0"/>
<feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_3247_30783"/>
<feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_3247_30783" result="shape"/>
<feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/>
<feOffset dy="4"/>
<feGaussianBlur stdDeviation="4"/>
<feComposite in2="hardAlpha" operator="arithmetic" k2="-1" k3="1"/>
<feColorMatrix type="matrix" values="0 0 0 0 0.108171 0 0 0 0 0.108171 0 0 0 0 0.108171 0 0 0 0.35 0"/>
<feBlend mode="normal" in2="shape" result="effect2_innerShadow_3247_30783"/>
<feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/>
<feOffset dy="2"/>
<feGaussianBlur stdDeviation="3"/>
<feComposite in2="hardAlpha" operator="arithmetic" k2="-1" k3="1"/>
<feColorMatrix type="matrix" values="0 0 0 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0.45 0"/>
<feBlend mode="plus-lighter" in2="effect2_innerShadow_3247_30783" result="effect3_innerShadow_3247_30783"/>
</filter>
</defs>
</svg>


@@ -10,10 +10,15 @@
>
<template #label>
<component :is="embedActiveTab || tab.disabled || tab.locked ? 'a' : 'router-link'" @click="embeddedTabChange(tab)" :to="embedActiveTab ? undefined : to(tab)" :data-test-id="tab.name">
<enterprise-tooltip :disabled="tab.locked" :term="tab.name" content="tabs">
{{ tab.title }}
<el-badge :type="tab.count > 0 ? 'danger' : 'primary'" :value="tab.count" v-if="tab.count !== undefined" />
</enterprise-tooltip>
<el-tooltip v-if="tab.disabled && tab.props && tab.props.showTooltip" :content="$t('add-trigger-in-editor')" placement="top">
<span><strong>{{ tab.title }}</strong></span>
</el-tooltip>
<span v-if="!tab.hideTitle">
<enterprise-tooltip :disabled="tab.locked" :term="tab.name" content="tabs">
{{ tab.title }}
<el-badge :type="tab.count > 0 ? 'danger' : 'primary'" :value="tab.count" v-if="tab.count !== undefined" />
</enterprise-tooltip>
</span>
</component>
</template>
</el-tab-pane>
@@ -133,6 +138,7 @@
},
to(tab) {
if (this.activeTab === tab) {
this.setActiveName()
return this.$route;
} else {
return {
@@ -224,4 +230,3 @@
flex-direction: column;
}
</style>


@@ -80,12 +80,14 @@
</bulk-select>
</template>
<el-table-column
v-if="visibleColumns.triggerId"
prop="triggerId"
sortable="custom"
:sort-orders="['ascending', 'descending']"
:label="$t('id')"
/>
<el-table-column
v-if="visibleColumns.flowId"
prop="flowId"
sortable="custom"
:sort-orders="['ascending', 'descending']"
@@ -105,6 +107,7 @@
</template>
</el-table-column>
<el-table-column
v-if="visibleColumns.namespace"
prop="namespace"
sortable="custom"
:sort-orders="['ascending', 'descending']"
@@ -115,7 +118,7 @@
</template>
</el-table-column>
<el-table-column :label="$t('current execution')">
<el-table-column v-if="visibleColumns.executionId" :label="$t('current execution')">
<template #default="scope">
<router-link
v-if="scope.row.executionId"
@@ -125,17 +128,7 @@
</router-link>
</template>
</el-table-column>
<el-table-column :label="$t('state')">
<template #default="scope">
<status
v-if="scope.row.executionCurrentState"
:status="scope.row.executionCurrentState"
size="small"
/>
</template>
</el-table-column>
<el-table-column prop="workerId" :label="$t('workerId')">
<el-table-column v-if="visibleColumns.workerId" prop="workerId" :label="$t('workerId')">
<template #default="scope">
<id
:value="scope.row.workerId"
@@ -143,22 +136,22 @@
/>
</template>
</el-table-column>
<el-table-column :label="$t('date')">
<el-table-column v-if="visibleColumns.date" :label="$t('date')">
<template #default="scope">
<date-ago :inverted="true" :date="scope.row.date" />
</template>
</el-table-column>
<el-table-column :label="$t('updated date')">
<el-table-column v-if="visibleColumns.updatedDate" :label="$t('updated date')">
<template #default="scope">
<date-ago :inverted="true" :date="scope.row.updatedDate" />
</template>
</el-table-column>
<el-table-column :label="$t('next execution date')">
<el-table-column v-if="visibleColumns.nextExecutionDate" :label="$t('next execution date')">
<template #default="scope">
<date-ago :inverted="true" :date="scope.row.nextExecutionDate" />
</template>
</el-table-column>
<el-table-column :label="$t('evaluation lock date')">
<el-table-column v-if="visibleColumns.evaluateRunningDate" :label="$t('evaluation lock date')">
<template #default="scope">
<date-ago :inverted="true" :date="scope.row.evaluateRunningDate" />
</template>
@@ -270,7 +263,6 @@
import RefreshButton from "../layout/RefreshButton.vue";
import DateAgo from "../layout/DateAgo.vue";
import Id from "../Id.vue";
import Status from "../Status.vue";
import {mapState} from "vuex";
import SelectTableActions from "../../mixins/selectTableActions";
import _merge from "lodash/merge";
@@ -285,7 +277,6 @@
SearchField,
NamespaceSelect,
DateAgo,
Status,
Id,
LogsWrapper
},
@@ -475,6 +466,25 @@
const disabled = this.state === "DISABLED" ? true : false;
return all.filter(trigger => trigger.disabled === disabled);
},
visibleColumns() {
const columns = [
{prop: "triggerId", label: this.$t("id")},
{prop: "flowId", label: this.$t("flow")},
{prop: "namespace", label: this.$t("namespace")},
{prop: "executionId", label: this.$t("current execution")},
{prop: "executionCurrentState", label: this.$t("state")},
{prop: "workerId", label: this.$t("workerId")},
{prop: "date", label: this.$t("date")},
{prop: "updatedDate", label: this.$t("updated date")},
{prop: "nextExecutionDate", label: this.$t("next execution date")},
{prop: "evaluateRunningDate", label: this.$t("evaluation lock date")},
];
return columns.reduce((acc, column) => {
acc[column.prop] = this.triggersMerged.some(trigger => trigger[column.prop]);
return acc;
}, {});
}
}
};


@@ -6,7 +6,7 @@
width="222.67px"
height="125px"
loading="lazy"
:src="$store.getters['doc/resourceUrl']('/v1/docs/tutorial/logos/logo-dark-version.png')"
:src="$store.getters['doc/resourceUrl']('/docs/tutorial/logos/logo-dark-version.png')"
alt="Dark version logo"
>
<p class="title">
@@ -23,7 +23,7 @@
width="222.67px"
height="125px"
loading="lazy"
:src="$store.getters['doc/resourceUrl']('/v1/docs/tutorial/logos/logo-light-version.png')"
:src="$store.getters['doc/resourceUrl']('/docs/tutorial/logos/logo-light-version.png')"
alt="Light version logo"
>
<p class="title">
@@ -40,7 +40,7 @@
width="222.67px"
height="125px"
loading="lazy"
:src="$store.getters['doc/resourceUrl']('/v1/docs/tutorial/logos/logo-monogram-version.png')"
:src="$store.getters['doc/resourceUrl']('/docs/tutorial/logos/logo-monogram-version.png')"
alt="Monogram version logo"
>
<p class="title">


@@ -48,7 +48,7 @@
<el-col :xs="24" :sm="8" :lg="4">
<refresh-button
class="float-right"
@refresh="fetchAll()"
@refresh="refresh()"
:can-auto-refresh="canAutoRefresh"
/>
</el-col>
@@ -61,6 +61,7 @@
<Card
:icon="CheckBold"
:label="t('dashboard.success_ratio')"
:tooltip="t('dashboard.success_ratio_tooltip')"
:value="stats.success"
:redirect="{
name: 'executions/list',
@@ -77,6 +78,7 @@
<Card
:icon="Alert"
:label="t('dashboard.failure_ratio')"
:tooltip="t('dashboard.failure_ratio_tooltip')"
:value="stats.failed"
:redirect="{
name: 'executions/list',
@@ -140,7 +142,10 @@
v-model="descriptionDialog"
:title="$t('description')"
>
<Markdown :source="description" class="p-4 description" />
<Markdown
:source="description"
class="p-4 description"
/>
</el-dialog>
</span>
@@ -197,7 +202,6 @@
import {useI18n} from "vue-i18n";
import moment from "moment";
import _cloneDeep from "lodash/cloneDeep";
import {apiUrl} from "override/utils/route";
import State from "../../utils/state";
@@ -228,6 +232,7 @@
import BookOpenOutline from "vue-material-design-icons/BookOpenOutline.vue";
import permission from "../../models/permission.js";
import action from "../../models/action.js";
import {storageKeys} from "../../utils/constants";
const router = useRouter();
const route = useRoute();
@@ -235,6 +240,7 @@
const {t} = useI18n({useScope: "global"});
const user = store.getters["auth/user"];
const defaultNamespace = localStorage.getItem(storageKeys.DEFAULT_NAMESPACE) || null;
const props = defineProps({
embed: {
type: Boolean,
@@ -254,6 +260,10 @@
required: false,
default: null,
},
restoreURL:{
type: Boolean,
default: true,
}
});
const descriptionDialog = ref(false);
@@ -271,6 +281,13 @@
scope: ["USER"],
});
const refresh = async () => {
await updateParams({
startDate: filters.value.startDate,
endDate: moment().toISOString(true),
});
fetchAll();
};
const canAutoRefresh = ref(false);
const toggleAutoRefresh = (event) => {
canAutoRefresh.value = event;
@@ -290,29 +307,45 @@
const executions = ref({raw: {}, all: {}, yesterday: {}, today: {}});
const stats = computed(() => {
const counts = executions?.value?.all?.executionCounts || {};
const total = Object.values(counts).reduce((sum, count) => sum + count, 0);
const terminatedStates = State.getTerminatedStates();
const statesToCount = Object.fromEntries(
Object.entries(counts).filter(([key]) =>
terminatedStates.includes(key),
),
);
function percentage(count, total) {
return total ? ((count / total) * 100).toFixed(2) : "0.00";
}
const total = Object.values(statesToCount).reduce(
(sum, count) => sum + count,
0,
);
const successStates = ["SUCCESS", "CANCELLED", "WARNING"];
const failedStates = ["FAILED", "KILLED", "RETRIED"];
const sumStates = (states) =>
states.reduce((sum, state) => sum + (statesToCount[state] || 0), 0);
const successRatio =
total > 0 ? (sumStates(successStates) / total) * 100 : 0;
const failedRatio = total > 0 ? (sumStates(failedStates) / total) * 100 : 0;
return {
total,
success: `${percentage(counts[State.SUCCESS] || 0, total)}%`,
failed: `${percentage(counts[State.FAILED] || 0, total)}%`,
success: `${successRatio.toFixed(2)}%`,
failed: `${failedRatio.toFixed(2)}%`,
};
});
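A standalone sketch of the new ratio computation in the hunk above. The sample counts are made up, and the hardcoded terminal-state list stands in for `State.getTerminatedStates()`:

```javascript
// Sketch of the reworked success/failure ratios; sample counts are invented,
// and this terminal-state list is an assumption standing in for
// State.getTerminatedStates().
const counts = {SUCCESS: 70, WARNING: 10, FAILED: 15, KILLED: 5, RUNNING: 3};
const terminatedStates = ["SUCCESS", "CANCELLED", "WARNING", "FAILED", "KILLED", "RETRIED"];

// Only terminal states take part in the ratios; RUNNING is filtered out.
const statesToCount = Object.fromEntries(
    Object.entries(counts).filter(([key]) => terminatedStates.includes(key)),
);
const total = Object.values(statesToCount).reduce((sum, count) => sum + count, 0);

const sumStates = (states) =>
    states.reduce((sum, state) => sum + (statesToCount[state] || 0), 0);

const success = (total > 0 ? (sumStates(["SUCCESS", "CANCELLED", "WARNING"]) / total) * 100 : 0).toFixed(2);
const failed = (total > 0 ? (sumStates(["FAILED", "KILLED", "RETRIED"]) / total) * 100 : 0).toFixed(2);
// success → "80.00", failed → "20.00"
```

Unlike the old version, which divided only `SUCCESS` and `FAILED` by the count of all states (including non-terminal ones), the ratios now partition the terminal states into success-like and failure-like groups.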
const transformer = (data) => {
return data.reduce((accumulator, value) => {
if (!accumulator) accumulator = _cloneDeep(value);
else {
for (const key in value.executionCounts) {
accumulator.executionCounts[key] += value.executionCounts[key];
}
accumulator = accumulator || {executionCounts: {}, duration: {}};
for (const key in value.duration) {
accumulator.duration[key] += value.duration[key];
}
for (const key in value.executionCounts) {
accumulator.executionCounts[key] =
(accumulator.executionCounts[key] || 0) +
value.executionCounts[key];
}
for (const key in value.duration) {
accumulator.duration[key] =
(accumulator.duration[key] || 0) + value.duration[key];
}
return accumulator;
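A simplified standalone version of the fixed transformer, showing why the `|| 0` initialization matters when daily buckets have disjoint key sets. The explicit `null` seed is an assumption for the sake of a self-contained example:

```javascript
// Sketch of the fixed transformer: the (accumulator[key] || 0) fallback lets
// buckets with different key sets merge without producing NaN.
// The null seed here is an assumption to make the example self-contained.
const transformer = (data) => data.reduce((accumulator, value) => {
    accumulator = accumulator || {executionCounts: {}, duration: {}};
    for (const key in value.executionCounts) {
        accumulator.executionCounts[key] =
            (accumulator.executionCounts[key] || 0) + value.executionCounts[key];
    }
    for (const key in value.duration) {
        accumulator.duration[key] =
            (accumulator.duration[key] || 0) + value.duration[key];
    }
    return accumulator;
}, null);

const merged = transformer([
    {executionCounts: {SUCCESS: 2}, duration: {avg: 1.5}},
    {executionCounts: {SUCCESS: 1, FAILED: 3}, duration: {avg: 0.5}},
]);
// merged.executionCounts → {SUCCESS: 3, FAILED: 3}; merged.duration.avg → 2
```

With the old `accumulator.executionCounts[key] += value.executionCounts[key]` form, a key present in a later bucket but absent from the first would yield `undefined + n`, i.e. `NaN`.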
@@ -427,7 +460,15 @@
});
onBeforeMount(() => {
filters.value.namespace = route.query.namespace ?? null;
if (!route.query.namespace && props.restoreURL) {
router.replace({query: {...route.query, namespace: defaultNamespace}});
filters.value.namespace = route.query.namespace || defaultNamespace;
}
else {
filters.value.namespace = null
}
updateParams();
});
</script>

View File

@@ -2,7 +2,14 @@
<div class="p-4 card">
<div class="d-flex pb-2 justify-content-between">
<div class="d-flex align-items-center">
<component :is="icon" class="me-2 fs-4 icons" />
<el-tooltip
v-if="tooltip"
:content="tooltip"
popper-class="dashboard-card-tooltip"
>
<component :is="icon" class="me-2 fs-4 icons" />
</el-tooltip>
<component v-else :is="icon" class="me-2 fs-4 icons" />
<p class="m-0 fs-6 label">
{{ label }}
@@ -31,6 +38,10 @@
type: String,
required: true,
},
tooltip: {
type: String,
default: undefined,
},
value: {
type: [String, Number],
required: true,
@@ -63,3 +74,9 @@
}
}
</style>
<style lang="scss">
.dashboard-card-tooltip {
width: 300px;
}
</style>

View File

@@ -1,7 +1,7 @@
<template>
<div class="p-4">
<div class="d-flex flex justify-content-between pb-4">
<div>
<div class="p-4 responsive-container">
<div class="d-flex flex-wrap justify-content-between pb-4 info-container">
<div class="info-block">
<p class="m-0 fs-6">
<span class="fw-bold">{{ t("executions") }}</span>
<span class="fw-light small">
@@ -13,8 +13,8 @@
</p>
</div>
<div>
<div class="d-flex justify-content-end align-items-center">
<div class="switch-container">
<div class="d-flex justify-content-end align-items-center switch-content">
<span class="pe-2 fw-light small">{{ t("duration") }}</span>
<el-switch
v-model="duration"
@@ -35,7 +35,7 @@
</template>
<script setup>
import {computed, ref} from "vue";
import {computed, ref, onMounted, onUnmounted} from "vue";
import {useI18n} from "vue-i18n";
import moment from "moment";
@@ -50,6 +50,7 @@
import Check from "vue-material-design-icons/Check.vue";
const {t} = useI18n({useScope: "global"});
const isSmallScreen = ref(window.innerWidth < 610);
const props = defineProps({
data: {
@@ -106,9 +107,20 @@
};
});
onMounted(() => {
const handleResize = () => {
isSmallScreen.value = window.innerWidth < 610;
};
window.addEventListener("resize", handleResize);
onUnmounted(() => {
window.removeEventListener("resize", handleResize);
});
});
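The hunk above tracks viewport width with a `resize` listener registered in `onMounted` and removed in `onUnmounted`. A sketch of the same pattern, using a stub event target in place of `window` so it can run outside a browser:

```javascript
// Minimal stand-in for window so the resize pattern can run headless;
// makeTarget and dispatch are test scaffolding, not part of the component.
const makeTarget = () => {
    const listeners = {};
    return {
        innerWidth: 800,
        addEventListener: (type, fn) => { (listeners[type] = listeners[type] || []).push(fn); },
        removeEventListener: (type, fn) => { listeners[type] = (listeners[type] || []).filter((f) => f !== fn); },
        dispatch: (type) => (listeners[type] || []).forEach((fn) => fn()),
    };
};

const target = makeTarget();
let isSmallScreen = target.innerWidth < 610; // same 610px breakpoint as the component
const handleResize = () => { isSmallScreen = target.innerWidth < 610; };
target.addEventListener("resize", handleResize);

target.innerWidth = 500;
target.dispatch("resize");
// isSmallScreen is now true; the component later passes it into the chart
// options (barThickness, maxTicksLimit, axis titles).
target.removeEventListener("resize", handleResize);
```

Registering `onUnmounted` inside the `onMounted` callback keeps `handleResize` in scope for cleanup without storing it on the component.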
const options = computed(() =>
defaultConfig({
barThickness: 12,
barThickness: isSmallScreen.value ? 8 : 12,
skipNull: true,
borderSkipped: false,
borderColor: "transparent",
@@ -141,7 +153,7 @@
display: true,
stacked: true,
ticks: {
maxTicksLimit: 8,
maxTicksLimit: isSmallScreen.value ? 5 : 8,
callback: function (value) {
const label = this.getLabelForValue(value);
const date = moment(new Date(label));
@@ -156,7 +168,7 @@
},
y: {
title: {
display: true,
display: !isSmallScreen.value,
text: t("executions"),
},
grid: {
@@ -166,12 +178,12 @@
position: "left",
stacked: true,
ticks: {
maxTicksLimit: 8,
maxTicksLimit: isSmallScreen.value ? 5 : 8,
},
},
yB: {
title: {
display: duration.value,
display: duration.value && !isSmallScreen.value,
text: t("duration"),
},
grid: {
@@ -180,7 +192,7 @@
display: duration.value,
position: "right",
ticks: {
maxTicksLimit: 8,
maxTicksLimit: isSmallScreen.value ? 5 : 8,
callback: function (value) {
return `${this.getLabelForValue(value)}s`;
},
@@ -193,22 +205,65 @@
const duration = ref(true);
</script>
<style lang="scss" scoped>
@import "@kestra-io/ui-libs/src/scss/variables";
$height: 200px;
.tall {
height: $height;
max-height: $height;
height: $height;
max-height: $height;
}
.small {
font-size: $font-size-xs;
color: $gray-700;
font-size: $font-size-xs;
color: $gray-700;
html.dark & {
color: $gray-300;
}
html.dark & {
color: $gray-300;
}
}
</style>
@media (max-width: 610px) {
.responsive-container {
padding: 2px;
}
.info-container {
flex-direction: column;
text-align: center;
}
.info-block {
margin-bottom: 15px;
}
.switch-container {
display: flex;
justify-content: center;
width: 100%;
}
.switch-content {
justify-content: center;
}
.fs-2 {
font-size: 1.5rem;
}
.fs-6 {
font-size: 0.875rem;
}
.small {
font-size: 0.75rem;
}
.pe-2 {
padding-right: 0.5rem;
}
}
</style>

View File

@@ -120,7 +120,7 @@
y: {
title: {
display: true,
text: t("executions"),
text: t("logs"),
},
grid: {
display: false,

View File

@@ -167,5 +167,12 @@ code {
.inprogress {
--el-table-tr-bg-color: var(--bs-body-bg) !important;
background: var(--bs-body-bg);
& a {
color: #8e71f7;
html.dark & {
color: #e0e0fc;
}
}
}
</style>

View File

@@ -7,7 +7,7 @@
<div class="pt-4">
<el-table
:data="executions.results"
class="inprogress"
class="nextscheduled"
:height="240"
>
<el-table-column class-name="next-toggle" width="50">
@@ -28,7 +28,10 @@
v-else
:model-value="!scope.row.disabled"
@change="
toggleState(scope.row.triggerContext);
toggleState(
scope.row.triggerContext,
!scope.row.disabled,
);
scope.row.disabled = !scope.row.disabled;
"
:active-icon="Check"
@@ -194,11 +197,8 @@
() => loadExecutions(),
);
const toggleState = (trigger) => {
store.dispatch("trigger/update", {
...trigger,
disabled: !trigger.disabled,
});
const toggleState = (trigger, disabled) => {
store.dispatch("trigger/update", {...trigger, disabled});
};
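A sketch of the refactored `toggleState`: passing the target `disabled` value explicitly avoids deriving it from a trigger object whose flag may already have been flipped by the UI. The store stub is test scaffolding, not the real Vuex store:

```javascript
// Stub standing in for the Vuex store; dispatch just records its payload.
const dispatched = [];
const store = {dispatch: (_type, payload) => dispatched.push(payload)};

// New signature: the caller states the desired disabled value directly.
const toggleState = (trigger, disabled) => {
    store.dispatch("trigger/update", {...trigger, disabled});
};

toggleState({id: "daily", disabled: false}, true);
// dispatched[0] → {id: "daily", disabled: true}
```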
onBeforeMount(() => {
@@ -211,9 +211,16 @@ code {
color: var(--bs-code-color);
}
.inprogress {
.nextscheduled {
--el-table-tr-bg-color: var(--bs-body-bg) !important;
background: var(--bs-body-bg);
& a {
color: #8e71f7;
html.dark & {
color: #e0e0fc;
}
}
}
.next-toggle {

View File

@@ -4,7 +4,7 @@
:persistent="false"
transition=""
:hide-after="0"
:content="$t('change status tooltip')"
:content="$t('change state tooltip')"
raw-content
:placement="tooltipPosition"
>
@@ -15,7 +15,7 @@
:disabled="!enabled"
class="ms-0 me-1"
>
{{ $t('change status') }}
{{ $t('change state') }}
</component>
</el-tooltip>
@@ -25,7 +25,7 @@
</template>
<template #default>
<p v-html="$t('change execution status confirm', {id: execution.id})" />
<p v-html="$t('change execution state confirm', {id: execution.id})" />
<p>
Current status is : <status size="small" class="me-1" :status="execution.state.current" />
@@ -186,4 +186,4 @@
padding-left: 10px;
}
}
</style>
</style>

View File

@@ -5,7 +5,7 @@
@click="visible = !visible"
:disabled="!enabled"
>
<span v-if="component !== 'el-button'">{{ $t('change status') }}</span>
<span v-if="component !== 'el-button'">{{ $t('change_status') }}</span>
<el-dialog v-if="enabled && visible" v-model="visible" :id="uuid" destroy-on-close :append-to-body="true">
<template #header>
@@ -13,7 +13,7 @@
</template>
<template #default>
<p v-html="$t('change status confirm', {id: execution.id, task: taskRun.taskId})" />
<p v-html="$t('change state confirm', {id: execution.id, task: taskRun.taskId})" />
<p>
Current status is : <status size="small" class="me-1" :status="taskRun.state.current" />

View File

@@ -92,7 +92,10 @@
if (isEnd) {
this.closeSSE();
}
this.throttledExecutionUpdate(executionEvent);
// we are receiving a first "fake" event to force initializing the connection: ignoring it
if (executionEvent.lastEventId !== "start") {
this.throttledExecutionUpdate(executionEvent);
}
if (isEnd) {
this.throttledExecutionUpdate.flush();
}
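A standalone sketch of the filtering added in the hunk above: the server emits a priming event with `lastEventId` set to `"start"` just to open the SSE connection, and only real execution events should reach the throttled updater:

```javascript
// Sketch of the SSE guard; received and the plain function stand in for the
// component's lodash-throttled updater.
const received = [];
const throttledExecutionUpdate = (event) => received.push(event.data);

const onExecutionEvent = (executionEvent) => {
    // First "fake" event forces connection initialization: ignore it.
    if (executionEvent.lastEventId !== "start") {
        throttledExecutionUpdate(executionEvent);
    }
};

onExecutionEvent({lastEventId: "start", data: null});
onExecutionEvent({lastEventId: "1", data: "running"});
// received → ["running"]
```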

View File

@@ -44,8 +44,9 @@
</el-form-item>
<el-form-item v-if="$route.name !== 'flows/update'">
<namespace-select
:value="selectedNamespace"
data-type="flow"
:value="$route.query.namespace"
:disabled="!!namespace"
@update:model-value="onDataTableValue('namespace', $event)"
/>
</el-form-item>
@@ -133,17 +134,10 @@
</el-form-item>
</template>
<template #top v-if="showStatChart()">
<state-global-chart
v-if="daily"
class="mb-4"
:ready="dailyReady"
:data="daily"
:start-date="startDate"
:end-date="endDate"
:namespace="namespace"
:flow-id="flowId"
/>
<template #top>
<el-card v-if="showStatChart()" shadow="never" class="mb-4">
<ExecutionsBar v-if="daily" :data="daily" :total="executionsCount" />
</el-card>
</template>
<template #table>
@@ -456,7 +450,6 @@
import Filters from "../saved-filters/Filters.vue";
import StatusFilterButtons from "../layout/StatusFilterButtons.vue"
import ScopeFilterButtons from "../layout/ScopeFilterButtons.vue"
import StateGlobalChart from "../../components/stats/StateGlobalChart.vue";
import Kicon from "../Kicon.vue"
import Labels from "../layout/Labels.vue"
import RestoreUrl from "../../mixins/restoreUrl";
@@ -471,6 +464,7 @@
import {ElMessageBox, ElSwitch, ElFormItem, ElAlert, ElCheckbox} from "element-plus";
import DateAgo from "../layout/DateAgo.vue";
import {h, ref} from "vue";
import ExecutionsBar from "../../components/dashboard/components/charts/executions/Bar.vue"
import {filterLabels} from "./utils"
@@ -488,13 +482,13 @@
Filters,
StatusFilterButtons,
ScopeFilterButtons,
StateGlobalChart,
Kicon,
Labels,
Id,
TriggerFlow,
TopNavBar,
LabelInput
LabelInput,
ExecutionsBar
},
emits: ["state-count"],
props: {
@@ -614,11 +608,6 @@
selectedStatus: undefined
};
},
beforeCreate(){
if(!this.$route.query.scope) {
this.$route.query.scope = this.namespace === "system" ? ["SYSTEM"] : ["USER"];
}
},
created() {
// allow to have different storage key for flow executions list
if (this.$route.name === "flows/update") {
@@ -694,6 +683,26 @@
};
});
},
executionsCount() {
return [...this.daily].reduce((a, b) => {
return a + Object.values(b.executionCounts).reduce((a, b) => a + b, 0);
}, 0);
},
selectedNamespace(){
return this.namespace !== null && this.namespace !== undefined ? this.namespace : this.$route.query?.namespace;
}
},
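A sketch of the new `executionsCount` computed: a nested reduce summing every state count across all daily buckets (sample data is invented):

```javascript
// Sketch of executionsCount with made-up daily stats.
const daily = [
    {executionCounts: {SUCCESS: 2, FAILED: 1}},
    {executionCounts: {SUCCESS: 5}},
];

const executionsCount = [...daily].reduce((acc, day) =>
    acc + Object.values(day.executionCounts).reduce((sum, count) => sum + count, 0), 0);
// executionsCount → 8
```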
beforeRouteEnter(to, from, next) {
const defaultNamespace = localStorage.getItem(storageKeys.DEFAULT_NAMESPACE);
const query = {...to.query};
if (defaultNamespace) {
query.namespace = defaultNamespace;
} if (!query.scope) {
query.scope = defaultNamespace === "system" ? ["SYSTEM"] : ["USER"];
}
next(vm => {
vm.$router?.replace({query});
});
},
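The `beforeRouteEnter` guard above replaces the old `beforeCreate` hook: it applies the stored default namespace and derives a default scope before the component mounts. A sketch of that query-building logic as a pure function; `buildEntryQuery` is a hypothetical helper name, not part of the codebase:

```javascript
// Hypothetical pure-function extraction of the guard's query defaulting.
const buildEntryQuery = (query, defaultNamespace) => {
    const next = {...query};
    if (defaultNamespace) {
        next.namespace = defaultNamespace;
    }
    if (!next.scope) {
        next.scope = defaultNamespace === "system" ? ["SYSTEM"] : ["USER"];
    }
    return next;
};

const q = buildEntryQuery({page: 1}, "system");
// q → {page: 1, namespace: "system", scope: ["SYSTEM"]}
```

Running this before navigation (and calling `router.replace` in the `next` callback) means the defaults land in the URL itself, instead of mutating `$route.query` in place as the removed `beforeCreate` did.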
methods: {
executionParams(row) {
@@ -857,7 +866,7 @@
);
},
changeStatusToast() {
return this.$t("bulk change execution status", {"executionCount": this.queryBulkAction ? this.total : this.selection.length});
return this.$t("bulk change state", {"executionCount": this.queryBulkAction ? this.total : this.selection.length});
},
deleteExecutions() {
const includeNonTerminated = ref(false);

View File

@@ -68,6 +68,7 @@
:target-execution="execution"
:target-flow="flow"
:show-logs="taskTypeByTaskRunId[item.id] !== 'io.kestra.plugin.core.flow.ForEachItem' && taskTypeByTaskRunId[item.id] !== 'io.kestra.core.tasks.flows.ForEachItem'"
class="mh-100"
/>
</div>
</div>
@@ -109,6 +110,7 @@
this.selectedTaskRuns = [];
this.paint();
}
newValue.state?.current === State.SUCCESS && (this.compute());
},
forEachItemsTaskRunIds: {
handler(newValue, oldValue) {

View File

@@ -26,6 +26,7 @@
:total-count="countByLogLevel[logLevel]"
@previous="previousLogForLevel(logLevel)"
@next="nextLogForLevel(logLevel)"
@close="logCursor = undefined"
/>
</el-form-item>
<el-form-item>
@@ -37,7 +38,7 @@
<el-tooltip
:content="!raw_view ? $t('logs_view.raw_details') : $t('logs_view.compact_details')"
>
<el-button @click="setRawView()">
<el-button @click="toggleViewType">
{{ !raw_view ? $t('logs_view.raw') : $t('logs_view.compact') }}
</el-button>
</el-tooltip>
@@ -70,15 +71,36 @@
:target-flow="flow"
:show-progress-bar="false"
/>
<el-card v-else>
<template v-for="log in logs" :key="`${log.timestamp}-${log.taskRun}`">
<log-line
:level="level"
filter=""
:log="log"
title
/>
</template>
<el-card v-else class="attempt-wrapper">
<DynamicScroller
ref="logScroller"
:items="temporalLogs"
:min-item-size="50"
key-field="index"
class="log-lines"
:buffer="200"
:prerender="20"
>
<template #default="{item, active}">
<DynamicScrollerItem
:item="item"
:active="active"
:size-dependencies="[item.message]"
:data-index="item.index"
>
<log-line
@click="logCursor = item.index.toString()"
class="line"
:class="{['log-bg-' + cursorLogLevel?.toLowerCase()]: cursorLogLevel === item.level, 'opacity-40': cursorLogLevel && cursorLogLevel !== item.level}"
:cursor="item.index.toString() === logCursor"
:level="level"
:filter="filter"
:log="item"
title
/>
</DynamicScrollerItem>
</template>
</DynamicScroller>
</el-card>
</div>
</template>
@@ -91,6 +113,8 @@
import Kicon from "../Kicon.vue";
import LogLevelSelector from "../logs/LogLevelSelector.vue";
import LogLevelNavigator from "../logs/LogLevelNavigator.vue";
import {DynamicScroller, DynamicScrollerItem} from "vue-virtual-scroller";
import "vue-virtual-scroller/dist/vue-virtual-scroller.css"
import Collapse from "../layout/Collapse.vue";
import State from "../../utils/state";
import Utils from "../../utils/utils";
@@ -108,7 +132,9 @@
Download,
Magnify,
Collapse,
Restart
Restart,
DynamicScroller,
DynamicScrollerItem,
},
data() {
return {
@@ -125,10 +151,42 @@
this.level = (this.$route.query.level || localStorage.getItem("defaultLogLevel") || "INFO");
this.filter = (this.$route.query.q || undefined);
},
watch:{
level: {
handler() {
if (this.raw_view) {
this.$store.dispatch("execution/loadLogs", {
executionId: this.execution.id,
minLevel: this.level
})
}
}
},
logCursor(newValue) {
if (newValue !== undefined && this.raw_view) {
this.scrollToLog(newValue);
}
}
},
computed: {
State() {
return State
},
temporalLogs() {
if (!this.logs?.length) {
return [];
}
const filtered = this.logs.filter(log => {
if (!this.filter) return true;
return log.message?.toLowerCase().includes(this.filter.toLowerCase());
});
return filtered.map((logLine, index) => ({
...logLine,
index
}));
},
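A standalone sketch of the `temporalLogs` computed above: a case-insensitive message filter followed by re-indexing, so the virtual scroller gets stable `index` keys over the filtered list (sample logs are invented):

```javascript
// Sketch of temporalLogs with made-up log lines.
const logs = [
    {level: "INFO", message: "Task started"},
    {level: "ERROR", message: "Connection refused"},
    {level: "INFO", message: "Task finished"},
];
const filter = "task";

const temporalLogs = logs
    // Case-insensitive substring match; an empty filter keeps everything.
    .filter((log) => !filter || log.message?.toLowerCase().includes(filter.toLowerCase()))
    // Re-index after filtering so indices are contiguous for the scroller.
    .map((logLine, index) => ({...logLine, index}));
// temporalLogs → two INFO entries with indices 0 and 1
```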
...mapState("execution", ["execution", "logs", "flow"]),
downloadName() {
return `kestra-execution-${this.$moment().format("YYYYMMDDHHmmss")}-${this.execution.id}.log`
@@ -140,13 +198,32 @@
return LogUtils.levelOrLower(this.level);
},
countByLogLevel() {
return Object.fromEntries(Object.entries(this.logIndicesByLevel).map(([level, indices]) => [level, indices.length]));
return Object.fromEntries(Object.entries(this.viewTypeAwareLogIndicesByLevel).map(([level, indices]) => [level, indices.length]));
},
cursorLogLevel() {
return Object.entries(this.logIndicesByLevel).find(([_, indices]) => indices.includes(this.logCursor))?.[0];
return Object.entries(this.viewTypeAwareLogIndicesByLevel).find(([_, indices]) => indices.includes(this.logCursor))?.[0];
},
cursorIdxForLevel() {
return this.logIndicesByLevel?.[this.cursorLogLevel]?.toSorted(this.sortLogsByViewOrder)?.indexOf(this.logCursor);
return this.viewTypeAwareLogIndicesByLevel?.[this.cursorLogLevel]?.toSorted(this.sortLogsByViewOrder)?.indexOf(this.logCursor);
},
temporalViewLogIndicesByLevel() {
const temporalViewLogIndicesByLevel = this.temporalLogs.reduce((acc, item) => {
if (!acc[item.level]) {
acc[item.level] = [];
}
acc[item.level].push(item.index.toString());
return acc;
}, {});
LogUtils.levelOrLower(undefined).forEach(level => {
if (!temporalViewLogIndicesByLevel[level]) {
temporalViewLogIndicesByLevel[level] = [];
}
});
return temporalViewLogIndicesByLevel
},
viewTypeAwareLogIndicesByLevel() {
return this.raw_view ? this.temporalViewLogIndicesByLevel : this.logIndicesByLevel;
}
},
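A sketch of the grouping reduce in `temporalViewLogIndicesByLevel` above, which buckets stringified log indices by level (the `LogUtils.levelOrLower` backfill of empty levels is omitted here; sample items are invented):

```javascript
// Sketch of the index-by-level grouping with made-up items.
const items = [
    {level: "INFO", index: 0},
    {level: "ERROR", index: 1},
    {level: "INFO", index: 2},
];

const indicesByLevel = items.reduce((acc, item) => {
    if (!acc[item.level]) {
        acc[item.level] = [];
    }
    acc[item.level].push(item.index.toString());
    return acc;
}, {});
// indicesByLevel → {INFO: ["0", "2"], ERROR: ["1"]}
```

`viewTypeAwareLogIndicesByLevel` then selects between this temporal grouping and the existing `logIndicesByLevel`, so the level counters and cursor navigation work identically in both raw and compact views.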
methods: {
@@ -172,14 +249,9 @@
expandCollapseAll() {
this.$refs.logs.toggleExpandCollapseAll();
},
setRawView() {
toggleViewType() {
this.logCursor = undefined;
this.raw_view = !this.raw_view;
if(this.raw_view) {
this.$store.dispatch("execution/loadLogs", {
executionId: this.execution.id,
minLevel: this.level
})
}
},
sortLogsByViewOrder(a, b) {
const aSplit = a.split("/");
@@ -199,7 +271,7 @@
return Number.parseInt(taskRunIndexA) - Number.parseInt(taskRunIndexB);
},
previousLogForLevel(level) {
const logIndicesForLevel = this.logIndicesByLevel[level];
const logIndicesForLevel = this.viewTypeAwareLogIndicesByLevel[level];
if (this.logCursor === undefined) {
this.logCursor = logIndicesForLevel?.[logIndicesForLevel.length - 1];
return;
@@ -209,7 +281,7 @@
this.logCursor = sortedIndices?.[sortedIndices.indexOf(this.logCursor) - 1] ?? sortedIndices[sortedIndices.length - 1];
},
nextLogForLevel(level) {
const logIndicesForLevel = this.logIndicesByLevel[level];
const logIndicesForLevel = this.viewTypeAwareLogIndicesByLevel[level];
if (this.logCursor === undefined) {
this.logCursor = logIndicesForLevel?.[0];
return;
@@ -217,7 +289,54 @@
const sortedIndices = [...logIndicesForLevel, this.logCursor].filter(Utils.distinctFilter).sort(this.sortLogsByViewOrder);
this.logCursor = sortedIndices?.[sortedIndices.indexOf(this.logCursor) + 1] ?? sortedIndices[0];
},
scrollToLog(index) {
this.$refs.logScroller.scrollToItem(index);
}
}
};
</script>
<style lang="scss" scoped>
@import "@kestra-io/ui-libs/src/scss/variables";
.attempt-wrapper {
background-color: var(--bs-white);
:deep(.vue-recycle-scroller__item-view + .vue-recycle-scroller__item-view) {
border-top: 1px solid var(--bs-border-color);
}
html.dark & {
background-color: var(--bs-gray-100);
}
.attempt-wrapper & {
border-radius: .25rem;
}
}
.log-lines {
max-height: calc(100vh - 335px);
transition: max-height 0.2s ease-out;
margin-top: calc(var(--spacer) / 2);
.line {
padding: calc(var(--spacer) / 2);
&.cursor {
background-color: var(--bs-gray-300)
}
}
&::-webkit-scrollbar {
width: 5px;
}
&::-webkit-scrollbar-track {
background: var(--bs-gray-500);
}
&::-webkit-scrollbar-thumb {
background: var(--bs-primary);
}
}
</style>

View File

@@ -53,7 +53,7 @@
<p v-html="$t(replayOrRestart + ' confirm', {id: execution.id})" />
<el-form v-if="revisionsOptions && revisionsOptions.length > 1">
<p class="text-muted">
<p class="execution-description">
{{ $t("restart change revision") }}
</p>
<el-form-item :label="$t('revisions')">
@@ -227,3 +227,8 @@
},
};
</script>
<style scoped>
.execution-description {
color: var(--bs-gray-700);
}
</style>

View File

@@ -225,7 +225,10 @@
if (isEnd) {
this.closeSubExecutionSSE(subflow);
}
this.throttledExecutionUpdate(subflow, executionEvent);
// we are receiving a first "fake" event to force initializing the connection: ignoring it
if (executionEvent.lastEventId !== "start") {
this.throttledExecutionUpdate(subflow, executionEvent);
}
if (isEnd) {
this.throttledExecutionUpdate.flush();
}

View File

@@ -12,9 +12,9 @@
<el-button-group v-else-if="isURI(value)">
<a class="el-button el-button--small el-button--primary" :href="value" target="_blank">
<OpenInNew />
<OpenInNew /> &nbsp;
{{ $t('open') }}
</a>
</a>
</el-button-group>
<span v-else-if="value === null">
@@ -54,7 +54,7 @@
}
},
itemUrl(value) {
return `${apiUrl(this.$store)}/executions/${this.execution.id}/file?path=${value}`;
return `${apiUrl(this.$store)}/executions/${this.execution.id}/file?path=${encodeURI(value)}`;
},
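A sketch of why the `encodeURI` fix above matters: a path containing spaces would otherwise be interpolated raw into the query string. The URL prefix is invented for illustration:

```javascript
// Illustrative path and URL prefix; only the encoding behavior is the point.
const path = "outputs/my report.txt";
const rawUrl = `/executions/abc/file?path=${path}`;
const safeUrl = `/executions/abc/file?path=${encodeURI(path)}`;
// safeUrl ends with "outputs/my%20report.txt"
```

Note that `encodeURI` leaves reserved characters such as `&` and `#` unencoded; for path values that could contain those, `encodeURIComponent` would be the stricter choice, but the diff opts for `encodeURI`, which preserves `/` separators.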
getFileSize(){
if (this.isFile(this.value)) {

View File

@@ -373,4 +373,9 @@
.bordered {
border: 1px solid var(--bs-border-color)
}
.bordered > .el-collapse-item{
margin-bottom :0px !important
}
</style>

View File

@@ -62,8 +62,9 @@
} else if (this.$route.query.blueprintId && this.$route.query.blueprintSource) {
this.source = await this.queryBlueprint(this.$route.query.blueprintId)
} else {
const selectedNamespace = this.$route.query.namespace || "company.team";
this.source = `id: myflow
namespace: company.team
namespace: ${selectedNamespace}
tasks:
- id: hello
type: io.kestra.plugin.core.log.Log

Some files were not shown because too many files have changed in this diff.