Compare commits


105 Commits

Author SHA1 Message Date
Roman Acevedo
93d53b9d57 try out to run in parallel JdbcRunnerRetryTest, lower exec run duration 2025-09-17 12:24:53 +02:00
Roman Acevedo
11e5e14e4e fix Timeline flamegraph 2025-09-17 11:44:04 +02:00
Roman Acevedo
72ce317c3d Update workflow-backend-test.yml 2025-09-17 11:44:04 +02:00
Roman Acevedo
4da44013c1 add Timeline flamegraph to temp debug tests on this branch 2025-09-17 11:44:04 +02:00
nKwiatkowski
c9995c6f42 fix(tests): failing unit tests 2025-09-17 10:31:39 +02:00
nKwiatkowski
a409299dd8 feat(tests): play jdbc h2 tests in parallel 2025-09-16 19:23:07 +02:00
Roman Acevedo
34cf67b0a4 test: make AbstractExecutionRepositoryTest parallelizable 2025-09-16 19:23:07 +02:00
Ludovic DEHON
d092556bc2 chore(build): use remote actions 2025-09-16 18:09:54 +02:00
Roman Acevedo
308106d532 ci: make generated test report retrocompatible with older releases (#11308)
* ci: make generated test report retrocompatible with older releases

* ci: fix cli
2025-09-16 15:21:56 +02:00
Piyush Bhaskar
8fe8f96278 refactor(core): use el-splitter instead of custom sliders (#11309)
Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-16 18:35:57 +05:30
Miloš Paunović
a5cad6d87c chore(core): improve coloring scheme for dependencies graph (#11306) 2025-09-16 14:26:15 +02:00
Loïc Mathieu
199d67fbe2 chore(system): share the application.yaml config file between OSS and EE 2025-09-16 10:53:53 +02:00
Loïc Mathieu
558a2e3f01 fix(flows): properly compute flow dependencies with preconditions
When both upstream flows and where are set, it should be an AND between the two, as dependencies must match the upstream flows.

Fixes #11164
2025-09-16 10:43:55 +02:00
HARSH THAKARE
e1d2c30e54 fix(core): add validation to prevent empty label values in Labels task (#11273)
part of #11227

---------

Co-authored-by: harshinfomaticae <harsh.thakare@infomaticae.co.in>
Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-16 10:26:46 +02:00
Loïc Mathieu
700c6de411 fix(system): allow flattening a map with duplicated keys 2025-09-16 10:24:43 +02:00
Florian Hussonnois
2b838a5012 fix(executions): add missing CrudEvent on purge execution
Related-to: kestra-io/kestra-ee#5061
2025-09-16 09:34:19 +02:00
Loïc Mathieu
617daa79db fix(executions): truncate the execution_running table as in 0.24 there was an issue in the purge
This table contains executions that are currently running for flows with a concurrency limit.
It was added in 0.24, but in that release a bug could prevent some records from being correctly removed from this table.
To fix that, we truncate it once.
2025-09-15 17:29:28 +02:00
Roman Acevedo
1791127acb test: unflaky FileChangedEventListener and PluginDefaultServiceTest, debug log on JdbcServiceLivenessCoordinatorTest
* test: parallelize AbstractRunnerTest

* test: add TestsUtils.randomTenant(..) function

* test: i think i found a bug

* revert debug

* test: add comment on potential bug, make test pass

* test: fix test metadata

* test: unflaky PluginDefaultServiceTest by separating class

* test: add log on JdbcServiceLivenessCoordinatorTest to debug

* test: cleanup debug log

* fix
2025-09-15 17:07:37 +02:00
brian-mulier-p
7feb571fb3 fix(test): add tenant-in-path storage test (#11292)
part of kestra-io/storage-s3#166
2025-09-15 16:49:02 +02:00
brian-mulier-p
a315bd0e1c fix(security): enhance basic auth security (#11285)
closes kestra-io/kestra-ee#5111
2025-09-15 16:27:14 +02:00
Roman Acevedo
e2ac1e7e98 ci: prevent commenting PR test report when cancelled 2025-09-15 16:01:07 +02:00
Miloš Paunović
c6f40eff52 fix(core): adjust positioning of default tour elements (#11286)
The problem occurred when `No Code` was selected as the `Default Editor Type` in `Settings`. This `PR` resolves the issue.

Closes https://github.com/kestra-io/kestra/issues/9556.
2025-09-15 14:55:00 +02:00
Miloš Paunović
ccd42f7a1a chore(core): remove superfluous button attribute in settings page (#11283) 2025-09-15 12:27:19 +02:00
Florian Hussonnois
ef08c8ac30 fix(plugins): remove regex validation on version property
Changes:
* Fixes stable method in Version class
* Remove regex validation on 'version' property

Related-to: kestra-io/kestra-ee#5090
2025-09-15 11:54:10 +02:00
github-actions[bot]
7b527c85a9 chore(core): localize to languages other than english (#11280)
Extended localization support by adding translations for multiple languages using English as the base. This enhances accessibility and usability for non-English-speaking users while keeping English as the source reference.

Co-authored-by: GitHub Action <actions@github.com>
2025-09-15 11:09:17 +02:00
Hamza
d121867066 chore(flows): trigger editor autocompletion when backspace is pressed (#10797)
Closes https://github.com/kestra-io/kestra/issues/10776.

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-15 11:07:20 +02:00
Roman Acevedo
a084a9f6f0 ci: fix Summary report test path 2025-09-15 10:50:25 +02:00
Karthik D
f6fff11081 chore(core): add reset to defaults option to settings page (#11226)
Closes https://github.com/kestra-io/kestra/issues/10640.

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-15 10:45:11 +02:00
Roman Acevedo
3d5015938f ci: add total header to generateTestReportSummary 2025-09-15 10:32:22 +02:00
Florian Hussonnois
951c93cedb fix(core): fix CrudEvent model for DELETE operation
Refactor XxxRepository class to use new factory methods
from the CrudEvent class

Related-to: kestra-io/kestra-ee#4727
2025-09-15 10:06:52 +02:00
Antoine Gauthier
9c06b37989 chore(core): resolve button text overflow on system overview page (#11271)
Closes https://github.com/kestra-io/kestra/issues/11245.

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-15 09:57:10 +02:00
Anna Geller
a916a03fdd fix(stats): update edition comparison with latest features and improved descriptions (#11272) 2025-09-14 12:35:26 +02:00
Roman Acevedo
4e728da331 test: disable one last test 2025-09-12 20:24:08 +02:00
Roman Acevedo
166a3932c9 test: do not parallelize yet AbstractRunnerTest 2025-09-12 20:24:08 +02:00
Roman Acevedo
0a21971bbf ci: only comment PR with test report in PR 2025-09-12 20:24:08 +02:00
Roman Acevedo
8c4d7c0f9e test: disable failing tests, they will be fixed soon
- will be treated in https://github.com/kestra-io/kestra/issues/11269
2025-09-12 20:24:08 +02:00
Nicolas K.
b709913071 test: run core tests in parallel (#11265)
- advance on #11264

* feat(ci-cd): play tests in parallel and synchronize plugin registry init

* fix(tests): change memory to h2 because the configuration have changed

* feat(tests): use tenant id to run runner tests in parallel

* run AbstractRunnerTest test methods in parallel

* feat(tests): use tenant id to run runner tests in parallel

* feat(tests): remove unwanted generated files

---------

Co-authored-by: Roman Acevedo <roman.acevedo62@gmail.com>
Co-authored-by: nKwiatkowski <nkwiatkowski@kestra.io>
2025-09-12 19:29:38 +02:00
Roman Acevedo
5be401d23c ci: add a kestra-devtools cli, and comment PR with failed tests
This is a POC; I think it can already be useful. The next step will be to move kestra-devtools to a separate repo and publish it to npm
2025-09-12 18:48:12 +02:00
Roman Acevedo
bb9f4be8c2 Revert "chore(sanitycheck): refactor PurgeCurrentExecutionFiles (#11115)"
This reverts commit fc690bf7cd.
The Python task cannot be used here; it is not available. This commit was
wrongly merged despite a red CI
2025-09-12 17:49:02 +02:00
François Delbrayelle
01e8e46b77 Revert "feat(retry): use the retry policy on HttpClient (#10922)" (#11263)
This reverts commit a236688be6.
2025-09-12 17:46:28 +02:00
Miloš Paunović
d00f4b0768 chore(core): ensure editor suggestion widget renders above other elements (#11258)
Closes https://github.com/kestra-io/kestra/issues/10702.
Closes https://github.com/kestra-io/kestra/issues/11033.
2025-09-12 14:48:56 +02:00
Barthélémy Ledoux
279f59c874 fix(core): only display close all tabs when there is more than one tab (#11257) 2025-09-12 14:20:54 +02:00
Barthélémy Ledoux
d897509726 fix(flows): clear tasks list when last task is deleted (#11255) 2025-09-12 14:20:42 +02:00
Pradumna Saraf
0d592342af chore(sanitycheck): add for OutputValues (#11105) 2025-09-12 16:53:13 +05:30
Pradumna Saraf
fc690bf7cd chore(sanitycheck): refactor PurgeCurrentExecutionFiles (#11115) 2025-09-12 16:52:37 +05:30
Antoine Gauthier
0a1b919863 chore(logs): display copy button only on row hover (#11254)
Closes https://github.com/kestra-io/kestra/issues/11220.

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-12 12:00:08 +02:00
Piyush Bhaskar
2f4e981a29 fix(core): add gradient at footer to avoid hard cut (#11252) 2025-09-12 14:35:47 +05:30
brian-mulier-p
5e7739432e fix(core): add ability to remap sort keys (#11233)
part of kestra-io/kestra-ee#5075
2025-09-12 09:43:39 +02:00
Miloš Paunović
8aba863b8c feat(core): introduce close all panels functionality (#11225)
Closes https://github.com/kestra-io/kestra/issues/10785.
2025-09-12 09:01:24 +02:00
dependabot[bot]
7eaa43c50f build(deps): bump axios (#11243)
Bumps the npm_and_yarn group with 1 update in the /ui directory: [axios](https://github.com/axios/axios).


Updates `axios` from 1.11.0 to 1.12.0
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.11.0...v1.12.0)

---
updated-dependencies:
- dependency-name: axios
  dependency-version: 1.12.0
  dependency-type: direct:production
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-12 08:36:02 +02:00
Piyush Bhaskar
267ff78bfe fix(admin): change the header and add description on hover (#11241)
Co-authored-by: GitHub Action <actions@github.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-12 12:00:41 +05:30
François Delbrayelle
7272cfe01f feat(ai_copilot): gray italic placeholder + rename AiAgent to AiCopilot (#11235) 2025-09-11 20:24:04 +02:00
brian.mulier
91e2fdb2cc fix(ai): increase maxOutputToken default 2025-09-11 18:11:52 +02:00
François Delbrayelle
a236688be6 feat(retry): use the retry policy on HttpClient (#10922) 2025-09-11 15:00:25 +02:00
Antoine Gauthier
81763d40ae fix(docs): center main container in DocsLayout (#11222)
Co-authored-by: Piyush Bhaskar <impiyush0012@gmail.com>
2025-09-11 16:18:12 +05:30
Miloš Paunović
677efb6739 fix(namespaces): open details page at top (#11221)
Closes https://github.com/kestra-io/kestra/issues/10536.
2025-09-11 10:52:47 +02:00
Nicolas K.
b35924fef1 fix(tests): add server type mock in the kestra context (#11176)
Co-authored-by: nKwiatkowski <nkwiatkowski@kestra.io>
2025-09-11 09:45:51 +02:00
Jaem Dessources
9dd93294b6 fix(core): align copy logs button to each row’s right edge (#11216)
Closes https://github.com/kestra-io/kestra/issues/10898.

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-11 08:55:01 +02:00
Piyush Bhaskar
fac6dfe9a0 fix(core): update router usage in loadAutocomplete. (#11219) 2025-09-11 12:13:05 +05:30
Bisesh
3bf9764505 fix(core): make sidebar tab color consistent when unfocused (#11217)
Closes https://github.com/kestra-io/kestra/issues/11156.

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-11 08:33:57 +02:00
Piyush Bhaskar
c35cea5d19 fix(core): override the ns module. (#11218) 2025-09-11 11:53:00 +05:30
Barthélémy Ledoux
4d8e9479f1 refactor: finally get rid of vuex (#11211) 2025-09-10 22:44:21 +02:00
Florian Hussonnois
3f24e8e838 fix(core): make CRC32 for plugin JARs lazy
Make CRC32 calculation for plugin JAR files lazy
to avoid excessive startup time and performance impact.

Avoid byte buffer reallocation while computing CRC32.
2025-09-10 17:42:02 +02:00
Miloš Paunović
7175fcb666 fix(executions): refactor link creation to ensure the id is rendered as a clickable link (#11209)
Related to https://github.com/kestra-io/kestra/issues/10906.
2025-09-10 15:01:29 +02:00
Barthélémy Ledoux
2ddfa13b1b refactor: make-axios-composable (#11177) 2025-09-10 14:54:00 +02:00
Barthélémy Ledoux
ba2a5dfec8 chore: revert monaco update (#11207) 2025-09-10 13:34:33 +02:00
Loïc Mathieu
f84441dac7 fix(ci): disable publishing docker image on fork
I should not have trusted an AI for this but copied what I know works: the Quarkus CI!
2025-09-10 12:17:25 +02:00
Barthélémy Ledoux
433b788e4a chore: a bunch of performance fixes detected by oxlint (eslint-unicorn) (#10050) 2025-09-10 11:35:07 +02:00
dependabot[bot]
65c5fd6331 build(deps): bump org.projectlombok:lombok from 1.18.38 to 1.18.40
Bumps [org.projectlombok:lombok](https://github.com/projectlombok/lombok) from 1.18.38 to 1.18.40.
- [Changelog](https://github.com/projectlombok/lombok/blob/master/doc/changelog.markdown)
- [Commits](https://github.com/projectlombok/lombok/compare/v1.18.38...v1.18.40)

---
updated-dependencies:
- dependency-name: org.projectlombok:lombok
  dependency-version: 1.18.40
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-10 11:12:24 +02:00
dependabot[bot]
421ab40276 build(deps): bump io.micrometer:micrometer-core from 1.15.3 to 1.15.4
Bumps [io.micrometer:micrometer-core](https://github.com/micrometer-metrics/micrometer) from 1.15.3 to 1.15.4.
- [Release notes](https://github.com/micrometer-metrics/micrometer/releases)
- [Commits](https://github.com/micrometer-metrics/micrometer/compare/v1.15.3...v1.15.4)

---
updated-dependencies:
- dependency-name: io.micrometer:micrometer-core
  dependency-version: 1.15.4
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-10 11:11:48 +02:00
dependabot[bot]
efb2779693 build(deps): bump flyingSaucerVersion from 9.13.3 to 10.0.0
Bumps `flyingSaucerVersion` from 9.13.3 to 10.0.0.

Updates `org.xhtmlrenderer:flying-saucer-core` from 9.13.3 to 10.0.0
- [Release notes](https://github.com/flyingsaucerproject/flyingsaucer/releases)
- [Changelog](https://github.com/flyingsaucerproject/flyingsaucer/blob/main/CHANGELOG.md)
- [Commits](https://github.com/flyingsaucerproject/flyingsaucer/compare/v9.13.3...v10.0.0)

Updates `org.xhtmlrenderer:flying-saucer-pdf` from 9.13.3 to 10.0.0
- [Release notes](https://github.com/flyingsaucerproject/flyingsaucer/releases)
- [Changelog](https://github.com/flyingsaucerproject/flyingsaucer/blob/main/CHANGELOG.md)
- [Commits](https://github.com/flyingsaucerproject/flyingsaucer/compare/v9.13.3...v10.0.0)

---
updated-dependencies:
- dependency-name: org.xhtmlrenderer:flying-saucer-core
  dependency-version: 10.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
- dependency-name: org.xhtmlrenderer:flying-saucer-pdf
  dependency-version: 10.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-10 11:10:59 +02:00
dependabot[bot]
74d371c0ca build(deps): bump com.azure:azure-sdk-bom from 1.2.37 to 1.2.38
Bumps [com.azure:azure-sdk-bom](https://github.com/azure/azure-sdk-for-java) from 1.2.37 to 1.2.38.
- [Release notes](https://github.com/azure/azure-sdk-for-java/releases)
- [Commits](https://github.com/azure/azure-sdk-for-java/compare/azure-sdk-bom_1.2.37...azure-sdk-bom_1.2.38)

---
updated-dependencies:
- dependency-name: com.azure:azure-sdk-bom
  dependency-version: 1.2.38
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-10 11:10:10 +02:00
Loïc Mathieu
90a7869020 fix(system): always load Netty from the app classloader
Netty is used in core and in a lot of plugins, and we already load Project Reactor, which depends on Netty, from the app classloader.

Fixes https://github.com/kestra-io/kestra-ee/issues/5038
2025-09-10 10:50:22 +02:00
dependabot[bot]
d9ccb50b0f build(deps): bump actions/github-script from 7 to 8
Bumps [actions/github-script](https://github.com/actions/github-script) from 7 to 8.
- [Release notes](https://github.com/actions/github-script/releases)
- [Commits](https://github.com/actions/github-script/compare/v7...v8)

---
updated-dependencies:
- dependency-name: actions/github-script
  dependency-version: '8'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-10 10:47:40 +02:00
dependabot[bot]
aea0b87ef8 build(deps): bump aquasecurity/trivy-action from 0.33.0 to 0.33.1
Bumps [aquasecurity/trivy-action](https://github.com/aquasecurity/trivy-action) from 0.33.0 to 0.33.1.
- [Release notes](https://github.com/aquasecurity/trivy-action/releases)
- [Commits](https://github.com/aquasecurity/trivy-action/compare/0.33.0...0.33.1)

---
updated-dependencies:
- dependency-name: aquasecurity/trivy-action
  dependency-version: 0.33.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-10 10:47:17 +02:00
Loïc Mathieu
9a144fc3fe fix(system): we don't need to advance the parser anymore to the first token 2025-09-10 10:46:44 +02:00
Loïc Mathieu
ddd9cebc63 chore(deps): upgrade to Jackson 2.20.0
Jackson annotations now use a version scheme without a micro version, so the dependency has been updated to 2.20.

Closes #11069
2025-09-10 10:46:44 +02:00
dependabot[bot]
1bebbb9b73 build(deps): bump com.gorylenko.gradle-git-properties
Bumps com.gorylenko.gradle-git-properties from 2.5.2 to 2.5.3.

---
updated-dependencies:
- dependency-name: com.gorylenko.gradle-git-properties
  dependency-version: 2.5.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-10 10:46:26 +02:00
dependabot[bot]
8de4dc867e build(deps): bump actions/setup-python from 5 to 6
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5 to 6.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](https://github.com/actions/setup-python/compare/v5...v6)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-10 10:46:08 +02:00
dependabot[bot]
fc49694e76 build(deps): bump actions/setup-node from 4 to 5
Bumps [actions/setup-node](https://github.com/actions/setup-node) from 4 to 5.
- [Release notes](https://github.com/actions/setup-node/releases)
- [Commits](https://github.com/actions/setup-node/compare/v4...v5)

---
updated-dependencies:
- dependency-name: actions/setup-node
  dependency-version: '5'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-10 10:45:47 +02:00
dependabot[bot]
152300abae build(deps): bump io.micronaut.openapi:micronaut-openapi-bom
Bumps [io.micronaut.openapi:micronaut-openapi-bom](https://github.com/micronaut-projects/micronaut-openapi) from 6.17.3 to 6.18.0.
- [Release notes](https://github.com/micronaut-projects/micronaut-openapi/releases)
- [Commits](https://github.com/micronaut-projects/micronaut-openapi/compare/v6.17.3...v6.18.0)

---
updated-dependencies:
- dependency-name: io.micronaut.openapi:micronaut-openapi-bom
  dependency-version: 6.18.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-10 10:45:13 +02:00
dependabot[bot]
1ff5dda4e1 build(deps): bump software.amazon.awssdk:bom from 2.33.2 to 2.33.5
Bumps software.amazon.awssdk:bom from 2.33.2 to 2.33.5.

---
updated-dependencies:
- dependency-name: software.amazon.awssdk:bom
  dependency-version: 2.33.5
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-10 10:44:50 +02:00
Miloš Paunović
84f9b8876d chore(deps): regular dependency update (#11200)
Performing a weekly round of dependency updates in the NPM ecosystem to keep everything up to date.
2025-09-10 10:18:33 +02:00
brian-mulier-p
575955567f fix(flows): avoid failing flow dependencies with dynamic defaults (#11166)
closes #11117
2025-09-10 09:59:51 +02:00
brian-mulier-p
d6d2580b45 fix(namespaces): avoid adding 'company.team' as default ns (#11174)
closes #11168
2025-09-09 17:13:48 +02:00
Miloš Paunović
070e54b902 chore(flows): display correct flow dependency count (#11169)
Closes https://github.com/kestra-io/kestra/issues/11127.
2025-09-09 13:56:17 +02:00
Roman Acevedo
829ca4380f fix(flows): topology would not load when having many flows and cyclic relations
- this will probably fix https://github.com/kestra-io/kestra-ee/issues/4980

The issue was that recursiveFlowTopology returned a lot of duplicates; this was aggravated when there were many Flows and multiple Flow triggers
2025-09-09 13:06:20 +02:00
Karthik D
381c7a75ad chore(core): use simple search input on blueprints listing (#11034)
Closes https://github.com/kestra-io/kestra/issues/11002.

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-09 12:54:58 +02:00
louispy
1688c489a9 chore(flows): improve visibility of horizontal scroll bar on listing (#11163)
Closes https://github.com/kestra-io/kestra/issues/11158.

Co-authored-by: louispy <louisleslie98@gmail.com>
Co-authored-by: MilosPaunovic <paun992@hotmail.com>
2025-09-09 12:40:28 +02:00
AKSHAT GUPTA
93ccbf5f9b chore(core): separate data loading from graph node rendering on dependency view (#11155)
Relates to https://github.com/kestra-io/kestra/issues/11125.

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-09 12:25:58 +02:00
Barthélémy Ledoux
ac1cb235e5 refactor: avoid importing all of lodash when we only need groupBy (#10870) 2025-09-09 11:34:13 +02:00
dependabot[bot]
9d3d3642e8 build(deps): bump kafkaVersion from 4.0.0 to 4.1.0
Bumps `kafkaVersion` from 4.0.0 to 4.1.0.

Updates `org.apache.kafka:kafka-clients` from 4.0.0 to 4.1.0

Updates `org.apache.kafka:kafka-streams` from 4.0.0 to 4.1.0

Updates `org.apache.kafka:kafka-streams-test-utils` from 4.0.0 to 4.1.0

---
updated-dependencies:
- dependency-name: org.apache.kafka:kafka-clients
  dependency-version: 4.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
- dependency-name: org.apache.kafka:kafka-streams
  dependency-version: 4.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
- dependency-name: org.apache.kafka:kafka-streams-test-utils
  dependency-version: 4.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-09 09:56:38 +02:00
Suguresh
3d306a885e feat(core): add extra date format options (#10237)
Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-09 09:31:49 +02:00
Antoine Gauthier
ef193c5774 feat(core): add a new date format option with milliseconds (#11108)
Closes https://github.com/kestra-io/kestra/issues/11028.

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-09 09:20:37 +02:00
AmbarMishra973
d0f46169f4 feat(executions): make the id field a link that can be opened in a new tab (#10963)
Closes https://github.com/kestra-io/kestra/issues/10906.

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-09 09:13:49 +02:00
François Delbrayelle
3005ab527c fix(outputs): open external file was not working (#11154) 2025-09-08 17:45:19 +02:00
Barthélémy Ledoux
688e2af12b chore: update eslint config for vue files (#9891) 2025-09-08 16:42:33 +02:00
Nicolas K.
4c0a05f484 fix(test): flaky Scheduler trigger change test (#11153)
Co-authored-by: nKwiatkowski <nkwiatkowski@kestra.io>
2025-09-08 16:33:23 +02:00
zaib shamsi
108f8fc2c7 feat(executions): nicer exception message for the HttpFunction
### What I did

- Improved the exception message in HttpFunction.java to make debugging easier.

### Why

- The original message was too generic. This change makes it clearer where the issue occurs.
2025-09-08 15:04:12 +02:00
Barthélémy Ledoux
8b81a37559 refactor: make folder structure of no-code use "no-code" (#11122) 2025-09-08 14:15:04 +02:00
Barthélémy Ledoux
9222f97d63 fix(core): multipanel split creates super big panels (#11123) 2025-09-08 14:14:40 +02:00
brian.mulier
43e3591417 chore(ci): fail-safe update-plugin-kestra-version.sh 2025-09-08 12:02:28 +02:00
brian.mulier
438dc9ecf6 chore(ci): create branch if not exist on update-plugin-kestra-version.sh 2025-09-08 11:45:15 +02:00
brian-mulier-p
7292837c58 chore(ci): add LTS tagging (#11131) 2025-09-08 11:13:16 +02:00
brian.mulier
7fa93d7764 chore(version): update to version 'v1.1.0-SNAPSHOT'. 2025-09-08 10:08:34 +02:00
485 changed files with 14625 additions and 5058 deletions

View File

@@ -26,7 +26,7 @@ jobs:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@v5
uses: actions/setup-python@v6
with:
python-version: "3.x"
@@ -39,7 +39,7 @@ jobs:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
- name: Set up Node
uses: actions/setup-node@v4
uses: actions/setup-node@v5
with:
node-version: "20.x"

View File

@@ -37,7 +37,7 @@ jobs:
path: kestra
# Setup build
- uses: kestra-io/actions/.github/actions/setup-build@main
- uses: kestra-io/actions/composite/setup-build@main
name: Setup - Build
id: build
with:

View File

@@ -25,21 +25,13 @@ jobs:
with:
fetch-depth: 0
# Checkout GitHub Actions
- uses: actions/checkout@v5
with:
repository: kestra-io/actions
path: actions
ref: main
# Setup build
- uses: ./actions/.github/actions/setup-build
- uses: kestra-io/actions/composite/setup-build@main
id: build
with:
java-enabled: true
node-enabled: true
python-enabled: true
caches-enabled: true
# Get Plugins List
- name: Get Plugins List
@@ -60,7 +52,7 @@ jobs:
GITHUB_PAT: ${{ secrets.GH_PERSONAL_TOKEN }}
run: |
chmod +x ./dev-tools/release-plugins.sh;
./dev-tools/release-plugins.sh \
--release-version=${{github.event.inputs.releaseVersion}} \
--next-version=${{github.event.inputs.nextVersion}} \
@@ -73,7 +65,7 @@ jobs:
GITHUB_PAT: ${{ secrets.GH_PERSONAL_TOKEN }}
run: |
chmod +x ./dev-tools/release-plugins.sh;
./dev-tools/release-plugins.sh \
--release-version=${{github.event.inputs.releaseVersion}} \
--next-version=${{github.event.inputs.nextVersion}} \

View File

@@ -38,15 +38,8 @@ jobs:
fetch-depth: 0
path: kestra
# Checkout GitHub Actions
- uses: actions/checkout@v5
with:
repository: kestra-io/actions
path: actions
ref: main
# Setup build
- uses: ./actions/.github/actions/setup-build
- uses: kestra-io/actions/composite/setup-build@main
id: build
with:
java-enabled: true

View File

@@ -0,0 +1,32 @@
name: kestra-devtools test
on:
pull_request:
branches:
- develop
paths:
- 'dev-tools/kestra-devtools/**'
env:
# to save corepack from itself
COREPACK_INTEGRITY_KEYS: 0
jobs:
test:
name: kestra-devtools tests
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v5
- name: Npm - install
working-directory: 'dev-tools/kestra-devtools'
run: npm ci
- name: Run tests
working-directory: 'dev-tools/kestra-devtools'
run: npm run test
- name: Npm - Run build
working-directory: 'dev-tools/kestra-devtools'
run: npm run build

View File

@@ -59,8 +59,6 @@ jobs:
needs:
- release
if: always()
env:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
steps:
- name: Trigger EE Workflow
uses: peter-evans/repository-dispatch@v3
@@ -70,14 +68,9 @@ jobs:
repository: kestra-io/kestra-ee
event-type: "oss-updated"
# Slack
- name: Slack - Notification
uses: Gamesight/slack-workflow-status@master
if: ${{ always() && env.SLACK_WEBHOOK_URL != 0 }}
if: ${{ failure() && env.SLACK_WEBHOOK_URL != 0 && (github.ref == 'refs/heads/master' || github.ref == 'refs/heads/main' || github.ref == 'refs/heads/develop') }}
uses: kestra-io/actions/composite/slack-status@main
with:
repo_token: ${{ secrets.GITHUB_TOKEN }}
slack_webhook_url: ${{ secrets.SLACK_WEBHOOK_URL }}
name: GitHub Actions
icon_emoji: ":github-actions:"
channel: "C02DQ1A7JLR" # _int_git channel
webhook-url: ${{ secrets.SLACK_WEBHOOK_URL }}

View File

@@ -60,19 +60,3 @@ jobs:
name: E2E - Tests
uses: ./.github/workflows/e2e.yml
end:
name: End
runs-on: ubuntu-latest
if: always()
needs: [frontend, backend]
steps:
# Slack
- name: Slack notification
uses: Gamesight/slack-workflow-status@master
if: ${{ always() && env.SLACK_WEBHOOK_URL != 0 }}
with:
repo_token: ${{ secrets.GITHUB_TOKEN }}
slack_webhook_url: ${{ secrets.SLACK_WEBHOOK_URL }}
name: GitHub Actions
icon_emoji: ":github-actions:"
channel: "C02DQ1A7JLR"

View File

@@ -21,13 +21,6 @@ jobs:
with:
fetch-depth: 0
# Checkout GitHub Actions
- uses: actions/checkout@v5
with:
repository: kestra-io/actions
path: actions
ref: main
# Setup build
- uses: ./actions/.github/actions/setup-build
id: build
@@ -70,15 +63,8 @@ jobs:
with:
fetch-depth: 0
# Checkout GitHub Actions
- uses: actions/checkout@v5
with:
repository: kestra-io/actions
path: actions
ref: main
# Setup build
- uses: ./actions/.github/actions/setup-build
- uses: kestra-io/actions/composite/setup-build@main
id: build
with:
java-enabled: false
@@ -87,7 +73,7 @@ jobs:
# Run Trivy image scan for Docker vulnerabilities, see https://github.com/aquasecurity/trivy-action
- name: Docker Vulnerabilities Check
uses: aquasecurity/trivy-action@0.33.0
uses: aquasecurity/trivy-action@0.33.1
with:
image-ref: kestra/kestra:develop
format: 'template'
@@ -115,24 +101,16 @@ jobs:
with:
fetch-depth: 0
# Checkout GitHub Actions
- uses: actions/checkout@v5
with:
repository: kestra-io/actions
path: actions
ref: main
# Setup build
- uses: ./actions/.github/actions/setup-build
- uses: kestra-io/actions/composite/setup-build@main
id: build
with:
java-enabled: false
node-enabled: false
caches-enabled: true
# Run Trivy image scan for Docker vulnerabilities, see https://github.com/aquasecurity/trivy-action
- name: Docker Vulnerabilities Check
uses: aquasecurity/trivy-action@0.33.0
uses: aquasecurity/trivy-action@0.33.1
with:
image-ref: kestra/kestra:latest
format: table

View File

@@ -20,6 +20,7 @@ permissions:
contents: write
checks: write
actions: read
pull-requests: write
jobs:
test:
@@ -35,7 +36,7 @@ jobs:
fetch-depth: 0
# Setup build
- uses: kestra-io/actions/.github/actions/setup-build@main
- uses: kestra-io/actions/composite/setup-build@main
name: Setup - Build
id: build
with:
@@ -59,6 +60,30 @@ jobs:
export GOOGLE_APPLICATION_CREDENTIALS=$HOME/.gcp-service-account.json
./gradlew check javadoc --parallel
- name: comment PR with test report
if: ${{ !cancelled() && github.event_name == 'pull_request' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_AUTH_TOKEN }}
run: |
export KESTRA_PWD=$(pwd) && sh -c 'cd dev-tools/kestra-devtools && npm ci && npm run build && node dist/kestra-devtools-cli.cjs generateTestReportSummary --only-errors --ci $KESTRA_PWD' > report.md
cat report.md
# Gradle check
- name: 'generate Timeline flamegraph'
if: always()
env:
GOOGLE_SERVICE_ACCOUNT: ${{ secrets.GOOGLE_SERVICE_ACCOUNT }}
shell: bash
run: |
echo $GOOGLE_SERVICE_ACCOUNT | base64 -d > ~/.gcp-service-account.json
export GOOGLE_APPLICATION_CREDENTIALS=$HOME/.gcp-service-account.json
./gradlew mergeTestTimeline
- name: 'Upload Timeline flamegraph'
uses: actions/upload-artifact@v4
if: always()
with:
name: all-test-timelines.json
path: build/reports/test-timelines-report/all-test-timelines.json
retention-days: 5
# report test
- name: Test - Publish Test Results
uses: dorny/test-reporter@v2

View File

@@ -26,7 +26,7 @@ jobs:
run: npm ci
# Setup build
- uses: kestra-io/actions/.github/actions/setup-build@main
- uses: kestra-io/actions/composite/setup-build@main
name: Setup - Build
id: build
with:

View File

@@ -25,15 +25,6 @@ jobs:
fetch-depth: 0
submodules: true
# Checkout GitHub Actions
- name: Checkout - Actions
uses: actions/checkout@v5
with:
repository: kestra-io/actions
sparse-checkout-cone-mode: true
path: actions
sparse-checkout: |
.github/actions
# Download Exec
# Must be done after checkout actions
@@ -59,7 +50,7 @@ jobs:
# GitHub Release
- name: Create GitHub release
uses: ./actions/.github/actions/github-release
uses: kestra-io/actions/composite/github-release@main
if: ${{ startsWith(github.ref, 'refs/tags/v') }}
env:
MAKE_LATEST: ${{ steps.is_latest.outputs.latest }}
@@ -82,7 +73,7 @@ jobs:
- name: Merge Release Notes
if: ${{ startsWith(github.ref, 'refs/tags/v') }}
uses: ./actions/.github/actions/github-release-note-merge
uses: kestra-io/actions/composite/github-release-note-merge@main
env:
GITHUB_TOKEN: ${{ secrets.GH_PERSONAL_TOKEN }}
RELEASE_TAG: ${{ github.ref_name }}

View File

@@ -11,6 +11,14 @@ on:
options:
- "true"
- "false"
retag-lts:
description: 'Retag LTS Docker images'
required: true
type: choice
default: "false"
options:
- "true"
- "false"
release-tag:
description: 'Kestra Release Tag (by default, deduced with the ref)'
required: false
@@ -179,6 +187,11 @@ jobs:
run: |
regctl image copy ${{ format('kestra/kestra:{0}{1}', steps.vars.outputs.tag, matrix.image.name) }} ${{ format('kestra/kestra:latest{0}', matrix.image.name) }}
- name: Retag to LTS
if: startsWith(github.ref, 'refs/tags/v') && inputs.retag-lts == 'true'
run: |
regctl image copy ${{ format('kestra/kestra:{0}{1}', steps.vars.outputs.tag, matrix.image.name) }} ${{ format('kestra/kestra:latest-lts{0}', matrix.image.name) }}
end:
runs-on: ubuntu-latest
needs:
@@ -187,14 +200,9 @@ jobs:
env:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
steps:
# Slack
- name: Slack notification
uses: Gamesight/slack-workflow-status@master
if: ${{ always() && env.SLACK_WEBHOOK_URL != 0 }}
if: ${{ failure() && env.SLACK_WEBHOOK_URL != 0 }}
uses: kestra-io/actions/composite/slack-status@main
with:
repo_token: ${{ secrets.GITHUB_TOKEN }}
slack_webhook_url: ${{ secrets.SLACK_WEBHOOK_URL }}
name: GitHub Actions
icon_emoji: ':github-actions:'
channel: 'C02DQ1A7JLR' # _int_git channel
webhook-url: ${{ secrets.SLACK_WEBHOOK_URL }}

View File

@@ -29,7 +29,7 @@ jobs:
# Setup build
- name: Setup - Build
uses: kestra-io/actions/.github/actions/setup-build@main
uses: kestra-io/actions/composite/setup-build@main
id: build
with:
java-enabled: true

View File

@@ -7,7 +7,7 @@ on:
jobs:
publish:
name: Pull Request - Delete Docker
if: github.repository == github.event.pull_request.head.repo.full_name # prevent running on forks
if: github.repository == 'kestra-io/kestra' # prevent running on forks
runs-on: ubuntu-latest
steps:
- uses: dataaxiom/ghcr-cleanup-action@v1

View File

@@ -8,12 +8,12 @@ on:
jobs:
build-artifacts:
name: Build Artifacts
if: github.repository == github.event.pull_request.head.repo.full_name # prevent running on forks
if: github.repository == 'kestra-io/kestra' # prevent running on forks
uses: ./.github/workflows/workflow-build-artifacts.yml
publish:
name: Publish Docker
if: github.repository == github.event.pull_request.head.repo.full_name # prevent running on forks
if: github.repository == 'kestra-io/kestra' # prevent running on forks
runs-on: ubuntu-latest
needs: build-artifacts
env:
@@ -62,7 +62,7 @@ jobs:
# Add comment on pull request
- name: Add comment to PR
uses: actions/github-script@v7
uses: actions/github-script@v8
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |

View File

@@ -84,14 +84,12 @@ jobs:
name: Notify - Slack
runs-on: ubuntu-latest
needs: [ frontend, backend ]
if: github.event_name == 'schedule'
steps:
- name: Notify failed CI
id: send-ci-failed
if: |
always() && (needs.frontend.result != 'success' ||
needs.backend.result != 'success')
uses: kestra-io/actions/.github/actions/send-ci-failed@main
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
always() &&
(needs.frontend.result != 'success' || needs.backend.result != 'success') &&
(github.ref == 'refs/heads/master' || github.ref == 'refs/heads/main' || github.ref == 'refs/heads/develop')
uses: kestra-io/actions/composite/slack-status@main
with:
webhook-url: ${{ secrets.SLACK_WEBHOOK_URL }}

View File

@@ -32,7 +32,7 @@ plugins {
// release
id 'net.researchgate.release' version '3.1.0'
id "com.gorylenko.gradle-git-properties" version "2.5.2"
id "com.gorylenko.gradle-git-properties" version "2.5.3"
id 'signing'
id "com.vanniktech.maven.publish" version "0.34.0"
@@ -168,8 +168,9 @@ allprojects {
/**********************************************************************************************************************\
* Test
**********************************************************************************************************************/
subprojects {
if (it.name != 'platform' && it.name != 'jmh-benchmarks') {
subprojects {subProj ->
if (subProj.name != 'platform' && subProj.name != 'jmh-benchmarks') {
apply plugin: "com.adarshr.test-logger"
java {
@@ -207,6 +208,13 @@ subprojects {
test {
useJUnitPlatform()
reports {
junitXml.required = true
junitXml.outputPerTestCase = true
junitXml.mergeReruns = true
junitXml.includeSystemErrLog = true;
junitXml.outputLocation = layout.buildDirectory.dir("test-results/test")
}
// set Xmx for test workers
maxHeapSize = '4g'
@@ -222,6 +230,52 @@ subprojects {
environment 'SECRET_PASSWORD', "cGFzc3dvcmQ="
environment 'ENV_TEST1', "true"
environment 'ENV_TEST2', "Pass by env"
// === Test Timeline Trace (Chrome trace format) ===
// Produces per-JVM ndjson under build/test-results/test-timelines/*.jsonl and a merged array via :mergeTestTimeline
// Each event has a start time (ts, µs since epoch) and a duration (dur, µs); the end time is ts + dur
doFirst {
file("${buildDir}/test-results/test-timelines").mkdirs()
}
def jvmName = java.lang.management.ManagementFactory.runtimeMXBean.name
def pid = jvmName.tokenize('@')[0]
def traceDir = file("${buildDir}/test-results/test-timelines")
def traceFile = new File(traceDir, "${project.name}-${name}-${pid}.jsonl")
def starts = new java.util.concurrent.ConcurrentHashMap<Object, Long>()
beforeTest { org.gradle.api.tasks.testing.TestDescriptor d ->
// epoch millis to allow cross-JVM merge
starts.put(d, System.currentTimeMillis())
}
afterTest { org.gradle.api.tasks.testing.TestDescriptor d, org.gradle.api.tasks.testing.TestResult r ->
def st = starts.remove(d)
if (st != null) {
def en = System.currentTimeMillis()
long tsMicros = st * 1000L // start time (µs since epoch)
long durMicros = (en - st) * 1000L // duration (µs)
def ev = [
name: (d.className ? d.className + '.' + d.name : d.name),
cat : 'test',
ph : 'X', // Complete event with duration
ts : tsMicros,
dur : durMicros,
pid : project.name, // group by project/module
tid : "${name}-worker-${pid}",
args: [result: r.resultType.toString()]
]
synchronized (traceFile.absolutePath.intern()) {
traceFile << (groovy.json.JsonOutput.toJson(ev) + System.lineSeparator())
}
}
}
if (subProj.name == 'core' || subProj.name == 'jdbc-h2') {
// JUnit 5 parallel settings
systemProperty 'junit.jupiter.execution.parallel.enabled', 'true'
systemProperty 'junit.jupiter.execution.parallel.mode.default', 'concurrent'
systemProperty 'junit.jupiter.execution.parallel.mode.classes.default', 'same_thread'
systemProperty 'junit.jupiter.execution.parallel.config.strategy', 'dynamic'
}
}
testlogger {
@@ -236,7 +290,53 @@ subprojects {
}
}
}
// Root-level aggregator: merge timelines from ALL modules into one Chrome trace
if (project == rootProject) {
tasks.register('mergeTestTimeline') {
group = 'verification'
description = 'Merge per-worker test timeline ndjson from all modules into a single Chrome Trace JSON array.'
doLast {
def collectedFiles = [] as List<File>
// Collect *.jsonl files from every subproject
rootProject.subprojects.each { p ->
def dir = p.file("${p.buildDir}/test-results/test-timelines")
if (dir.exists()) {
collectedFiles.addAll(p.fileTree(dir: dir, include: '*.jsonl').files)
}
}
if (collectedFiles.isEmpty()) {
logger.lifecycle("No timeline files found in any subproject. Run tests first (e.g., './gradlew test --parallel').")
return
}
collectedFiles = collectedFiles.sort { it.name }
def outDir = rootProject.file("${rootProject.buildDir}/reports/test-timelines-report")
outDir.mkdirs()
def out = new File(outDir, "all-test-timelines.json")
out.withWriter('UTF-8') { w ->
w << '['
boolean first = true
collectedFiles.each { f ->
f.eachLine { line ->
def trimmed = line?.trim()
if (trimmed) {
if (!first) w << ','
w << trimmed
first = false
}
}
}
w << ']'
}
logger.lifecycle("Merged ${collectedFiles.size()} files into ${out} — open it in chrome://tracing or Perfetto UI.")
}
}
}
/**********************************************************************************************************************\
* End-to-End Tests
**********************************************************************************************************************/
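The merged all-test-timelines.json produced by the task above is a plain Chrome Trace array of complete ('X') events, so it can also be inspected outside chrome://tracing or Perfetto. Below is a minimal sketch that lists the slowest tests, assuming the output location and the event fields (name, dur in microseconds) emitted by the Gradle hooks above; the SlowestTests class itself is hypothetical and not part of the repository.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.File;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Hypothetical helper: print the 20 slowest tests from the merged Chrome trace.
public class SlowestTests {
    public static void main(String[] args) throws Exception {
        File trace = new File("build/reports/test-timelines-report/all-test-timelines.json");
        JsonNode events = new ObjectMapper().readTree(trace); // JSON array of trace events

        List<JsonNode> sorted = new ArrayList<>();
        events.forEach(sorted::add);
        sorted.sort(Comparator.comparingLong((JsonNode e) -> e.path("dur").asLong()).reversed());

        sorted.stream().limit(20).forEach(e -> System.out.printf(
            "%8d ms  %s%n",
            e.path("dur").asLong() / 1000, // dur is in microseconds
            e.path("name").asText()
        ));
    }
}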

View File

@@ -40,5 +40,6 @@ dependencies {
implementation project(":worker")
//test
testImplementation project(':tests')
testImplementation "org.wiremock:wiremock-jetty12"
}

View File

@@ -89,11 +89,24 @@ public class App implements Callable<Integer> {
*/
protected static ApplicationContext applicationContext(Class<?> mainClass,
String[] args) {
return applicationContext(mainClass, new String [] { Environment.CLI }, args);
}
/**
* Create an {@link ApplicationContext} with additional properties based on configuration files (--config) and
* forced Properties from current command.
*
* @param args args passed to java app
* @return the application context created
*/
protected static ApplicationContext applicationContext(Class<?> mainClass,
String[] environments,
String[] args) {
ApplicationContextBuilder builder = ApplicationContext
.builder()
.mainClass(mainClass)
.environments(Environment.CLI);
.environments(environments);
CommandLine cmd = new CommandLine(mainClass, CommandLine.defaultFactory());
continueOnParsingErrors(cmd);
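A minimal sketch of how the new overload could be used by an entry point that needs an extra Micronaut environment on top of Environment.CLI; the EmbeddedApp subclass and the "embedded" environment name are hypothetical.

import io.micronaut.context.ApplicationContext;
import io.micronaut.context.env.Environment;

// Hypothetical entry point activating an additional environment besides Environment.CLI.
public class EmbeddedApp extends App {
    public static void main(String[] args) {
        ApplicationContext context = applicationContext(
            EmbeddedApp.class,
            new String[]{Environment.CLI, "embedded"}, // "embedded" is an illustrative environment name
            args
        );
        // use 'context' to run commands, then close it when done
        context.close();
    }
}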

View File

@@ -262,6 +262,8 @@ public class FileChangedEventListener {
}
private String getTenantIdFromPath(Path path) {
// FIXME there is probably a bug here when a tenant has '_' in its name,
// as a valid tenant name is defined by the following regex: "^[a-z0-9][a-z0-9_-]*"
return path.getFileName().toString().split("_")[0];
}
}

View File

@@ -4,11 +4,11 @@ import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.GenericFlow;
import io.kestra.core.repositories.FlowRepositoryInterface;
import io.kestra.core.utils.Await;
import io.kestra.core.utils.TestsUtils;
import io.micronaut.test.extensions.junit5.annotation.MicronautTest;
import jakarta.inject.Inject;
import org.apache.commons.io.FileUtils;
import org.junit.jupiter.api.*;
import org.junitpioneer.jupiter.RetryingTest;
import java.io.IOException;
import java.nio.file.Files;
@@ -19,7 +19,6 @@ import java.util.concurrent.Executors;
import java.util.concurrent.TimeoutException;
import java.util.concurrent.atomic.AtomicBoolean;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static io.kestra.core.utils.Rethrow.throwRunnable;
import static org.assertj.core.api.Assertions.assertThat;
@@ -57,10 +56,11 @@ class FileChangedEventListenerTest {
}
}
@RetryingTest(5) // Flaky on CI but always pass locally
@Test
void test() throws IOException, TimeoutException {
var tenant = TestsUtils.randomTenant(FileChangedEventListenerTest.class.getSimpleName(), "test");
// remove the flow if it already exists
flowRepository.findByIdWithSource(MAIN_TENANT, "io.kestra.tests.watch", "myflow").ifPresent(flow -> flowRepository.delete(flow));
flowRepository.findByIdWithSource(tenant, "io.kestra.tests.watch", "myflow").ifPresent(flow -> flowRepository.delete(flow));
// create a basic flow
String flow = """
@@ -73,14 +73,14 @@ class FileChangedEventListenerTest {
message: Hello World! 🚀
""";
GenericFlow genericFlow = GenericFlow.fromYaml(MAIN_TENANT, flow);
GenericFlow genericFlow = GenericFlow.fromYaml(tenant, flow);
Files.write(Path.of(FILE_WATCH + "/" + genericFlow.uidWithoutRevision() + ".yaml"), flow.getBytes());
Await.until(
() -> flowRepository.findById(MAIN_TENANT, "io.kestra.tests.watch", "myflow").isPresent(),
() -> flowRepository.findById(tenant, "io.kestra.tests.watch", "myflow").isPresent(),
Duration.ofMillis(100),
Duration.ofSeconds(10)
);
Flow myflow = flowRepository.findById(MAIN_TENANT, "io.kestra.tests.watch", "myflow").orElseThrow();
Flow myflow = flowRepository.findById(tenant, "io.kestra.tests.watch", "myflow").orElseThrow();
assertThat(myflow.getTasks()).hasSize(1);
assertThat(myflow.getTasks().getFirst().getId()).isEqualTo("hello");
assertThat(myflow.getTasks().getFirst().getType()).isEqualTo("io.kestra.plugin.core.log.Log");
@@ -88,16 +88,17 @@ class FileChangedEventListenerTest {
// delete the flow
Files.delete(Path.of(FILE_WATCH + "/" + genericFlow.uidWithoutRevision() + ".yaml"));
Await.until(
() -> flowRepository.findById(MAIN_TENANT, "io.kestra.tests.watch", "myflow").isEmpty(),
() -> flowRepository.findById(tenant, "io.kestra.tests.watch", "myflow").isEmpty(),
Duration.ofMillis(100),
Duration.ofSeconds(10)
);
}
@RetryingTest(5) // Flaky on CI but always pass locally
@Test
void testWithPluginDefault() throws IOException, TimeoutException {
var tenant = TestsUtils.randomTenant(FileChangedEventListenerTest.class.getName(), "testWithPluginDefault");
// remove the flow if it already exists
flowRepository.findByIdWithSource(MAIN_TENANT, "io.kestra.tests.watch", "pluginDefault").ifPresent(flow -> flowRepository.delete(flow));
flowRepository.findByIdWithSource(tenant, "io.kestra.tests.watch", "pluginDefault").ifPresent(flow -> flowRepository.delete(flow));
// create a flow with plugin default
String pluginDefault = """
@@ -113,14 +114,14 @@ class FileChangedEventListenerTest {
values:
message: Hello World!
""";
GenericFlow genericFlow = GenericFlow.fromYaml(MAIN_TENANT, pluginDefault);
GenericFlow genericFlow = GenericFlow.fromYaml(tenant, pluginDefault);
Files.write(Path.of(FILE_WATCH + "/" + genericFlow.uidWithoutRevision() + ".yaml"), pluginDefault.getBytes());
Await.until(
() -> flowRepository.findById(MAIN_TENANT, "io.kestra.tests.watch", "pluginDefault").isPresent(),
() -> flowRepository.findById(tenant, "io.kestra.tests.watch", "pluginDefault").isPresent(),
Duration.ofMillis(100),
Duration.ofSeconds(10)
);
Flow pluginDefaultFlow = flowRepository.findById(MAIN_TENANT, "io.kestra.tests.watch", "pluginDefault").orElseThrow();
Flow pluginDefaultFlow = flowRepository.findById(tenant, "io.kestra.tests.watch", "pluginDefault").orElseThrow();
assertThat(pluginDefaultFlow.getTasks()).hasSize(1);
assertThat(pluginDefaultFlow.getTasks().getFirst().getId()).isEqualTo("helloWithDefault");
assertThat(pluginDefaultFlow.getTasks().getFirst().getType()).isEqualTo("io.kestra.plugin.core.log.Log");
@@ -128,7 +129,7 @@ class FileChangedEventListenerTest {
// delete both files
Files.delete(Path.of(FILE_WATCH + "/" + genericFlow.uidWithoutRevision() + ".yaml"));
Await.until(
() -> flowRepository.findById(MAIN_TENANT, "io.kestra.tests.watch", "pluginDefault").isEmpty(),
() -> flowRepository.findById(tenant, "io.kestra.tests.watch", "pluginDefault").isEmpty(),
Duration.ofMillis(100),
Duration.ofSeconds(10)
);

View File

@@ -3,30 +3,88 @@ package io.kestra.core.events;
import io.micronaut.core.annotation.Nullable;
import io.micronaut.http.HttpRequest;
import io.micronaut.http.context.ServerRequestContext;
import lombok.AllArgsConstructor;
import lombok.Getter;
@AllArgsConstructor
import java.util.Objects;
@Getter
public class CrudEvent<T> {
T model;
private final T model;
@Nullable
T previousModel;
CrudEventType type;
HttpRequest<?> request;
private final T previousModel;
private final CrudEventType type;
private final HttpRequest<?> request;
/**
* Static helper method for creating a new {@link CrudEventType#CREATE} CrudEvent.
*
* @param model the newly created model.
* @param <T> type of the model.
* @return the new {@link CrudEvent}.
*/
public static <T> CrudEvent<T> create(T model) {
Objects.requireNonNull(model, "Can't create CREATE event with a null model");
return new CrudEvent<>(model, null, CrudEventType.CREATE);
}
/**
* Static helper method for creating a new {@link CrudEventType#DELETE} CrudEvent.
*
* @param model the deleted model.
* @param <T> type of the model.
* @return the new {@link CrudEvent}.
*/
public static <T> CrudEvent<T> delete(T model) {
Objects.requireNonNull(model, "Can't create DELETE event with a null model");
return new CrudEvent<>(null, model, CrudEventType.DELETE);
}
/**
* Static helper method for creating a new CrudEvent.
*
* @param before the model before the update.
* @param after the model after the update.
* @param <T> type of the model.
* @return the new {@link CrudEvent}.
*/
public static <T> CrudEvent<T> of(T before, T after) {
if (before == null && after == null) {
throw new IllegalArgumentException("Both before and after cannot be null");
}
if (before == null) {
return create(after);
}
if (after == null) {
return delete(before);
}
return new CrudEvent<>(after, before, CrudEventType.UPDATE);
}
/**
* @deprecated use the static factory methods.
*/
@Deprecated
public CrudEvent(T model, CrudEventType type) {
this.model = model;
this.type = type;
this.previousModel = null;
this.request = ServerRequestContext.currentRequest().orElse(null);
this(
CrudEventType.DELETE.equals(type) ? null : model,
CrudEventType.DELETE.equals(type) ? model : null,
type,
ServerRequestContext.currentRequest().orElse(null)
);
}
public CrudEvent(T model, T previousModel, CrudEventType type) {
this(model, previousModel, type, ServerRequestContext.currentRequest().orElse(null));
}
public CrudEvent(T model, T previousModel, CrudEventType type, HttpRequest<?> request) {
this.model = model;
this.previousModel = previousModel;
this.type = type;
this.request = ServerRequestContext.currentRequest().orElse(null);
this.request = request;
}
}
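A short usage sketch of the new factory methods, assuming a caller that publishes events through Micronaut's ApplicationEventPublisher; the FlowCrudEvents class and its method names are illustrative, not actual Kestra repository code.

import io.kestra.core.events.CrudEvent;
import io.kestra.core.models.flows.Flow;
import io.micronaut.context.event.ApplicationEventPublisher;

// Illustrative only: emitting CrudEvents via the new factory methods.
class FlowCrudEvents {
    private final ApplicationEventPublisher<CrudEvent<Flow>> publisher;

    FlowCrudEvents(ApplicationEventPublisher<CrudEvent<Flow>> publisher) {
        this.publisher = publisher;
    }

    void saved(Flow previous, Flow saved) {
        // CREATE when 'previous' is null, UPDATE otherwise
        publisher.publishEvent(CrudEvent.of(previous, saved));
    }

    void deleted(Flow deleted) {
        // DELETE events now carry the removed model as 'previousModel' and leave 'model' null
        publisher.publishEvent(CrudEvent.delete(deleted));
    }
}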

View File

@@ -1,16 +1,33 @@
package io.kestra.core.models;
import io.swagger.v3.oas.annotations.media.Schema;
import jakarta.validation.Valid;
import jakarta.validation.constraints.Pattern;
import java.util.List;
import java.util.Map;
/**
* Interface that can be implemented by classes supporting plugin versioning.
*
* @see Plugin
*/
public interface PluginVersioning {
String TITLE = "Plugin Version";
String DESCRIPTION = """
Defines the version of the plugin to use.
@Pattern(regexp="\\d+\\.\\d+\\.\\d+(-[a-zA-Z0-9-]+)?|([a-zA-Z0-9]+)")
@Schema(title = "The version of the plugin to use.")
The version must follow the Semantic Versioning (SemVer) specification:
- A single-digit MAJOR version (e.g., `1`).
- A MAJOR.MINOR version (e.g., `1.1`).
- A MAJOR.MINOR.PATCH version, optionally with any qualifier
(e.g., `1.1.2`, `1.1.0-SNAPSHOT`).
""";
@Schema(
title = TITLE,
description = DESCRIPTION
)
String getVersion();
}

View File

@@ -56,7 +56,7 @@ public class DefaultPluginRegistry implements PluginRegistry {
*
* @return the {@link DefaultPluginRegistry}.
*/
public static DefaultPluginRegistry getOrCreate() {
public synchronized static DefaultPluginRegistry getOrCreate() {
DefaultPluginRegistry instance = LazyHolder.INSTANCE;
if (!instance.isInitialized()) {
instance.init();
@@ -74,7 +74,7 @@ public class DefaultPluginRegistry implements PluginRegistry {
/**
* Initializes the registry by loading all core plugins.
*/
protected void init() {
protected synchronized void init() {
if (initialized.compareAndSet(false, true)) {
register(scanner.scan());
}
@@ -200,7 +200,7 @@ public class DefaultPluginRegistry implements PluginRegistry {
if (existing != null && existing.crc32() == plugin.crc32()) {
return; // same plugin already registered
}
lock.lock();
try {
if (existing != null) {
@@ -212,7 +212,7 @@ public class DefaultPluginRegistry implements PluginRegistry {
lock.unlock();
}
}
protected void registerAll(Map<PluginIdentifier, PluginClassAndMetadata<? extends Plugin>> plugins) {
pluginClassByIdentifier.putAll(plugins);
}

View File

@@ -6,6 +6,12 @@ import lombok.Getter;
import lombok.ToString;
import java.net.URL;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.Enumeration;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.zip.CRC32;
@AllArgsConstructor
@Getter
@@ -14,5 +20,59 @@ import java.net.URL;
public class ExternalPlugin {
private final URL location;
private final URL[] resources;
private final long crc32;
private volatile Long crc32; // lazy-val
public ExternalPlugin(URL location, URL[] resources) {
this.location = location;
this.resources = resources;
}
public Long getCrc32() {
if (this.crc32 == null) {
synchronized (this) {
if (this.crc32 == null) {
this.crc32 = computeJarCrc32(location);
}
}
}
return crc32;
}
/**
* Compute a CRC32 of the JAR File without reading the whole file
*
* @param location of the JAR File.
* @return the CRC32, or {@code -1} if the checksum can't be computed.
*/
private static long computeJarCrc32(final URL location) {
CRC32 crc = new CRC32();
try (JarFile jar = new JarFile(location.toURI().getPath(), false)) {
Enumeration<JarEntry> entries = jar.entries();
byte[] buffer = new byte[Long.BYTES]; // reusable buffer to avoid re-allocation
while (entries.hasMoreElements()) {
JarEntry entry = entries.nextElement();
crc.update(entry.getName().getBytes(StandardCharsets.UTF_8));
updateCrc32WithLong(crc, buffer, entry.getSize());
updateCrc32WithLong(crc, buffer, entry.getCrc());
}
return crc.getValue();
} catch (Exception e) {
return -1;
}
}
private static void updateCrc32WithLong(CRC32 crc32, byte[] reusable, long val) {
// fast long -> byte conversion
reusable[0] = (byte) (val >>> 56);
reusable[1] = (byte) (val >>> 48);
reusable[2] = (byte) (val >>> 40);
reusable[3] = (byte) (val >>> 32);
reusable[4] = (byte) (val >>> 24);
reusable[5] = (byte) (val >>> 16);
reusable[6] = (byte) (val >>> 8);
reusable[7] = (byte) val;
crc32.update(reusable);
}
}

View File

@@ -46,6 +46,7 @@ public class PluginClassLoader extends URLClassLoader {
+ "|dev.failsafe"
+ "|reactor"
+ "|io.opentelemetry"
+ "|io.netty"
+ ")\\..*$");
private final ClassLoader parent;

View File

@@ -51,8 +51,7 @@ public class PluginResolver {
final List<URL> resources = resolveUrlsForPluginPath(path);
plugins.add(new ExternalPlugin(
path.toUri().toURL(),
resources.toArray(new URL[0]),
computeJarCrc32(path)
resources.toArray(new URL[0])
));
}
} catch (final InvalidPathException | MalformedURLException e) {
@@ -124,33 +123,5 @@ public class PluginResolver {
return urls;
}
/**
* Compute a CRC32 of the JAR File without reading the whole file
*
* @param location of the JAR File.
* @return the CRC32 of {@code -1} if the checksum can't be computed.
*/
private static long computeJarCrc32(final Path location) {
CRC32 crc = new CRC32();
try (JarFile jar = new JarFile(location.toFile(), false)) {
Enumeration<JarEntry> entries = jar.entries();
while (entries.hasMoreElements()) {
JarEntry entry = entries.nextElement();
crc.update(entry.getName().getBytes());
crc.update(longToBytes(entry.getSize()));
crc.update(longToBytes(entry.getCrc()));
}
} catch (Exception e) {
return -1;
}
return crc.getValue();
}
private static byte[] longToBytes(long x) {
ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
buffer.putLong(x);
return buffer.array();
}
}

View File

@@ -83,7 +83,9 @@ public class LocalFlowRepositoryLoader {
}
public void load(String tenantId, File basePath) throws IOException {
Map<String, FlowInterface> flowByUidInRepository = flowRepository.findAllForAllTenants().stream()
Map<String, FlowInterface> flowByUidInRepository = flowRepository.findAllForAllTenants()
.stream()
.filter(flow -> tenantId.equals(flow.getTenantId()))
.collect(Collectors.toMap(FlowId::uidWithoutRevision, Function.identity()));
try (Stream<Path> pathStream = Files.walk(basePath.toPath())) {

View File

@@ -10,6 +10,7 @@ import io.kestra.core.models.flows.FlowInterface;
import io.kestra.core.models.flows.Input;
import io.kestra.core.models.flows.State;
import io.kestra.core.models.flows.input.SecretInput;
import io.kestra.core.models.property.Property;
import io.kestra.core.models.property.PropertyContext;
import io.kestra.core.models.tasks.Task;
import io.kestra.core.models.triggers.AbstractTrigger;
@@ -282,15 +283,15 @@ public final class RunVariables {
if (flow != null && flow.getInputs() != null) {
// we add default inputs value from the flow if not already set, this will be useful for triggers
flow.getInputs().stream()
.filter(input -> input.getDefaults() != null && !inputs.containsKey(input.getId()))
.forEach(input -> {
try {
inputs.put(input.getId(), FlowInputOutput.resolveDefaultValue(input, propertyContext));
} catch (IllegalVariableEvaluationException e) {
throw new RuntimeException("Unable to inject default value for input '" + input.getId() + "'", e);
}
});
flow.getInputs().stream()
.filter(input -> input.getDefaults() != null && !inputs.containsKey(input.getId()))
.forEach(input -> {
try {
inputs.put(input.getId(), FlowInputOutput.resolveDefaultValue(input, propertyContext));
} catch (IllegalVariableEvaluationException e) {
// Silent catch, if an input depends on another input, or a variable that is populated at runtime / input filling time, we can't resolve it here.
}
});
}
if (!inputs.isEmpty()) {
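A minimal sketch of the default-injection strategy described by the comments above, under simplified assumptions: the Input record and render helper below are hypothetical stand-ins for Kestra's Input model and property resolution. Defaults are injected only for inputs without a runtime value, and defaults that cannot be resolved yet (for example, ones referencing another input) are skipped silently.

import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class DefaultInputInjectionSketch {
    // hypothetical stand-in for Kestra's Input model: just an id and an optional default expression
    record Input(String id, String defaultValue) {}

    // inject defaults only for inputs that have no runtime value yet
    static void injectDefaults(List<Input> declared, Map<String, Object> inputs) {
        declared.stream()
            .filter(in -> in.defaultValue() != null && !inputs.containsKey(in.id()))
            .forEach(in -> {
                try {
                    inputs.put(in.id(), render(in.defaultValue()));
                } catch (IllegalStateException e) {
                    // silent: the default may depend on another input or a runtime variable
                }
            });
    }

    // hypothetical resolver: anything templated cannot be resolved at this stage
    static Object render(String expression) {
        if (expression.contains("{{")) {
            throw new IllegalStateException("cannot resolve yet: " + expression);
        }
        return expression;
    }

    public static void main(String[] args) {
        Map<String, Object> inputs = new HashMap<>(Map.of("given", "runtime-value"));
        injectDefaults(List.of(
            new Input("given", "ignored-default"),
            new Input("plain", "42"),
            new Input("derived", "{{ inputs.plain }}")), inputs);
        System.out.println(inputs); // contains given=runtime-value and plain=42; "derived" was skipped
    }
}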

View File

@@ -5,6 +5,8 @@ import io.kestra.core.http.HttpRequest;
import io.kestra.core.http.HttpResponse;
import io.kestra.core.http.client.HttpClient;
import io.kestra.core.http.client.HttpClientException;
import io.kestra.core.http.client.HttpClientRequestException;
import io.kestra.core.http.client.HttpClientResponseException;
import io.kestra.core.http.client.configurations.HttpConfiguration;
import io.kestra.core.runners.RunContext;
import io.kestra.core.runners.RunContextFactory;
@@ -101,8 +103,15 @@ public class HttpFunction<T> implements Function {
try (HttpClient httpClient = new HttpClient(runContext, httpConfiguration)) {
HttpResponse<Object> response = httpClient.request(httpRequest, Object.class);
return response.getBody();
} catch (HttpClientException | IllegalVariableEvaluationException | IOException e) {
throw new PebbleException(e, "Unable to execute HTTP request", lineNumber, self.getName());
} catch (HttpClientResponseException e) {
if (e.getResponse() != null) {
String msg = "Failed to execute HTTP Request, server respond with status " + e.getResponse().getStatus().getCode() + " : " + e.getResponse().getStatus().getReason();
throw new PebbleException(e, msg , lineNumber, self.getName());
} else {
throw new PebbleException( e, "Failed to execute HTTP request ", lineNumber, self.getName());
}
} catch(HttpClientException | IllegalVariableEvaluationException | IOException e ) {
throw new PebbleException( e, "Failed to execute HTTP request ", lineNumber, self.getName());
}
}

View File

@@ -180,23 +180,13 @@ public final class FileSerde {
}
private static <T> MappingIterator<T> createMappingIterator(ObjectMapper objectMapper, Reader reader, TypeReference<T> type) throws IOException {
// See https://github.com/FasterXML/jackson-dataformats-binary/issues/493
// There is a limitation with the MappingIterator that cannot differentiate between an array of things (of whatever shape)
// and a sequence/stream of things (of Array shape).
// To work around that, we need to create a JsonParser and advance to the first token.
try (var parser = objectMapper.createParser(reader)) {
parser.nextToken();
return objectMapper.readerFor(type).readValues(parser);
}
}
private static <T> MappingIterator<T> createMappingIterator(ObjectMapper objectMapper, Reader reader, Class<T> type) throws IOException {
// See https://github.com/FasterXML/jackson-dataformats-binary/issues/493
// There is a limitation with the MappingIterator that cannot differentiate between an array of things (of whatever shape)
// and a sequence/stream of things (of Array shape).
// To work around that, we need to create a JsonParser and advance to the first token.
try (var parser = objectMapper.createParser(reader)) {
parser.nextToken();
return objectMapper.readerFor(type).readValues(parser);
}
}
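The comments above refer to FasterXML/jackson-dataformats-binary#493: a MappingIterator alone cannot tell a single top-level array apart from a stream of values, so the parser is created first and advanced to its first token before being handed to the ObjectReader. A small standalone sketch of that pattern against plain JSON (illustrative only; FileSerde itself targets Kestra's internal serialization format):

import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.MappingIterator;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.Reader;
import java.io.StringReader;
import java.util.Map;

public class MappingIteratorSketch {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // a sequence of two records, not a single top-level array
        Reader reader = new StringReader("{\"id\":1}\n{\"id\":2}");

        try (JsonParser parser = mapper.createParser(reader)) {
            parser.nextToken(); // advance to the first token so the iterator treats the input as a stream
            MappingIterator<Map<String, Object>> it = mapper.readerFor(Map.class).readValues(parser);
            while (it.hasNext()) {
                System.out.println(it.next()); // {id=1} then {id=2}
            }
        }
    }
}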

View File

@@ -172,22 +172,19 @@ public final class JacksonMapper {
return Pair.of(patchPrevToNew, patchNewToPrev);
}
public static String applyPatches(Object object, List<JsonNode> patches) throws JsonProcessingException {
public static JsonNode applyPatchesOnJsonNode(JsonNode jsonObject, List<JsonNode> patches) {
for (JsonNode patch : patches) {
try {
// Required for ES
if (patch.findValue("value") == null) {
((ObjectNode) patch.get(0)).set("value", (JsonNode) null);
((ObjectNode) patch.get(0)).set("value", null);
}
JsonNode current = MAPPER.valueToTree(object);
object = JsonPatch.fromJson(patch).apply(current);
jsonObject = JsonPatch.fromJson(patch).apply(jsonObject);
} catch (IOException | JsonPatchException e) {
throw new RuntimeException(e);
}
}
return MAPPER.writeValueAsString(object);
return jsonObject;
}
}
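A short usage sketch of the reworked method: the patch is a standard JSON Patch array, and the method now takes and returns a JsonNode instead of serializing back to a String. It assumes this snippet sits next to the JacksonMapper class shown above (import omitted); the document and patch literals are arbitrary examples.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.List;

public class ApplyPatchSketch {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        JsonNode document = mapper.readTree("{\"name\":\"old\",\"enabled\":true}");
        JsonNode patch = mapper.readTree("[{\"op\":\"replace\",\"path\":\"/name\",\"value\":\"new\"}]");

        // applies the JSON Patch and returns the patched node
        JsonNode patched = JacksonMapper.applyPatchesOnJsonNode(document, List.of(patch));
        System.out.println(patched); // {"name":"new","enabled":true}
    }
}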

View File

@@ -3,12 +3,7 @@ package io.kestra.core.services;
import com.fasterxml.jackson.core.JsonProcessingException;
import io.kestra.core.exceptions.FlowProcessingException;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.FlowId;
import io.kestra.core.models.flows.FlowInterface;
import io.kestra.core.models.flows.FlowWithException;
import io.kestra.core.models.flows.FlowWithSource;
import io.kestra.core.models.flows.GenericFlow;
import io.kestra.core.models.flows.*;
import io.kestra.core.models.tasks.RunnableTask;
import io.kestra.core.models.topologies.FlowTopology;
import io.kestra.core.models.triggers.AbstractTrigger;
@@ -30,16 +25,7 @@ import org.apache.commons.lang3.builder.EqualsBuilder;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.lang.reflect.Modifier;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.NoSuchElementException;
import java.util.Objects;
import java.util.Optional;
import java.util.*;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
@@ -551,23 +537,24 @@ public class FlowService {
return expandAll ? recursiveFlowTopology(new ArrayList<>(), tenant, namespace, id, destinationOnly) : flowTopologyRepository.get().findByFlow(tenant, namespace, id, destinationOnly).stream();
}
private Stream<FlowTopology> recursiveFlowTopology(List<FlowId> flowIds, String tenantId, String namespace, String id, boolean destinationOnly) {
private Stream<FlowTopology> recursiveFlowTopology(List<String> visitedTopologies, String tenantId, String namespace, String id, boolean destinationOnly) {
if (flowTopologyRepository.isEmpty()) {
throw noRepositoryException();
}
List<FlowTopology> flowTopologies = flowTopologyRepository.get().findByFlow(tenantId, namespace, id, destinationOnly);
FlowId flowId = FlowId.of(tenantId, namespace, id, null);
if (flowIds.contains(flowId)) {
return flowTopologies.stream();
}
flowIds.add(flowId);
var flowTopologies = flowTopologyRepository.get().findByFlow(tenantId, namespace, id, destinationOnly);
return flowTopologies.stream()
.flatMap(topology -> Stream.of(topology.getDestination(), topology.getSource()))
// recursively fetch child nodes
.flatMap(node -> recursiveFlowTopology(flowIds, node.getTenantId(), node.getNamespace(), node.getId(), destinationOnly));
// ignore already visited topologies
.filter(x -> !visitedTopologies.contains(x.uid()))
.flatMap(topology -> {
visitedTopologies.add(topology.uid());
Stream<FlowTopology> subTopologies = Stream
.of(topology.getDestination(), topology.getSource())
// recursively visit children and parents nodes
.flatMap(relationNode -> recursiveFlowTopology(visitedTopologies, relationNode.getTenantId(), relationNode.getNamespace(), relationNode.getId(), destinationOnly));
return Stream.concat(Stream.of(topology), subTopologies);
});
}
private IllegalStateException noRepositoryException() {
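The rewrite above records visited topology uids so the recursion terminates even when flows reference each other in a cycle. A generic, self-contained sketch of the same visited-set pattern over a plain adjacency map; the names here are illustrative, not Kestra's:

import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Stream;

public class CycleSafeTraversalSketch {
    // expand a node and everything reachable from it, visiting each node at most once
    static Stream<String> expand(Set<String> visited, Map<String, List<String>> edges, String node) {
        if (!visited.add(node)) {
            return Stream.empty(); // already visited: stop here, this is what breaks cycles
        }
        Stream<String> children = edges.getOrDefault(node, List.of()).stream()
            .flatMap(child -> expand(visited, edges, child));
        return Stream.concat(Stream.of(node), children);
    }

    public static void main(String[] args) {
        // a <-> b form a cycle, c hangs off b
        Map<String, List<String>> edges = Map.of(
            "a", List.of("b"),
            "b", List.of("a", "c"));
        System.out.println(expand(new HashSet<>(), edges, "a").toList()); // [a, b, c]
    }
}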

View File

@@ -18,6 +18,7 @@ import io.kestra.core.repositories.FlowRepositoryInterface;
import io.kestra.core.repositories.FlowTopologyRepositoryInterface;
import io.kestra.core.services.ConditionService;
import io.kestra.core.utils.ListUtils;
import io.kestra.core.utils.MapUtils;
import io.kestra.plugin.core.condition.*;
import io.micronaut.core.annotation.Nullable;
import jakarta.inject.Inject;
@@ -175,9 +176,6 @@ public class FlowTopologyService {
protected boolean isTriggerChild(Flow parent, Flow child) {
List<AbstractTrigger> triggers = ListUtils.emptyOnNull(child.getTriggers());
// simulated execution: we add a "simulated" label so conditions can know that the evaluation is for a simulated execution
Execution execution = Execution.newExecution(parent, (f, e) -> null, List.of(SIMULATED_EXECUTION), Optional.empty());
// keep only flow trigger
List<io.kestra.plugin.core.trigger.Flow> flowTriggers = triggers
.stream()
@@ -189,13 +187,16 @@ public class FlowTopologyService {
return false;
}
// simulated execution: we add a "simulated" label so conditions can know that the evaluation is for a simulated execution
Execution execution = Execution.newExecution(parent, (f, e) -> null, List.of(SIMULATED_EXECUTION), Optional.empty());
boolean conditionMatch = flowTriggers
.stream()
.flatMap(flow -> ListUtils.emptyOnNull(flow.getConditions()).stream())
.allMatch(condition -> validateCondition(condition, parent, execution));
boolean preconditionMatch = flowTriggers.stream()
.anyMatch(flow -> flow.getPreconditions() == null || validateMultipleConditions(flow.getPreconditions().getConditions(), parent, execution));
.anyMatch(flow -> flow.getPreconditions() == null || validatePreconditions(flow.getPreconditions(), parent, execution));
return conditionMatch && preconditionMatch;
}
@@ -239,11 +240,24 @@ public class FlowTopologyService {
}
private boolean isMandatoryMultipleCondition(Condition condition) {
return Stream
.of(
Expression.class
)
.anyMatch(aClass -> condition.getClass().isAssignableFrom(aClass));
return condition.getClass().isAssignableFrom(Expression.class);
}
private boolean validatePreconditions(io.kestra.plugin.core.trigger.Flow.Preconditions preconditions, FlowInterface child, Execution execution) {
boolean upstreamFlowMatched = MapUtils.emptyOnNull(preconditions.getUpstreamFlowsConditions())
.values()
.stream()
.filter(c -> !isFilterCondition(c))
.anyMatch(c -> validateCondition(c, child, execution));
boolean whereMatched = MapUtils.emptyOnNull(preconditions.getWhereConditions())
.values()
.stream()
.filter(c -> !isFilterCondition(c))
.allMatch(c -> validateCondition(c, child, execution));
// to be a dependency, when upstream flows are set the flow must match one of them, so upstream flows and where are combined with an AND
return upstreamFlowMatched && whereMatched;
}
private boolean isFilterCondition(Condition condition) {

View File

@@ -206,22 +206,17 @@ public class MapUtils {
/**
* Utility method that flatten a nested map.
* <p>
* NOTE: for simplicity, this method does not allow flattening maps with conflicting keys that would end up as different flattened keys;
* this could be relaxed later if needed. For now, flattening {k1: {k2: {k3: v1}, k4: v2}} to {k1.k2.k3: v1, k1.k4: v2} is prohibited.
*
* @param nestedMap the nested map.
* @return the flattened map.
*
* @throws IllegalArgumentException if any entry contains a map of more than one element.
*/
public static Map<String, Object> nestedToFlattenMap(@NotNull Map<String, Object> nestedMap) {
Map<String, Object> result = new TreeMap<>();
for (Map.Entry<String, Object> entry : nestedMap.entrySet()) {
if (entry.getValue() instanceof Map<?, ?> map) {
Map.Entry<String, Object> flatten = flattenEntry(entry.getKey(), (Map<String, Object>) map);
result.put(flatten.getKey(), flatten.getValue());
Map<String, Object> flatten = flattenEntry(entry.getKey(), (Map<String, Object>) map);
result.putAll(flatten);
} else {
result.put(entry.getKey(), entry.getValue());
}
@@ -229,18 +224,19 @@ public class MapUtils {
return result;
}
private static Map.Entry<String, Object> flattenEntry(String key, Map<String, Object> value) {
if (value.size() > 1) {
throw new IllegalArgumentException("You cannot flatten a map with an entry that is a map of more than one element, conflicting key: " + key);
private static Map<String, Object> flattenEntry(String key, Map<String, Object> value) {
Map<String, Object> result = new TreeMap<>();
for (Map.Entry<String, Object> entry : value.entrySet()) {
String newKey = key + "." + entry.getKey();
Object newValue = entry.getValue();
if (newValue instanceof Map<?, ?> map) {
result.putAll(flattenEntry(newKey, (Map<String, Object>) map));
} else {
result.put(newKey, newValue);
}
}
Map.Entry<String, Object> entry = value.entrySet().iterator().next();
String newKey = key + "." + entry.getKey();
Object newValue = entry.getValue();
if (newValue instanceof Map<?, ?> map) {
return flattenEntry(newKey, (Map<String, Object>) map);
} else {
return Map.entry(newKey, newValue);
}
return result;
}
}
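With the single-entry restriction removed, nested maps with several keys per level now flatten into multiple dotted keys. A small illustration of the expected behavior using the MapUtils class shown above; the input literal is only an example:

import io.kestra.core.utils.MapUtils;

import java.util.Map;

public class FlattenSketch {
    public static void main(String[] args) {
        Map<String, Object> nested = Map.of(
            "k1", Map.of(
                "k2", Map.of("k3", "v1"),
                "k4", "v2"),
            "k5", "v3");

        // expected with the new behavior (TreeMap, so keys are sorted):
        // {k1.k2.k3=v1, k1.k4=v2, k5=v3}
        System.out.println(MapUtils.nestedToFlattenMap(nested));
    }
}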

View File

@@ -32,17 +32,23 @@ public class Version implements Comparable<Version> {
* @param version the version.
* @return a new {@link Version} instance.
*/
public static Version of(String version) {
public static Version of(final Object version) {
if (version.startsWith("v")) {
version = version.substring(1);
if (Objects.isNull(version)) {
throw new IllegalArgumentException("Invalid version, cannot parse null version");
}
String strVersion = version.toString();
if (strVersion.startsWith("v")) {
strVersion = strVersion.substring(1);
}
int qualifier = version.indexOf("-");
int qualifier = strVersion.indexOf("-");
final String[] versions = qualifier > 0 ?
version.substring(0, qualifier).split("\\.") :
version.split("\\.");
strVersion.substring(0, qualifier).split("\\.") :
strVersion.split("\\.");
try {
final int majorVersion = Integer.parseInt(versions[0]);
final int minorVersion = versions.length > 1 ? Integer.parseInt(versions[1]) : 0;
@@ -52,28 +58,54 @@ public class Version implements Comparable<Version> {
majorVersion,
minorVersion,
incrementalVersion,
qualifier > 0 ? version.substring(qualifier + 1) : null,
version
qualifier > 0 ? strVersion.substring(qualifier + 1) : null,
strVersion
);
} catch (NumberFormatException e) {
throw new IllegalArgumentException("Invalid version, cannot parse '" + version + "'");
}
}
/**
* Static helper method for returning the most recent stable version for a current {@link Version}.
* Returns the most recent stable version compatible with the given {@link Version}.
*
* @param from the current version.
* @param versions the list of version.
* <p>Resolution strategy:</p>
* <ol>
* <li>If {@code from} is present in {@code versions}, return it directly.</li>
* <li>Otherwise, return the latest version with the same major and minor.</li>
* <li>If none found and {@code from.majorVersion() > 0}, return the latest with the same major.</li>
* <li>Return {@code null} if no compatible version is found.</li>
* </ol>
*
* @return the last stable version.
* @param from the current version
* @param versions available versions
* @return the most recent compatible stable version, or {@code null} if none
*/
public static Version getStable(final Version from, final Collection<Version> versions) {
List<Version> compatibleVersions = versions.stream()
if (versions.contains(from)) {
return from;
}
// Prefer same major+minor stable versions
List<Version> sameMinorStable = versions.stream()
.filter(v -> v.majorVersion() == from.majorVersion() && v.minorVersion() == from.minorVersion())
.toList();
if (compatibleVersions.isEmpty()) return null;
return Version.getLatest(compatibleVersions);
if (!sameMinorStable.isEmpty()) {
return Version.getLatest(sameMinorStable);
}
if (from.majorVersion() > 0) {
// Fallback: any stable version with the same major
List<Version> sameMajorStable = versions.stream()
.filter(v -> v.majorVersion() == from.majorVersion())
.toList();
if (!sameMajorStable.isEmpty()) {
return Version.getLatest(sameMajorStable);
}
}
return null;
}
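A quick usage sketch of the resolution strategy documented above, using only the methods visible in this diff (Version.of and Version.getStable); the version strings are arbitrary examples, the import of the Version class is omitted, and the printed values assume its toString returns the parsed version:

import java.util.List;

public class VersionResolutionSketch {
    public static void main(String[] args) {
        List<Version> available = List.of(
            Version.of("1.1.0"),
            Version.of("1.2.0"),
            Version.of("1.2.3"),
            Version.of("2.0.0"));

        // same major.minor exists: expect the latest 1.2.x, i.e. 1.2.3
        System.out.println(Version.getStable(Version.of("1.2.1"), available));
        // no 1.3.x available: falls back to the latest with the same major, i.e. 1.2.3
        System.out.println(Version.getStable(Version.of("1.3.0"), available));
        // nothing with major 3: expect null
        System.out.println(Version.getStable(Version.of("3.0.0"), available));
    }
}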
/**

View File

@@ -127,9 +127,24 @@ public class Labels extends Task implements ExecutionUpdatableTask {
}
// check for system labels: none can be passed at runtime
Optional<Map.Entry<String, String>> first = labelsAsMap.entrySet().stream().filter(entry -> entry.getKey().startsWith(SYSTEM_PREFIX)).findFirst();
if (first.isPresent()) {
throw new IllegalArgumentException("System labels can only be set by Kestra itself, offending label: " + first.get().getKey() + "=" + first.get().getValue());
Optional<Map.Entry<String, String>> systemLabel = labelsAsMap.entrySet().stream()
.filter(entry -> entry.getKey().startsWith(SYSTEM_PREFIX))
.findFirst();
if (systemLabel.isPresent()) {
throw new IllegalArgumentException(
"System labels can only be set by Kestra itself, offending label: " +
systemLabel.get().getKey() + "=" + systemLabel.get().getValue()
);
}
// check for empty label values
Optional<Map.Entry<String, String>> emptyValue = labelsAsMap.entrySet().stream()
.filter(entry -> entry.getValue().isEmpty())
.findFirst();
if (emptyValue.isPresent()) {
throw new IllegalArgumentException(
"Label values cannot be empty, offending label: " + emptyValue.get().getKey()
);
}
Map<String, String> newLabels = ListUtils.emptyOnNull(execution.getLabels()).stream()

View File

@@ -202,7 +202,7 @@ import static io.kestra.core.utils.Rethrow.throwPredicate;
code = """
id: sentry_execution_example
namespace: company.team
tasks:
- id: send_alert
type: io.kestra.plugin.notifications.sentry.SentryExecution
@@ -221,7 +221,7 @@ import static io.kestra.core.utils.Rethrow.throwPredicate;
- WARNING
- type: io.kestra.plugin.core.condition.ExecutionNamespace
namespace: company.payroll
prefix: false"""
prefix: false"""
)
},
@@ -405,6 +405,28 @@ public class Flow extends AbstractTrigger implements TriggerOutput<Flow.Output>
return conditions;
}
@JsonIgnore
public Map<String, Condition> getUpstreamFlowsConditions() {
AtomicInteger conditionId = new AtomicInteger();
return ListUtils.emptyOnNull(flows).stream()
.map(upstreamFlow -> Map.entry(
"condition_" + conditionId.incrementAndGet(),
new UpstreamFlowCondition(upstreamFlow)
))
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
}
@JsonIgnore
public Map<String, Condition> getWhereConditions() {
AtomicInteger conditionId = new AtomicInteger();
return ListUtils.emptyOnNull(where).stream()
.map(filter -> Map.entry(
"condition_" + conditionId.incrementAndGet() + "_" + filter.getId(),
new FilterCondition(filter)
))
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
}
@Override
public Logger logger() {
return log;

View File

@@ -21,10 +21,13 @@ import java.nio.file.Paths;
import java.util.Arrays;
import java.util.List;
import java.util.Objects;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
@Execution(ExecutionMode.SAME_THREAD)
class DocumentationGeneratorTest {
@Inject
JsonSchemaGenerator jsonSchemaGenerator;

View File

@@ -0,0 +1,121 @@
package io.kestra.core.events;
import org.junit.jupiter.api.Test;
import static org.assertj.core.api.AssertionsForClassTypes.assertThat;
import static org.assertj.core.api.AssertionsForClassTypes.assertThatThrownBy;
class CrudEventTest {
@Test
void shouldReturnCreateEventWhenModelIsProvided() {
// Given
String model = "testModel";
// When
CrudEvent<String> event = CrudEvent.create(model);
// Then
assertThat(event.getModel()).isEqualTo(model);
assertThat(event.getPreviousModel()).isNull();
assertThat(event.getType()).isEqualTo(CrudEventType.CREATE);
assertThat(event.getRequest()).isNull();
}
@Test
void shouldThrowExceptionWhenCreateEventWithNullModel() {
// Given
String model = null;
// When / Then
assertThatThrownBy(() -> CrudEvent.create(model))
.isInstanceOf(NullPointerException.class)
.hasMessage("Can't create CREATE event with a null model");
}
@Test
void shouldReturnDeleteEventWhenModelIsProvided() {
// Given
String model = "testModel";
// When
CrudEvent<String> event = CrudEvent.delete(model);
// Then
assertThat(event.getModel()).isNull();
assertThat(event.getPreviousModel()).isEqualTo(model);
assertThat(event.getType()).isEqualTo(CrudEventType.DELETE);
assertThat(event.getRequest()).isNull();
}
@Test
void shouldThrowExceptionWhenDeleteEventWithNullModel() {
// Given
String model = null;
// When / Then
assertThatThrownBy(() -> CrudEvent.delete(model))
.isInstanceOf(NullPointerException.class)
.hasMessage("Can't create DELETE event with a null model");
}
@Test
void shouldReturnUpdateEventWhenBeforeAndAfterAreProvided() {
// Given
String before = "oldModel";
String after = "newModel";
// When
CrudEvent<String> event = CrudEvent.of(before, after);
// Then
assertThat(event.getModel()).isEqualTo(after);
assertThat(event.getPreviousModel()).isEqualTo(before);
assertThat(event.getType()).isEqualTo(CrudEventType.UPDATE);
assertThat(event.getRequest()).isNull();
}
@Test
void shouldReturnCreateEventWhenBeforeIsNullAndAfterIsProvided() {
// Given
String before = null;
String after = "newModel";
// When
CrudEvent<String> event = CrudEvent.of(before, after);
// Then
assertThat(event.getModel()).isEqualTo(after);
assertThat(event.getPreviousModel()).isNull();
assertThat(event.getType()).isEqualTo(CrudEventType.CREATE);
assertThat(event.getRequest()).isNull();
}
@Test
void shouldReturnDeleteEventWhenAfterIsNullAndBeforeIsProvided() {
// Given
String before = "oldModel";
String after = null;
// When
CrudEvent<String> event = CrudEvent.of(before, after);
// Then
assertThat(event.getModel()).isNull();
assertThat(event.getPreviousModel()).isEqualTo(before);
assertThat(event.getType()).isEqualTo(CrudEventType.DELETE);
assertThat(event.getRequest()).isNull();
}
@Test
void shouldThrowExceptionWhenBothBeforeAndAfterAreNull() {
// Given
String before = null;
String after = null;
// When / Then
assertThatThrownBy(() -> CrudEvent.of(before, after))
.isInstanceOf(IllegalArgumentException.class)
.hasMessage("Both before and after cannot be null");
}
}

View File

@@ -37,6 +37,7 @@ import lombok.Value;
import org.apache.commons.io.IOUtils;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.parallel.ExecutionMode;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.Arguments;
import org.junit.jupiter.params.provider.MethodSource;
@@ -67,6 +68,7 @@ import static org.junit.jupiter.api.Assertions.assertThrows;
@KestraTest
@Testcontainers
@org.junit.jupiter.api.parallel.Execution(ExecutionMode.SAME_THREAD)
class HttpClientTest {
@Inject
private ApplicationContext applicationContext;

View File

@@ -13,19 +13,19 @@ import java.util.Map;
import static org.assertj.core.api.Assertions.assertThat;
class ExecutionTest {
private static final TaskRun.TaskRunBuilder TASK_RUN = TaskRun.builder()
.id("test");
@Test
void hasTaskRunJoinableTrue() {
Execution execution = Execution.builder()
.taskRunList(Collections.singletonList(TASK_RUN
.taskRunList(Collections.singletonList(TaskRun.builder()
.id("test")
.state(new State(State.Type.RUNNING, new State()))
.build())
)
.build();
assertThat(execution.hasTaskRunJoinable(TASK_RUN
assertThat(execution.hasTaskRunJoinable(TaskRun.builder()
.id("test")
.state(new State(State.Type.FAILED, new State()
.withState(State.Type.RUNNING)
))
@@ -36,13 +36,15 @@ class ExecutionTest {
@Test
void hasTaskRunJoinableSameState() {
Execution execution = Execution.builder()
.taskRunList(Collections.singletonList(TASK_RUN
.taskRunList(Collections.singletonList(TaskRun.builder()
.id("test")
.state(new State())
.build())
)
.build();
assertThat(execution.hasTaskRunJoinable(TASK_RUN
assertThat(execution.hasTaskRunJoinable(TaskRun.builder()
.id("test")
.state(new State())
.build()
)).isFalse();
@@ -51,7 +53,8 @@ class ExecutionTest {
@Test
void hasTaskRunJoinableFailedExecutionFromExecutor() {
Execution execution = Execution.builder()
.taskRunList(Collections.singletonList(TASK_RUN
.taskRunList(Collections.singletonList(TaskRun.builder()
.id("test")
.state(new State(State.Type.FAILED, new State()
.withState(State.Type.RUNNING)
))
@@ -59,7 +62,8 @@ class ExecutionTest {
)
.build();
assertThat(execution.hasTaskRunJoinable(TASK_RUN
assertThat(execution.hasTaskRunJoinable(TaskRun.builder()
.id("test")
.state(new State(State.Type.RUNNING, new State()))
.build()
)).isFalse();
@@ -68,7 +72,8 @@ class ExecutionTest {
@Test
void hasTaskRunJoinableRestartFailed() {
Execution execution = Execution.builder()
.taskRunList(Collections.singletonList(TASK_RUN
.taskRunList(Collections.singletonList(TaskRun.builder()
.id("test")
.state(new State(State.Type.CREATED, new State()
.withState(State.Type.RUNNING)
.withState(State.Type.FAILED)
@@ -77,7 +82,8 @@ class ExecutionTest {
)
.build();
assertThat(execution.hasTaskRunJoinable(TASK_RUN
assertThat(execution.hasTaskRunJoinable(TaskRun.builder()
.id("test")
.state(new State(State.Type.FAILED, new State()
.withState(State.Type.RUNNING)
))
@@ -88,7 +94,8 @@ class ExecutionTest {
@Test
void hasTaskRunJoinableRestartSuccess() {
Execution execution = Execution.builder()
.taskRunList(Collections.singletonList(TASK_RUN
.taskRunList(Collections.singletonList(TaskRun.builder()
.id("test")
.state(new State(State.Type.CREATED, new State()
.withState(State.Type.RUNNING)
.withState(State.Type.SUCCESS)
@@ -97,7 +104,8 @@ class ExecutionTest {
)
.build();
assertThat(execution.hasTaskRunJoinable(TASK_RUN
assertThat(execution.hasTaskRunJoinable(TaskRun.builder()
.id("test")
.state(new State(State.Type.SUCCESS, new State()
.withState(State.Type.RUNNING)
.withState(State.Type.SUCCESS)
@@ -109,7 +117,8 @@ class ExecutionTest {
@Test
void hasTaskRunJoinableAfterRestart() {
Execution execution = Execution.builder()
.taskRunList(Collections.singletonList(TASK_RUN
.taskRunList(Collections.singletonList(TaskRun.builder()
.id("test")
.state(new State(State.Type.CREATED, new State()
.withState(State.Type.RUNNING)
.withState(State.Type.FAILED)
@@ -118,7 +127,8 @@ class ExecutionTest {
)
.build();
assertThat(execution.hasTaskRunJoinable(TASK_RUN
assertThat(execution.hasTaskRunJoinable(TaskRun.builder()
.id("test")
.state(new State(State.Type.SUCCESS, new State()
.withState(State.Type.RUNNING)
.withState(State.Type.FAILED)

View File

@@ -261,10 +261,10 @@ class FlowGraphTest {
}
@Test
@LoadFlows({"flows/valids/task-flow.yaml",
"flows/valids/switch.yaml"})
@LoadFlows(value = {"flows/valids/task-flow.yaml",
"flows/valids/switch.yaml"}, tenantId = "tenant1")
void subflow() throws IllegalVariableEvaluationException, IOException, FlowProcessingException {
FlowWithSource flow = this.parse("flows/valids/task-flow.yaml");
FlowWithSource flow = this.parse("flows/valids/task-flow.yaml", "tenant1");
FlowGraph flowGraph = GraphUtils.flowGraph(flow, null);
assertThat(flowGraph.getNodes().size()).isEqualTo(6);
@@ -293,15 +293,15 @@ class FlowGraphTest {
}
@Test
@LoadFlows({"flows/valids/task-flow-dynamic.yaml",
"flows/valids/switch.yaml"})
@LoadFlows(value = {"flows/valids/task-flow-dynamic.yaml",
"flows/valids/switch.yaml"}, tenantId = "tenant2")
void dynamicIdSubflow() throws IllegalVariableEvaluationException, TimeoutException, QueueException, IOException, FlowProcessingException {
FlowWithSource flow = this.parse("flows/valids/task-flow-dynamic.yaml").toBuilder().revision(1).build();
FlowWithSource flow = this.parse("flows/valids/task-flow-dynamic.yaml", "tenant2").toBuilder().revision(1).build();
IllegalArgumentException illegalArgumentException = Assertions.assertThrows(IllegalArgumentException.class, () -> graphService.flowGraph(flow, Collections.singletonList("root.launch")));
assertThat(illegalArgumentException.getMessage()).isEqualTo("Can't expand subflow task 'launch' because namespace and/or flowId contains dynamic values. This can only be viewed on an execution.");
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "task-flow-dynamic", 1, (f, e) -> Map.of(
Execution execution = runnerUtils.runOne("tenant2", "io.kestra.tests", "task-flow-dynamic", 1, (f, e) -> Map.of(
"namespace", f.getNamespace(),
"flowId", "switch"
));
@@ -373,13 +373,17 @@ class FlowGraphTest {
}
private FlowWithSource parse(String path) throws IOException {
return parse(path, MAIN_TENANT);
}
private FlowWithSource parse(String path, String tenantId) throws IOException {
URL resource = TestsUtils.class.getClassLoader().getResource(path);
assert resource != null;
File file = new File(resource.getFile());
return YamlParser.parse(file, FlowWithSource.class).toBuilder()
.tenantId(MAIN_TENANT)
.tenantId(tenantId)
.source(Files.readString(file.toPath()))
.build();
}

View File

@@ -4,6 +4,7 @@ import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.runners.*;
import io.kestra.core.storages.StorageContext;
import io.kestra.core.storages.StorageInterface;
import io.kestra.core.utils.IdUtils;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;
import org.mockito.Mockito;
@@ -84,8 +85,9 @@ class URIFetcherTest {
@Test
void shouldFetchFromNsfile() throws IOException {
URI uri = createNsFile(false);
RunContext runContext = runContextFactory.of(Map.of("flow", Map.of("namespace", "namespace")));
String namespace = IdUtils.create();
URI uri = createNsFile(namespace, false);
RunContext runContext = runContextFactory.of(Map.of("flow", Map.of("namespace", namespace)));
try (var fetch = URIFetcher.of(uri).fetch(runContext)) {
String fetchedContent = new String(fetch.readAllBytes());
@@ -95,7 +97,8 @@ class URIFetcherTest {
@Test
void shouldFetchFromNsfileFromOtherNs() throws IOException {
URI uri = createNsFile(true);
String namespace = IdUtils.create();
URI uri = createNsFile(namespace, true);
RunContext runContext = runContextFactory.of(Map.of("flow", Map.of("namespace", "other")));
try (var fetch = URIFetcher.of(uri).fetch(runContext)) {
@@ -139,8 +142,7 @@ class URIFetcherTest {
);
}
private URI createNsFile(boolean nsInAuthority) throws IOException {
String namespace = "namespace";
private URI createNsFile(String namespace, boolean nsInAuthority) throws IOException {
String filePath = "file.txt";
storage.createDirectory(MAIN_TENANT, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace)));
storage.put(MAIN_TENANT, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace) + "/" + filePath), new ByteArrayInputStream("Hello World".getBytes()));

View File

@@ -10,6 +10,7 @@ import io.kestra.core.models.tasks.Task;
import io.kestra.core.runners.RunContext;
import io.kestra.core.runners.RunContextFactory;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.utils.IdUtils;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;
@@ -33,17 +34,15 @@ class ScriptServiceTest {
@Test
void replaceInternalStorage() throws IOException {
var runContext = runContextFactory.of();
String tenant = IdUtils.create();
var runContext = runContextFactory.of("id", "namespace", tenant);
var command = ScriptService.replaceInternalStorage(runContext, null, false);
assertThat(command).isEqualTo("");
command = ScriptService.replaceInternalStorage(runContext, "my command", false);
assertThat(command).isEqualTo("my command");
Path path = Path.of("/tmp/unittest/main/file.txt");
if (!path.toFile().exists()) {
Files.createFile(path);
}
Path path = createFile(tenant, "file");
String internalStorageUri = "kestra://some/file.txt";
File localFile = null;
@@ -70,12 +69,10 @@ class ScriptServiceTest {
@Test
void replaceInternalStorageUnicode() throws IOException {
var runContext = runContextFactory.of();
String tenant = IdUtils.create();
var runContext = runContextFactory.of("id", "namespace", tenant);
Path path = Path.of("/tmp/unittest/main/file-龍.txt");
if (!path.toFile().exists()) {
Files.createFile(path);
}
Path path = createFile(tenant, "file-龍");
String internalStorageUri = "kestra://some/file-龍.txt";
File localFile = null;
@@ -95,12 +92,10 @@ class ScriptServiceTest {
@Test
void uploadInputFiles() throws IOException {
var runContext = runContextFactory.of();
String tenant = IdUtils.create();
var runContext = runContextFactory.of("id", "namespace", tenant);
Path path = Path.of("/tmp/unittest/main/file.txt");
if (!path.toFile().exists()) {
Files.createFile(path);
}
Path path = createFile(tenant, "file");
List<File> filesToDelete = new ArrayList<>();
String internalStorageUri = "kestra://some/file.txt";
@@ -143,13 +138,11 @@ class ScriptServiceTest {
@Test
void uploadOutputFiles() throws IOException {
var runContext = runContextFactory.of();
Path path = Path.of("/tmp/unittest/main/file.txt");
if (!path.toFile().exists()) {
Files.createFile(path);
}
String tenant = IdUtils.create();
var runContext = runContextFactory.of("id", "namespace", tenant);
Path path = createFile(tenant, "file");
var outputFiles = ScriptService.uploadOutputFiles(runContext, Path.of("/tmp/unittest/main"));
var outputFiles = ScriptService.uploadOutputFiles(runContext, Path.of("/tmp/unittest/%s".formatted(tenant)));
assertThat(outputFiles, not(anEmptyMap()));
assertThat(outputFiles.get("file.txt")).isEqualTo(URI.create("kestra:///file.txt"));
@@ -232,4 +225,13 @@ class ScriptServiceTest {
.build();
return runContextFactory.of(flow, task, execution, taskRun);
}
private static Path createFile(String tenant, String fileName) throws IOException {
Path path = Path.of("/tmp/unittest/%s/%s.txt".formatted(tenant, fileName));
if (!path.toFile().exists()) {
Files.createDirectories(Path.of("/tmp/unittest/%s".formatted(tenant)));
Files.createFile(path);
}
return path;
}
}

View File

@@ -3,6 +3,7 @@ package io.kestra.core.models.triggers.multipleflows;
import com.google.common.collect.ImmutableMap;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.property.Property;
import io.kestra.core.utils.TestsUtils;
import org.apache.commons.lang3.tuple.Pair;
import org.junit.jupiter.api.Test;
import io.kestra.plugin.core.condition.ExecutionFlow;
@@ -33,8 +34,9 @@ public abstract class AbstractMultipleConditionStorageTest {
@Test
void allDefault() {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().build());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
@@ -50,8 +52,9 @@ public abstract class AbstractMultipleConditionStorageTest {
@Test
void daily() {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().window(Duration.ofDays(1)).windowAdvance(Duration.ofSeconds(0)).build());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().window(Duration.ofDays(1)).windowAdvance(Duration.ofSeconds(0)).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
@@ -67,8 +70,9 @@ public abstract class AbstractMultipleConditionStorageTest {
@Test
void dailyAdvance() {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().window(Duration.ofDays(1)).windowAdvance(Duration.ofHours(4).negated()).build());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().window(Duration.ofDays(1)).windowAdvance(Duration.ofHours(4).negated()).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
@@ -84,8 +88,9 @@ public abstract class AbstractMultipleConditionStorageTest {
@Test
void hourly() {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().window(Duration.ofHours(1)).windowAdvance(Duration.ofHours(4).negated()).build());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().window(Duration.ofHours(1)).windowAdvance(Duration.ofHours(4).negated()).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
@@ -102,8 +107,9 @@ public abstract class AbstractMultipleConditionStorageTest {
@Test
void minutely() {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().window(Duration.ofMinutes(15)).windowAdvance(Duration.ofMinutes(5).negated()).build());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().window(Duration.ofMinutes(15)).windowAdvance(Duration.ofMinutes(5).negated()).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
@@ -115,8 +121,9 @@ public abstract class AbstractMultipleConditionStorageTest {
@Test
void expiration() throws Exception {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().window(Duration.ofSeconds(2)).windowAdvance(Duration.ofMinutes(0).negated()).build());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().window(Duration.ofSeconds(2)).windowAdvance(Duration.ofMinutes(0).negated()).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
this.save(multipleConditionStorage, pair.getLeft(), Collections.singletonList(window.with(ImmutableMap.of("a", true))));
@@ -136,8 +143,9 @@ public abstract class AbstractMultipleConditionStorageTest {
@Test
void expired() throws Exception {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().window(Duration.ofSeconds(2)).windowAdvance(Duration.ofMinutes(0).negated()).build());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().window(Duration.ofSeconds(2)).windowAdvance(Duration.ofMinutes(0).negated()).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
this.save(multipleConditionStorage, pair.getLeft(), Collections.singletonList(window.with(ImmutableMap.of("a", true))));
@@ -146,20 +154,21 @@ public abstract class AbstractMultipleConditionStorageTest {
assertThat(window.getResults().get("a")).isTrue();
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(null);
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(tenant);
assertThat(expired.size()).isZero();
Thread.sleep(2005);
expired = multipleConditionStorage.expired(null);
expired = multipleConditionStorage.expired(tenant);
assertThat(expired.size()).isEqualTo(1);
}
@Test
void dailyTimeDeadline() throws Exception {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().type(Type.DAILY_TIME_DEADLINE).deadline(LocalTime.now().plusSeconds(2)).build());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().type(Type.DAILY_TIME_DEADLINE).deadline(LocalTime.now().plusSeconds(2)).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
this.save(multipleConditionStorage, pair.getLeft(), Collections.singletonList(window.with(ImmutableMap.of("a", true))));
@@ -168,20 +177,21 @@ public abstract class AbstractMultipleConditionStorageTest {
assertThat(window.getResults().get("a")).isTrue();
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(null);
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(tenant);
assertThat(expired.size()).isZero();
Thread.sleep(2005);
expired = multipleConditionStorage.expired(null);
expired = multipleConditionStorage.expired(tenant);
assertThat(expired.size()).isEqualTo(1);
}
@Test
void dailyTimeDeadline_Expired() throws Exception {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().type(Type.DAILY_TIME_DEADLINE).deadline(LocalTime.now().minusSeconds(1)).build());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().type(Type.DAILY_TIME_DEADLINE).deadline(LocalTime.now().minusSeconds(1)).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
this.save(multipleConditionStorage, pair.getLeft(), Collections.singletonList(window.with(ImmutableMap.of("a", true))));
@@ -190,16 +200,17 @@ public abstract class AbstractMultipleConditionStorageTest {
assertThat(window.getResults()).isEmpty();
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(null);
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(tenant);
assertThat(expired.size()).isEqualTo(1);
}
@Test
void dailyTimeWindow() throws Exception {
void dailyTimeWindow() {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
LocalTime startTime = LocalTime.now().truncatedTo(ChronoUnit.MINUTES);
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().type(Type.DAILY_TIME_WINDOW).startTime(startTime).endTime(startTime.plusMinutes(5)).build());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().type(Type.DAILY_TIME_WINDOW).startTime(startTime).endTime(startTime.plusMinutes(5)).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
this.save(multipleConditionStorage, pair.getLeft(), Collections.singletonList(window.with(ImmutableMap.of("a", true))));
@@ -208,15 +219,16 @@ public abstract class AbstractMultipleConditionStorageTest {
assertThat(window.getResults().get("a")).isTrue();
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(null);
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(tenant);
assertThat(expired.size()).isZero();
}
@Test
void slidingWindow() throws Exception {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().type(Type.SLIDING_WINDOW).window(Duration.ofHours(1)).build());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().type(Type.SLIDING_WINDOW).window(Duration.ofHours(1)).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
this.save(multipleConditionStorage, pair.getLeft(), Collections.singletonList(window.with(ImmutableMap.of("a", true))));
@@ -225,13 +237,13 @@ public abstract class AbstractMultipleConditionStorageTest {
assertThat(window.getResults().get("a")).isTrue();
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(null);
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(tenant);
assertThat(expired.size()).isZero();
}
private static Pair<Flow, MultipleCondition> mockFlow(TimeWindow sla) {
private static Pair<Flow, MultipleCondition> mockFlow(String tenantId, TimeWindow sla) {
var multipleCondition = MultipleCondition.builder()
.id("condition-multiple")
.id("condition-multiple-%s".formatted(tenantId))
.conditions(ImmutableMap.of(
"flow-a", ExecutionFlow.builder()
.flowId(Property.ofValue("flow-a"))
@@ -248,6 +260,7 @@ public abstract class AbstractMultipleConditionStorageTest {
Flow flow = Flow.builder()
.namespace(NAMESPACE)
.id("multiple-flow")
.tenantId(tenantId)
.revision(1)
.triggers(Collections.singletonList(io.kestra.plugin.core.trigger.Flow.builder()
.id("trigger-flow")

View File

@@ -18,10 +18,10 @@ import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
class SystemInformationReportTest {
@Inject
private SystemInformationReport systemInformationReport;
@Test
void shouldGetReport() {
SystemInformationReport.SystemInformationEvent event = systemInformationReport.report(Instant.now());
@@ -32,34 +32,34 @@ class SystemInformationReportTest {
assertThat(event.host().getHardware().getLogicalProcessorCount()).isNotNull();
assertThat(event.host().getJvm().getName()).isNotNull();
assertThat(event.host().getOs().getFamily()).isNotNull();
assertThat(event.configurations().getRepositoryType()).isEqualTo("memory");
assertThat(event.configurations().getQueueType()).isEqualTo("memory");
assertThat(event.configurations().getRepositoryType()).isEqualTo("h2");
assertThat(event.configurations().getQueueType()).isEqualTo("h2");
}
@MockBean(SettingRepositoryInterface.class)
@Singleton
static class TestSettingRepository implements SettingRepositoryInterface {
public static Object UUID = null;
@Override
public Optional<Setting> findByKey(String key) {
return Optional.empty();
}
@Override
public List<Setting> findAll() {
return new ArrayList<>();
}
@Override
public Setting save(Setting setting) throws ConstraintViolationException {
if (setting.getKey().equals(Setting.INSTANCE_UUID)) {
UUID = setting.getValue();
}
return setting;
}
@Override
public Setting delete(Setting setting) {
return setting;

View File

@@ -25,6 +25,7 @@ import io.kestra.core.models.tasks.ResolvedTask;
import io.kestra.core.repositories.ExecutionRepositoryInterface.ChildFilter;
import io.kestra.core.utils.IdUtils;
import io.kestra.core.utils.NamespaceUtils;
import io.kestra.core.utils.TestsUtils;
import io.kestra.plugin.core.dashboard.data.Executions;
import io.kestra.plugin.core.debug.Return;
import io.micronaut.data.model.Pageable;
@@ -48,7 +49,6 @@ import java.util.stream.Collectors;
import java.util.stream.Stream;
import static io.kestra.core.models.flows.FlowScope.USER;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertThrows;
import static org.mockito.Mockito.doReturn;
@@ -62,17 +62,17 @@ public abstract class AbstractExecutionRepositoryTest {
@Inject
protected ExecutionRepositoryInterface executionRepository;
public static Execution.ExecutionBuilder builder(State.Type state, String flowId) {
return builder(state, flowId, NAMESPACE);
public static Execution.ExecutionBuilder builder(String tenantId, State.Type state, String flowId) {
return builder(tenantId, state, flowId, NAMESPACE);
}
public static Execution.ExecutionBuilder builder(State.Type state, String flowId, String namespace) {
public static Execution.ExecutionBuilder builder(String tenantId, State.Type state, String flowId, String namespace) {
State finalState = randomDuration(state);
Execution.ExecutionBuilder execution = Execution.builder()
.id(FriendlyId.createFriendlyId())
.namespace(namespace)
.tenantId(MAIN_TENANT)
.tenantId(tenantId)
.flowId(flowId == null ? FLOW : flowId)
.flowRevision(1)
.state(finalState);
@@ -126,11 +126,11 @@ public abstract class AbstractExecutionRepositoryTest {
return finalState;
}
protected void inject() {
inject(null);
protected void inject(String tenantId) {
inject(tenantId, null);
}
protected void inject(String executionTriggerId) {
protected void inject(String tenantId, String executionTriggerId) {
ExecutionTrigger executionTrigger = null;
if (executionTriggerId != null) {
@@ -139,7 +139,7 @@ public abstract class AbstractExecutionRepositoryTest {
.build();
}
executionRepository.save(builder(State.Type.RUNNING, null)
executionRepository.save(builder(tenantId, State.Type.RUNNING, null)
.labels(List.of(
new Label("key", "value"),
new Label("key2", "value2")
@@ -149,6 +149,7 @@ public abstract class AbstractExecutionRepositoryTest {
);
for (int i = 1; i < 28; i++) {
executionRepository.save(builder(
tenantId,
i < 5 ? State.Type.RUNNING : (i < 8 ? State.Type.FAILED : State.Type.SUCCESS),
i < 15 ? null : "second"
).trigger(executionTrigger).build());
@@ -156,6 +157,7 @@ public abstract class AbstractExecutionRepositoryTest {
// add a test execution, this should be ignored in search & statistics
executionRepository.save(builder(
tenantId,
State.Type.SUCCESS,
null
)
@@ -167,9 +169,10 @@ public abstract class AbstractExecutionRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_all(QueryFilter filter, int expectedSize){
inject("executionTriggerId");
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant, "executionTriggerId");
ArrayListTotal<Execution> entries = executionRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter));
ArrayListTotal<Execution> entries = executionRepository.find(Pageable.UNPAGED, tenant, List.of(filter));
assertThat(entries).hasSize(expectedSize);
}
@@ -192,7 +195,8 @@ public abstract class AbstractExecutionRepositoryTest {
@ParameterizedTest
@MethodSource("errorFilterCombinations")
void should_fail_to_find_all(QueryFilter filter){
assertThrows(InvalidQueryFiltersException.class, () -> executionRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter)));
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
assertThrows(InvalidQueryFiltersException.class, () -> executionRepository.find(Pageable.UNPAGED, tenant, List.of(filter)));
}
static Stream<QueryFilter> errorFilterCombinations() {
@@ -208,9 +212,10 @@ public abstract class AbstractExecutionRepositoryTest {
@Test
protected void find() {
inject();
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant);
ArrayListTotal<Execution> executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, null);
ArrayListTotal<Execution> executions = executionRepository.find(Pageable.from(1, 10), tenant, null);
assertThat(executions.getTotal()).isEqualTo(28L);
assertThat(executions.size()).isEqualTo(10);
@@ -219,7 +224,7 @@ public abstract class AbstractExecutionRepositoryTest {
.operation(QueryFilter.Op.EQUALS)
.value( List.of(State.Type.RUNNING, State.Type.FAILED))
.build());
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
assertThat(executions.getTotal()).isEqualTo(8L);
filters = List.of(QueryFilter.builder()
@@ -227,7 +232,7 @@ public abstract class AbstractExecutionRepositoryTest {
.operation(QueryFilter.Op.EQUALS)
.value(Map.of("key", "value"))
.build());
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
assertThat(executions.getTotal()).isEqualTo(1L);
filters = List.of(QueryFilter.builder()
@@ -235,7 +240,7 @@ public abstract class AbstractExecutionRepositoryTest {
.operation(QueryFilter.Op.EQUALS)
.value(Map.of("key", "value2"))
.build());
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
assertThat(executions.getTotal()).isEqualTo(0L);
filters = List.of(QueryFilter.builder()
@@ -244,7 +249,7 @@ public abstract class AbstractExecutionRepositoryTest {
.value(Map.of("key", "value", "keyTest", "valueTest"))
.build()
);
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
assertThat(executions.getTotal()).isEqualTo(0L);
filters = List.of(QueryFilter.builder()
@@ -252,7 +257,7 @@ public abstract class AbstractExecutionRepositoryTest {
.operation(QueryFilter.Op.EQUALS)
.value("second")
.build());
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
assertThat(executions.getTotal()).isEqualTo(13L);
filters = List.of(QueryFilter.builder()
@@ -266,7 +271,7 @@ public abstract class AbstractExecutionRepositoryTest {
.value(NAMESPACE)
.build()
);
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
assertThat(executions.getTotal()).isEqualTo(13L);
filters = List.of(QueryFilter.builder()
@@ -274,7 +279,7 @@ public abstract class AbstractExecutionRepositoryTest {
.operation(QueryFilter.Op.STARTS_WITH)
.value("io.kestra")
.build());
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
assertThat(executions.getTotal()).isEqualTo(28L);
}
@@ -282,15 +287,16 @@ public abstract class AbstractExecutionRepositoryTest {
protected void findTriggerExecutionId() {
String executionTriggerId = IdUtils.create();
inject(executionTriggerId);
inject();
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant, executionTriggerId);
inject(tenant);
var filters = List.of(QueryFilter.builder()
.field(QueryFilter.Field.TRIGGER_EXECUTION_ID)
.operation(QueryFilter.Op.EQUALS)
.value(executionTriggerId)
.build());
ArrayListTotal<Execution> executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
ArrayListTotal<Execution> executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
assertThat(executions.getTotal()).isEqualTo(28L);
assertThat(executions.size()).isEqualTo(10);
assertThat(executions.getFirst().getTrigger().getVariables().get("executionId")).isEqualTo(executionTriggerId);
@@ -300,7 +306,7 @@ public abstract class AbstractExecutionRepositoryTest {
.value(ExecutionRepositoryInterface.ChildFilter.CHILD)
.build());
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
assertThat(executions.getTotal()).isEqualTo(28L);
assertThat(executions.size()).isEqualTo(10);
assertThat(executions.getFirst().getTrigger().getVariables().get("executionId")).isEqualTo(executionTriggerId);
@@ -311,20 +317,21 @@ public abstract class AbstractExecutionRepositoryTest {
.value(ExecutionRepositoryInterface.ChildFilter.MAIN)
.build());
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters );
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters );
assertThat(executions.getTotal()).isEqualTo(28L);
assertThat(executions.size()).isEqualTo(10);
assertThat(executions.getFirst().getTrigger()).isNull();
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, null);
executions = executionRepository.find(Pageable.from(1, 10), tenant, null);
assertThat(executions.getTotal()).isEqualTo(56L);
}
@Test
protected void findWithSort() {
inject();
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant);
ArrayListTotal<Execution> executions = executionRepository.find(Pageable.from(1, 10, Sort.of(Sort.Order.desc("id"))), MAIN_TENANT, null);
ArrayListTotal<Execution> executions = executionRepository.find(Pageable.from(1, 10, Sort.of(Sort.Order.desc("id"))), tenant, null);
assertThat(executions.getTotal()).isEqualTo(28L);
assertThat(executions.size()).isEqualTo(10);
@@ -333,15 +340,16 @@ public abstract class AbstractExecutionRepositoryTest {
.operation(QueryFilter.Op.EQUALS)
.value(List.of(State.Type.RUNNING, State.Type.FAILED))
.build());
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
assertThat(executions.getTotal()).isEqualTo(8L);
}
@Test
protected void findTaskRun() {
inject();
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant);
ArrayListTotal<TaskRun> taskRuns = executionRepository.findTaskRun(Pageable.from(1, 10), MAIN_TENANT, null);
ArrayListTotal<TaskRun> taskRuns = executionRepository.findTaskRun(Pageable.from(1, 10), tenant, null);
assertThat(taskRuns.getTotal()).isEqualTo(74L);
assertThat(taskRuns.size()).isEqualTo(10);
@@ -351,7 +359,7 @@ public abstract class AbstractExecutionRepositoryTest {
.value(Map.of("key", "value"))
.build());
taskRuns = executionRepository.findTaskRun(Pageable.from(1, 10), MAIN_TENANT, filters);
taskRuns = executionRepository.findTaskRun(Pageable.from(1, 10), tenant, filters);
assertThat(taskRuns.getTotal()).isEqualTo(1L);
assertThat(taskRuns.size()).isEqualTo(1);
}
@@ -359,74 +367,86 @@ public abstract class AbstractExecutionRepositoryTest {
@Test
protected void findById() {
executionRepository.save(ExecutionFixture.EXECUTION_1);
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
var execution1 = ExecutionFixture.EXECUTION_1(tenant);
executionRepository.save(execution1);
Optional<Execution> full = executionRepository.findById(MAIN_TENANT, ExecutionFixture.EXECUTION_1.getId());
Optional<Execution> full = executionRepository.findById(tenant, execution1.getId());
assertThat(full.isPresent()).isTrue();
full.ifPresent(current -> {
assertThat(full.get().getId()).isEqualTo(ExecutionFixture.EXECUTION_1.getId());
assertThat(full.get().getId()).isEqualTo(execution1.getId());
});
}
@Test
protected void shouldFindByIdTestExecution() {
executionRepository.save(ExecutionFixture.EXECUTION_TEST);
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
var executionTest = ExecutionFixture.EXECUTION_TEST(tenant);
executionRepository.save(executionTest);
Optional<Execution> full = executionRepository.findById(null, ExecutionFixture.EXECUTION_TEST.getId());
Optional<Execution> full = executionRepository.findById(tenant, executionTest.getId());
assertThat(full.isPresent()).isTrue();
full.ifPresent(current -> {
assertThat(full.get().getId()).isEqualTo(ExecutionFixture.EXECUTION_TEST.getId());
assertThat(full.get().getId()).isEqualTo(executionTest.getId());
});
}
@Test
protected void purge() {
executionRepository.save(ExecutionFixture.EXECUTION_1);
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
var execution1 = ExecutionFixture.EXECUTION_1(tenant);
executionRepository.save(execution1);
Optional<Execution> full = executionRepository.findById(MAIN_TENANT, ExecutionFixture.EXECUTION_1.getId());
Optional<Execution> full = executionRepository.findById(tenant, execution1.getId());
assertThat(full.isPresent()).isTrue();
executionRepository.purge(ExecutionFixture.EXECUTION_1);
executionRepository.purge(execution1);
full = executionRepository.findById(null, ExecutionFixture.EXECUTION_1.getId());
full = executionRepository.findById(tenant, execution1.getId());
assertThat(full.isPresent()).isFalse();
}
@Test
protected void delete() {
executionRepository.save(ExecutionFixture.EXECUTION_1);
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
var execution1 = ExecutionFixture.EXECUTION_1(tenant);
executionRepository.save(execution1);
Optional<Execution> full = executionRepository.findById(MAIN_TENANT, ExecutionFixture.EXECUTION_1.getId());
Optional<Execution> full = executionRepository.findById(tenant, execution1.getId());
assertThat(full.isPresent()).isTrue();
executionRepository.delete(ExecutionFixture.EXECUTION_1);
executionRepository.delete(execution1);
full = executionRepository.findById(MAIN_TENANT, ExecutionFixture.EXECUTION_1.getId());
full = executionRepository.findById(tenant, execution1.getId());
assertThat(full.isPresent()).isFalse();
}
@Test
protected void mappingConflict() {
executionRepository.save(ExecutionFixture.EXECUTION_2);
executionRepository.save(ExecutionFixture.EXECUTION_1);
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
executionRepository.save(ExecutionFixture.EXECUTION_2(tenant));
executionRepository.save(ExecutionFixture.EXECUTION_1(tenant));
ArrayListTotal<Execution> page1 = executionRepository.findByFlowId(MAIN_TENANT, NAMESPACE, FLOW, Pageable.from(1, 10));
ArrayListTotal<Execution> page1 = executionRepository.findByFlowId(tenant, NAMESPACE, FLOW, Pageable.from(1, 10));
assertThat(page1.size()).isEqualTo(2);
}
@Test
protected void dailyStatistics() throws InterruptedException {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
for (int i = 0; i < 28; i++) {
executionRepository.save(builder(
tenant,
i < 5 ? State.Type.RUNNING : (i < 8 ? State.Type.FAILED : State.Type.SUCCESS),
i < 15 ? null : "second"
).build());
}
executionRepository.save(builder(
tenant,
State.Type.SUCCESS,
"second"
).namespace(NamespaceUtils.SYSTEM_FLOWS_DEFAULT_NAMESPACE).build());
@@ -436,7 +456,7 @@ public abstract class AbstractExecutionRepositoryTest {
List<DailyExecutionStatistics> result = executionRepository.dailyStatistics(
null,
MAIN_TENANT,
tenant,
null,
null,
null,
@@ -456,7 +476,7 @@ public abstract class AbstractExecutionRepositoryTest {
result = executionRepository.dailyStatistics(
null,
MAIN_TENANT,
tenant,
List.of(FlowScope.USER, FlowScope.SYSTEM),
null,
null,
@@ -471,7 +491,7 @@ public abstract class AbstractExecutionRepositoryTest {
result = executionRepository.dailyStatistics(
null,
MAIN_TENANT,
tenant,
List.of(FlowScope.USER),
null,
null,
@@ -485,7 +505,7 @@ public abstract class AbstractExecutionRepositoryTest {
result = executionRepository.dailyStatistics(
null,
MAIN_TENANT,
tenant,
List.of(FlowScope.SYSTEM),
null,
null,
@@ -500,21 +520,24 @@ public abstract class AbstractExecutionRepositoryTest {
@Test
protected void taskRunsDailyStatistics() {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
for (int i = 0; i < 28; i++) {
executionRepository.save(builder(
tenant,
i < 5 ? State.Type.RUNNING : (i < 8 ? State.Type.FAILED : State.Type.SUCCESS),
i < 15 ? null : "second"
).build());
}
executionRepository.save(builder(
tenant,
State.Type.SUCCESS,
"second"
).namespace(NamespaceUtils.SYSTEM_FLOWS_DEFAULT_NAMESPACE).build());
List<DailyExecutionStatistics> result = executionRepository.dailyStatistics(
null,
MAIN_TENANT,
tenant,
null,
null,
null,
@@ -534,7 +557,7 @@ public abstract class AbstractExecutionRepositoryTest {
result = executionRepository.dailyStatistics(
null,
MAIN_TENANT,
tenant,
List.of(FlowScope.USER, FlowScope.SYSTEM),
null,
null,
@@ -549,7 +572,7 @@ public abstract class AbstractExecutionRepositoryTest {
result = executionRepository.dailyStatistics(
null,
MAIN_TENANT,
tenant,
List.of(FlowScope.USER),
null,
null,
@@ -563,7 +586,7 @@ public abstract class AbstractExecutionRepositoryTest {
result = executionRepository.dailyStatistics(
null,
MAIN_TENANT,
tenant,
List.of(FlowScope.SYSTEM),
null,
null,
@@ -579,8 +602,10 @@ public abstract class AbstractExecutionRepositoryTest {
@SuppressWarnings("OptionalGetWithoutIsPresent")
@Test
protected void executionsCount() throws InterruptedException {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
for (int i = 0; i < 14; i++) {
executionRepository.save(builder(
tenant,
State.Type.SUCCESS,
i < 2 ? "first" : (i < 5 ? "second" : "third")
).build());
@@ -590,7 +615,7 @@ public abstract class AbstractExecutionRepositoryTest {
Thread.sleep(500);
List<ExecutionCount> result = executionRepository.executionCounts(
MAIN_TENANT,
tenant,
List.of(
new Flow(NAMESPACE, "first"),
new Flow(NAMESPACE, "second"),
@@ -609,7 +634,7 @@ public abstract class AbstractExecutionRepositoryTest {
assertThat(result.stream().filter(executionCount -> executionCount.getFlowId().equals("missing")).findFirst().get().getCount()).isEqualTo(0L);
result = executionRepository.executionCounts(
MAIN_TENANT,
tenant,
List.of(
new Flow(NAMESPACE, "first"),
new Flow(NAMESPACE, "second"),
@@ -626,7 +651,7 @@ public abstract class AbstractExecutionRepositoryTest {
assertThat(result.stream().filter(executionCount -> executionCount.getFlowId().equals("third")).findFirst().get().getCount()).isEqualTo(9L);
result = executionRepository.executionCounts(
MAIN_TENANT,
tenant,
null,
null,
null,
@@ -639,14 +664,15 @@ public abstract class AbstractExecutionRepositoryTest {
@Test
protected void update() {
Execution execution = ExecutionFixture.EXECUTION_1;
executionRepository.save(ExecutionFixture.EXECUTION_1);
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Execution execution = ExecutionFixture.EXECUTION_1(tenant);
executionRepository.save(execution);
Label label = new Label("key", "value");
Execution updated = execution.toBuilder().labels(List.of(label)).build();
executionRepository.update(updated);
Optional<Execution> validation = executionRepository.findById(MAIN_TENANT, updated.getId());
Optional<Execution> validation = executionRepository.findById(tenant, updated.getId());
assertThat(validation.isPresent()).isTrue();
assertThat(validation.get().getLabels().size()).isEqualTo(1);
assertThat(validation.get().getLabels().getFirst()).isEqualTo(label);
@@ -654,13 +680,14 @@ public abstract class AbstractExecutionRepositoryTest {
@Test
void shouldFindLatestExecutionGivenState() {
Execution earliest = buildWithCreatedDate(Instant.now().minus(Duration.ofMinutes(10)));
Execution latest = buildWithCreatedDate(Instant.now().minus(Duration.ofMinutes(5)));
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Execution earliest = buildWithCreatedDate(tenant, Instant.now().minus(Duration.ofMinutes(10)));
Execution latest = buildWithCreatedDate(tenant, Instant.now().minus(Duration.ofMinutes(5)));
executionRepository.save(earliest);
executionRepository.save(latest);
Optional<Execution> result = executionRepository.findLatestForStates(MAIN_TENANT, "io.kestra.unittest", "full", List.of(State.Type.CREATED));
Optional<Execution> result = executionRepository.findLatestForStates(tenant, "io.kestra.unittest", "full", List.of(State.Type.CREATED));
assertThat(result.isPresent()).isTrue();
assertThat(result.get().getId()).isEqualTo(latest.getId());
}
@@ -700,11 +727,11 @@ public abstract class AbstractExecutionRepositoryTest {
assertThat(data.get(0).get("date")).isEqualTo(DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSXXX").format(ZonedDateTime.ofInstant(startDate, ZoneId.systemDefault()).withSecond(0).withNano(0)));
}
private static Execution buildWithCreatedDate(Instant instant) {
private static Execution buildWithCreatedDate(String tenant, Instant instant) {
return Execution.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.tenantId(MAIN_TENANT)
.tenantId(tenant)
.flowId("full")
.flowRevision(1)
.state(new State(State.Type.CREATED, List.of(new State.History(State.Type.CREATED, instant))))
@@ -715,22 +742,24 @@ public abstract class AbstractExecutionRepositoryTest {
@Test
protected void findAllAsync() {
inject();
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant);
List<Execution> executions = executionRepository.findAllAsync(MAIN_TENANT).collectList().block();
List<Execution> executions = executionRepository.findAllAsync(tenant).collectList().block();
assertThat(executions).hasSize(29); // findAllAsync is used by the backup, so the result also contains TEST executions
}
@Test
protected void shouldFindByLabel() {
inject();
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant);
List<QueryFilter> filters = List.of(QueryFilter.builder()
.field(QueryFilter.Field.LABELS)
.operation(QueryFilter.Op.EQUALS)
.value(Map.of("key", "value"))
.build());
List<Execution> executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
List<Execution> executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
assertThat(executions.size()).isEqualTo(1L);
// Filtering by two pairs of labels: since this is now an AND behavior, it should not return anything
@@ -739,15 +768,16 @@ public abstract class AbstractExecutionRepositoryTest {
.operation(QueryFilter.Op.EQUALS)
.value(Map.of("key", "value", "keyother", "valueother"))
.build());
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
assertThat(executions.size()).isEqualTo(0L);
}
@Test
protected void shouldReturnLastExecutionsWhenInputsAreNull() {
inject();
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant);
List<Execution> lastExecutions = executionRepository.lastExecutions(MAIN_TENANT, null);
List<Execution> lastExecutions = executionRepository.lastExecutions(tenant, null);
assertThat(lastExecutions).isNotEmpty();
Set<String> flowIds = lastExecutions.stream().map(Execution::getFlowId).collect(Collectors.toSet());
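The changes above replace the shared MAIN_TENANT with a random tenant per test, so counts such as 28, 56, or 74 only cover data injected by the test itself. The TestsUtils.randomTenant helper is not part of this diff, so the following is only a minimal sketch of what such a helper could look like, under a hypothetical class name:

// Hypothetical sketch; the real TestsUtils.randomTenant(...) may be implemented differently.
import java.util.Locale;
import java.util.UUID;

public final class TenantTestSupport {
    private TenantTestSupport() {
    }

    // Builds a tenant id unique to this test-class run, e.g. "abstractexecutionrepositorytest-1f2e3d4c".
    public static String randomTenant(String prefix) {
        String suffix = UUID.randomUUID().toString().replace("-", "").substring(0, 8);
        return (prefix + "-" + suffix).toLowerCase(Locale.ROOT);
    }
}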

View File

@@ -1,7 +1,6 @@
package io.kestra.core.repositories;
import com.google.common.collect.ImmutableMap;
import io.kestra.core.Helpers;
import io.kestra.core.events.CrudEvent;
import io.kestra.core.events.CrudEventType;
import io.kestra.core.exceptions.InvalidQueryFiltersException;
@@ -10,7 +9,6 @@ import io.kestra.core.models.Label;
import io.kestra.core.models.QueryFilter;
import io.kestra.core.models.QueryFilter.Field;
import io.kestra.core.models.QueryFilter.Op;
import io.kestra.core.models.SearchResult;
import io.kestra.core.models.conditions.ConditionContext;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.executions.ExecutionTrigger;
@@ -20,7 +18,6 @@ import io.kestra.core.models.property.Property;
import io.kestra.core.models.triggers.AbstractTrigger;
import io.kestra.core.models.triggers.PollingTriggerInterface;
import io.kestra.core.models.triggers.TriggerContext;
import io.kestra.core.queues.QueueException;
import io.kestra.core.repositories.ExecutionRepositoryInterface.ChildFilter;
import io.kestra.core.services.FlowService;
import io.kestra.core.utils.Await;
@@ -29,22 +26,19 @@ import io.kestra.core.utils.TestsUtils;
import io.kestra.plugin.core.debug.Return;
import io.micronaut.context.event.ApplicationEventListener;
import io.micronaut.data.model.Pageable;
import io.micronaut.data.model.Sort;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;
import jakarta.validation.ConstraintViolationException;
import java.util.concurrent.CopyOnWriteArrayList;
import lombok.*;
import lombok.experimental.SuperBuilder;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestInstance;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.MethodSource;
import org.slf4j.event.Level;
import java.io.IOException;
import java.net.URISyntaxException;
import java.time.Duration;
import java.time.ZonedDateTime;
import java.util.*;
@@ -52,16 +46,12 @@ import java.util.concurrent.TimeoutException;
import java.util.stream.Stream;
import static io.kestra.core.models.flows.FlowScope.SYSTEM;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static io.kestra.core.utils.NamespaceUtils.SYSTEM_FLOWS_DEFAULT_NAMESPACE;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertThrows;
// If some counts are wrong in this test it means that one of the tests is not properly deleting what it created
@KestraTest
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
public abstract class AbstractFlowRepositoryTest {
public static final String TEST_TENANT_ID = "tenant";
public static final String TEST_NAMESPACE = "io.kestra.unittest";
public static final String TEST_FLOW_ID = "test";
@Inject
@@ -70,21 +60,18 @@ public abstract class AbstractFlowRepositoryTest {
@Inject
protected ExecutionRepositoryInterface executionRepository;
@Inject
private LocalFlowRepositoryLoader repositoryLoader;
@BeforeEach
protected void init() throws IOException, URISyntaxException {
TestsUtils.loads(MAIN_TENANT, repositoryLoader);
@BeforeAll
protected static void init() {
FlowListener.reset();
}
private static FlowWithSource.FlowWithSourceBuilder<?, ?> builder() {
return builder(IdUtils.create(), TEST_FLOW_ID);
private static FlowWithSource.FlowWithSourceBuilder<?, ?> builder(String tenantId) {
return builder(tenantId, IdUtils.create(), TEST_FLOW_ID);
}
private static FlowWithSource.FlowWithSourceBuilder<?, ?> builder(String flowId, String taskId) {
private static FlowWithSource.FlowWithSourceBuilder<?, ?> builder(String tenantId, String flowId, String taskId) {
return FlowWithSource.builder()
.tenantId(tenantId)
.id(flowId)
.namespace(TEST_NAMESPACE)
.tasks(Collections.singletonList(Return.builder().id(taskId).type(Return.class.getName()).format(Property.ofValue(TEST_FLOW_ID)).build()));
@@ -93,16 +80,16 @@ public abstract class AbstractFlowRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_all(QueryFilter filter){
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource flow = FlowWithSource.builder()
.id("filterFlowId")
.namespace(SYSTEM_FLOWS_DEFAULT_NAMESPACE)
.tenantId(MAIN_TENANT)
.tenantId(tenant)
.labels(Label.from(Map.of("key", "value")))
.build();
flow = flowRepository.create(GenericFlow.of(flow));
try {
ArrayListTotal<Flow> entries = flowRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter));
ArrayListTotal<Flow> entries = flowRepository.find(Pageable.UNPAGED, tenant, List.of(filter));
assertThat(entries).hasSize(1);
} finally {
@@ -113,16 +100,16 @@ public abstract class AbstractFlowRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_all_with_source(QueryFilter filter){
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource flow = FlowWithSource.builder()
.id("filterFlowId")
.namespace(SYSTEM_FLOWS_DEFAULT_NAMESPACE)
.tenantId(MAIN_TENANT)
.tenantId(tenant)
.labels(Label.from(Map.of("key", "value")))
.build();
flow = flowRepository.create(GenericFlow.of(flow));
try {
ArrayListTotal<FlowWithSource> entries = flowRepository.findWithSource(Pageable.UNPAGED, MAIN_TENANT, List.of(filter));
ArrayListTotal<FlowWithSource> entries = flowRepository.findWithSource(Pageable.UNPAGED, tenant, List.of(filter));
assertThat(entries).hasSize(1);
} finally {
@@ -144,7 +131,7 @@ public abstract class AbstractFlowRepositoryTest {
void should_fail_to_find_all(QueryFilter filter){
assertThrows(
InvalidQueryFiltersException.class,
() -> flowRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter)));
() -> flowRepository.find(Pageable.UNPAGED, TestsUtils.randomTenant(this.getClass().getSimpleName()), List.of(filter)));
}
@@ -153,7 +140,7 @@ public abstract class AbstractFlowRepositoryTest {
void should_fail_to_find_all_with_source(QueryFilter filter){
assertThrows(
InvalidQueryFiltersException.class,
() -> flowRepository.findWithSource(Pageable.UNPAGED, MAIN_TENANT, List.of(filter)));
() -> flowRepository.findWithSource(Pageable.UNPAGED, TestsUtils.randomTenant(this.getClass().getSimpleName()), List.of(filter)));
}
@@ -176,17 +163,17 @@ public abstract class AbstractFlowRepositoryTest {
@Test
void findById() {
FlowWithSource flow = builder()
.tenantId(MAIN_TENANT)
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource flow = builder(tenant)
.revision(3)
.build();
flow = flowRepository.create(GenericFlow.of(flow));
try {
Optional<Flow> full = flowRepository.findById(MAIN_TENANT, flow.getNamespace(), flow.getId());
Optional<Flow> full = flowRepository.findById(tenant, flow.getNamespace(), flow.getId());
assertThat(full.isPresent()).isTrue();
assertThat(full.get().getRevision()).isEqualTo(1);
full = flowRepository.findById(MAIN_TENANT, flow.getNamespace(), flow.getId(), Optional.empty());
full = flowRepository.findById(tenant, flow.getNamespace(), flow.getId(), Optional.empty());
assertThat(full.isPresent()).isTrue();
} finally {
deleteFlow(flow);
@@ -195,17 +182,18 @@ public abstract class AbstractFlowRepositoryTest {
@Test
void findByIdWithoutAcl() {
FlowWithSource flow = builder()
.tenantId(MAIN_TENANT)
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource flow = builder(tenant)
.tenantId(tenant)
.revision(3)
.build();
flow = flowRepository.create(GenericFlow.of(flow));
try {
Optional<Flow> full = flowRepository.findByIdWithoutAcl(MAIN_TENANT, flow.getNamespace(), flow.getId(), Optional.empty());
Optional<Flow> full = flowRepository.findByIdWithoutAcl(tenant, flow.getNamespace(), flow.getId(), Optional.empty());
assertThat(full.isPresent()).isTrue();
assertThat(full.get().getRevision()).isEqualTo(1);
full = flowRepository.findByIdWithoutAcl(MAIN_TENANT, flow.getNamespace(), flow.getId(), Optional.empty());
full = flowRepository.findByIdWithoutAcl(tenant, flow.getNamespace(), flow.getId(), Optional.empty());
assertThat(full.isPresent()).isTrue();
} finally {
deleteFlow(flow);
@@ -214,15 +202,16 @@ public abstract class AbstractFlowRepositoryTest {
@Test
void findByIdWithSource() {
FlowWithSource flow = builder()
.tenantId(MAIN_TENANT)
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource flow = builder(tenant)
.tenantId(tenant)
.revision(3)
.build();
String source = "# comment\n" + flow.sourceOrGenerateIfNull();
flow = flowRepository.create(GenericFlow.fromYaml(MAIN_TENANT, source));
flow = flowRepository.create(GenericFlow.fromYaml(tenant, source));
try {
Optional<FlowWithSource> full = flowRepository.findByIdWithSource(MAIN_TENANT, flow.getNamespace(), flow.getId());
Optional<FlowWithSource> full = flowRepository.findByIdWithSource(tenant, flow.getNamespace(), flow.getId());
assertThat(full.isPresent()).isTrue();
full.ifPresent(current -> {
@@ -237,7 +226,8 @@ public abstract class AbstractFlowRepositoryTest {
@Test
void save() {
FlowWithSource flow = builder().revision(12).build();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource flow = builder(tenant).revision(12).build();
FlowWithSource save = flowRepository.create(GenericFlow.of(flow));
try {
@@ -249,7 +239,8 @@ public abstract class AbstractFlowRepositoryTest {
@Test
void saveNoRevision() {
FlowWithSource flow = builder().build();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource flow = builder(tenant).build();
FlowWithSource save = flowRepository.create(GenericFlow.of(flow));
try {
@@ -260,68 +251,17 @@ public abstract class AbstractFlowRepositoryTest {
}
@Test
void findAll() {
List<Flow> save = flowRepository.findAll(MAIN_TENANT);
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void findAllWithSource() {
List<FlowWithSource> save = flowRepository.findAllWithSource(MAIN_TENANT);
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void findAllForAllTenants() {
List<Flow> save = flowRepository.findAllForAllTenants();
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void findAllWithSourceForAllTenants() {
List<FlowWithSource> save = flowRepository.findAllWithSourceForAllTenants();
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void findByNamespace() {
List<Flow> save = flowRepository.findByNamespace(MAIN_TENANT, "io.kestra.tests");
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT - 24);
save = flowRepository.findByNamespace(MAIN_TENANT, "io.kestra.tests2");
assertThat((long) save.size()).isEqualTo(1L);
save = flowRepository.findByNamespace(MAIN_TENANT, "io.kestra.tests.minimal.bis");
assertThat((long) save.size()).isEqualTo(1L);
}
@Test
void findByNamespacePrefix() {
List<Flow> save = flowRepository.findByNamespacePrefix(MAIN_TENANT, "io.kestra.tests");
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT - 1);
save = flowRepository.findByNamespace(MAIN_TENANT, "io.kestra.tests2");
assertThat((long) save.size()).isEqualTo(1L);
save = flowRepository.findByNamespace(MAIN_TENANT, "io.kestra.tests.minimal.bis");
assertThat((long) save.size()).isEqualTo(1L);
}
@Test
void findByNamespaceWithSource() {
Flow flow = builder()
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Flow flow = builder(tenant)
.revision(3)
.build();
String flowSource = "# comment\n" + flow.sourceOrGenerateIfNull();
flow = flowRepository.create(GenericFlow.fromYaml(MAIN_TENANT, flowSource));
flow = flowRepository.create(GenericFlow.fromYaml(tenant, flowSource));
try {
List<FlowWithSource> save = flowRepository.findByNamespaceWithSource(MAIN_TENANT, flow.getNamespace());
List<FlowWithSource> save = flowRepository.findByNamespaceWithSource(tenant, flow.getNamespace());
assertThat((long) save.size()).isEqualTo(1L);
assertThat(save.getFirst().getSource()).isEqualTo(FlowService.cleanupSource(flowSource));
@@ -330,175 +270,15 @@ public abstract class AbstractFlowRepositoryTest {
}
}
@Test
void findByNamespacePrefixWithSource() {
List<FlowWithSource> save = flowRepository.findByNamespacePrefixWithSource(MAIN_TENANT, "io.kestra.tests");
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT - 1);
}
@Test
void find_paginationPartial() {
assertThat(flowRepository.find(Pageable.from(1, (int) Helpers.FLOWS_COUNT - 1, Sort.UNSORTED), MAIN_TENANT, null)
.size())
.describedAs("When paginating at MAX-1, it should return MAX-1")
.isEqualTo(Helpers.FLOWS_COUNT - 1);
assertThat(flowRepository.findWithSource(Pageable.from(1, (int) Helpers.FLOWS_COUNT - 1, Sort.UNSORTED), MAIN_TENANT, null)
.size())
.describedAs("When paginating at MAX-1, it should return MAX-1")
.isEqualTo(Helpers.FLOWS_COUNT - 1);
}
@Test
void find_paginationGreaterThanExisting() {
assertThat(flowRepository.find(Pageable.from(1, (int) Helpers.FLOWS_COUNT + 1, Sort.UNSORTED), MAIN_TENANT, null)
.size())
.describedAs("When paginating requesting a larger amount than existing, it should return existing MAX")
.isEqualTo(Helpers.FLOWS_COUNT);
assertThat(flowRepository.findWithSource(Pageable.from(1, (int) Helpers.FLOWS_COUNT + 1, Sort.UNSORTED), MAIN_TENANT, null)
.size())
.describedAs("When paginating requesting a larger amount than existing, it should return existing MAX")
.isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void find_prefixMatchingAllNamespaces() {
assertThat(flowRepository.find(
Pageable.UNPAGED,
MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.STARTS_WITH).value("io.kestra.tests").build()
)
).size())
.describedAs("When filtering on NAMESPACE START_WITH a pattern that match all, it should return all")
.isEqualTo(Helpers.FLOWS_COUNT);
assertThat(flowRepository.findWithSource(
Pageable.UNPAGED,
MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.STARTS_WITH).value("io.kestra.tests").build()
)
).size())
.describedAs("When filtering on NAMESPACE START_WITH a pattern that match all, it should return all")
.isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void find_aSpecifiedNamespace() {
assertThat(flowRepository.find(
Pageable.UNPAGED,
MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests2").build()
)
).size()).isEqualTo(1L);
assertThat(flowRepository.findWithSource(
Pageable.UNPAGED,
MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests2").build()
)
).size()).isEqualTo(1L);
}
@Test
void find_aSpecificSubNamespace() {
assertThat(flowRepository.find(
Pageable.UNPAGED,
MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests.minimal.bis").build()
)
).size())
.isEqualTo(1L);
assertThat(flowRepository.findWithSource(
Pageable.UNPAGED,
MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests.minimal.bis").build()
)
).size())
.isEqualTo(1L);
}
@Test
void find_aSpecificLabel() {
assertThat(
flowRepository.find(Pageable.UNPAGED, MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("country", "FR")).build()
)
).size())
.isEqualTo(1);
assertThat(
flowRepository.findWithSource(Pageable.UNPAGED, MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("country", "FR")).build()
)
).size())
.isEqualTo(1);
}
@Test
void find_aSpecificFlowByNamespaceAndLabel() {
assertThat(
flowRepository.find(Pageable.UNPAGED, MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests").build(),
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("key2", "value2")).build()
)
).size())
.isEqualTo(1);
assertThat(
flowRepository.findWithSource(Pageable.UNPAGED, MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests").build(),
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("key2", "value2")).build()
)
).size())
.isEqualTo(1);
}
@Test
void find_noResult_forAnUnknownNamespace() {
assertThat(
flowRepository.find(Pageable.UNPAGED, MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests").build(),
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("key1", "value2")).build()
)
).size())
.isEqualTo(0);
assertThat(
flowRepository.findWithSource(Pageable.UNPAGED, MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests").build(),
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("key1", "value2")).build()
)
).size())
.isEqualTo(0);
}
@Test
protected void findSpecialChars() {
ArrayListTotal<SearchResult<Flow>> save = flowRepository.findSourceCode(Pageable.unpaged(), "https://api.chucknorris.io", MAIN_TENANT, null);
assertThat((long) save.size()).isEqualTo(2L);
}
@Test
void delete() {
Flow flow = builder().tenantId(MAIN_TENANT).build();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Flow flow = builder(tenant).tenantId(tenant).build();
FlowWithSource save = flowRepository.create(GenericFlow.of(flow));
try {
assertThat(flowRepository.findById(MAIN_TENANT, save.getNamespace(), save.getId()).isPresent()).isTrue();
assertThat(flowRepository.findById(tenant, save.getNamespace(), save.getId()).isPresent()).isTrue();
} catch (Throwable e) {
deleteFlow(save);
throw e;
@@ -506,21 +286,22 @@ public abstract class AbstractFlowRepositoryTest {
Flow delete = flowRepository.delete(save);
assertThat(flowRepository.findById(MAIN_TENANT, flow.getNamespace(), flow.getId()).isPresent()).isFalse();
assertThat(flowRepository.findById(MAIN_TENANT, flow.getNamespace(), flow.getId(), Optional.of(save.getRevision())).isPresent()).isTrue();
assertThat(flowRepository.findById(tenant, flow.getNamespace(), flow.getId()).isPresent()).isFalse();
assertThat(flowRepository.findById(tenant, flow.getNamespace(), flow.getId(), Optional.of(save.getRevision())).isPresent()).isTrue();
List<FlowWithSource> revisions = flowRepository.findRevisions(MAIN_TENANT, flow.getNamespace(), flow.getId());
List<FlowWithSource> revisions = flowRepository.findRevisions(tenant, flow.getNamespace(), flow.getId());
assertThat(revisions.getLast().getRevision()).isEqualTo(delete.getRevision());
}
@Test
void updateConflict() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
String flowId = IdUtils.create();
Flow flow = Flow.builder()
.id(flowId)
.namespace(TEST_NAMESPACE)
.tenantId(MAIN_TENANT)
.tenantId(tenant)
.inputs(List.of(StringInput.builder().type(Type.STRING).id("a").build()))
.tasks(Collections.singletonList(Return.builder().id(TEST_FLOW_ID).type(Return.class.getName()).format(Property.ofValue(TEST_FLOW_ID)).build()))
.build();
@@ -528,12 +309,12 @@ public abstract class AbstractFlowRepositoryTest {
Flow save = flowRepository.create(GenericFlow.of(flow));
try {
assertThat(flowRepository.findById(MAIN_TENANT, flow.getNamespace(), flow.getId()).isPresent()).isTrue();
assertThat(flowRepository.findById(tenant, flow.getNamespace(), flow.getId()).isPresent()).isTrue();
Flow update = Flow.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest2")
.tenantId(MAIN_TENANT)
.tenantId(tenant)
.inputs(List.of(StringInput.builder().type(Type.STRING).id("b").build()))
.tasks(Collections.singletonList(Return.builder().id(TEST_FLOW_ID).type(Return.class.getName()).format(Property.ofValue(TEST_FLOW_ID)).build()))
.build();
@@ -551,13 +332,14 @@ public abstract class AbstractFlowRepositoryTest {
}
@Test
void removeTrigger() throws TimeoutException, QueueException {
public void removeTrigger() throws TimeoutException {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
String flowId = IdUtils.create();
Flow flow = Flow.builder()
.id(flowId)
.namespace(TEST_NAMESPACE)
.tenantId(MAIN_TENANT)
.tenantId(tenant)
.triggers(Collections.singletonList(UnitTest.builder()
.id("sleep")
.type(UnitTest.class.getName())
@@ -567,12 +349,12 @@ public abstract class AbstractFlowRepositoryTest {
flow = flowRepository.create(GenericFlow.of(flow));
try {
assertThat(flowRepository.findById(MAIN_TENANT, flow.getNamespace(), flow.getId()).isPresent()).isTrue();
assertThat(flowRepository.findById(tenant, flow.getNamespace(), flow.getId()).isPresent()).isTrue();
Flow update = Flow.builder()
.id(flowId)
.namespace(TEST_NAMESPACE)
.tenantId(MAIN_TENANT)
.tenantId(tenant)
.tasks(Collections.singletonList(Return.builder().id(TEST_FLOW_ID).type(Return.class.getName()).format(Property.ofValue(TEST_FLOW_ID)).build()))
.build();
;
@@ -583,21 +365,25 @@ public abstract class AbstractFlowRepositoryTest {
deleteFlow(flow);
}
Await.until(() -> FlowListener.getEmits().size() == 3, Duration.ofMillis(100), Duration.ofSeconds(5));
assertThat(FlowListener.getEmits().stream().filter(r -> r.getType() == CrudEventType.CREATE).count()).isEqualTo(1L);
assertThat(FlowListener.getEmits().stream().filter(r -> r.getType() == CrudEventType.UPDATE).count()).isEqualTo(1L);
assertThat(FlowListener.getEmits().stream().filter(r -> r.getType() == CrudEventType.DELETE).count()).isEqualTo(1L);
Await.until(() -> FlowListener.filterByTenant(tenant)
.size() == 3, Duration.ofMillis(100), Duration.ofSeconds(5));
assertThat(FlowListener.filterByTenant(tenant).stream()
.filter(r -> r.getType() == CrudEventType.CREATE).count()).isEqualTo(1L);
assertThat(FlowListener.filterByTenant(tenant).stream()
.filter(r -> r.getType() == CrudEventType.UPDATE).count()).isEqualTo(1L);
assertThat(FlowListener.filterByTenant(tenant).stream()
.filter(r -> r.getType() == CrudEventType.DELETE).count()).isEqualTo(1L);
}
@Test
void removeTriggerDelete() throws TimeoutException {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
String flowId = IdUtils.create();
Flow flow = Flow.builder()
.id(flowId)
.namespace(TEST_NAMESPACE)
.tenantId(MAIN_TENANT)
.tenantId(tenant)
.triggers(Collections.singletonList(UnitTest.builder()
.id("sleep")
.type(UnitTest.class.getName())
@@ -607,40 +393,39 @@ public abstract class AbstractFlowRepositoryTest {
Flow save = flowRepository.create(GenericFlow.of(flow));
try {
assertThat(flowRepository.findById(MAIN_TENANT, flow.getNamespace(), flow.getId()).isPresent()).isTrue();
assertThat(flowRepository.findById(tenant, flow.getNamespace(), flow.getId()).isPresent()).isTrue();
} finally {
deleteFlow(save);
}
Await.until(() -> FlowListener.getEmits().size() == 2, Duration.ofMillis(100), Duration.ofSeconds(5));
assertThat(FlowListener.getEmits().stream().filter(r -> r.getType() == CrudEventType.CREATE).count()).isEqualTo(1L);
assertThat(FlowListener.getEmits().stream().filter(r -> r.getType() == CrudEventType.DELETE).count()).isEqualTo(1L);
Await.until(() -> FlowListener.filterByTenant(tenant)
.size() == 2, Duration.ofMillis(100), Duration.ofSeconds(5));
assertThat(FlowListener.filterByTenant(tenant).stream()
.filter(r -> r.getType() == CrudEventType.CREATE).count()).isEqualTo(1L);
assertThat(FlowListener.filterByTenant(tenant).stream()
.filter(r -> r.getType() == CrudEventType.DELETE).count()).isEqualTo(1L);
}
@Test
void findDistinctNamespace() {
List<String> distinctNamespace = flowRepository.findDistinctNamespace(MAIN_TENANT);
assertThat((long) distinctNamespace.size()).isEqualTo(9L);
}
@Test
protected void shouldReturnNullRevisionForNonExistingFlow() {
assertThat(flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, IdUtils.create())).isNull();
assertThat(flowRepository.lastRevision(TestsUtils.randomTenant(this.getClass().getSimpleName()), TEST_NAMESPACE, IdUtils.create())).isNull();
}
@Test
protected void shouldReturnLastRevisionOnCreate() {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
final List<Flow> toDelete = new ArrayList<>();
final String flowId = IdUtils.create();
try {
// When
toDelete.add(flowRepository.create(createTestingLogFlow(flowId, "???")));
Integer result = flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, flowId);
toDelete.add(flowRepository.create(createTestingLogFlow(tenant, flowId, "???")));
Integer result = flowRepository.lastRevision(tenant, TEST_NAMESPACE, flowId);
// Then
assertThat(result).isEqualTo(1);
assertThat(flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, flowId)).isEqualTo(1);
assertThat(flowRepository.lastRevision(tenant, TEST_NAMESPACE, flowId)).isEqualTo(1);
} finally {
toDelete.forEach(this::deleteFlow);
}
@@ -649,34 +434,36 @@ public abstract class AbstractFlowRepositoryTest {
@Test
protected void shouldIncrementRevisionOnDelete() {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
final String flowId = IdUtils.create();
FlowWithSource created = flowRepository.create(createTestingLogFlow(flowId, "first"));
assertThat(flowRepository.findRevisions(TEST_TENANT_ID, TEST_NAMESPACE, flowId).size()).isEqualTo(1);
FlowWithSource created = flowRepository.create(createTestingLogFlow(tenant, flowId, "first"));
assertThat(flowRepository.findRevisions(tenant, TEST_NAMESPACE, flowId).size()).isEqualTo(1);
// When
flowRepository.delete(created);
// Then
assertThat(flowRepository.findRevisions(TEST_TENANT_ID, TEST_NAMESPACE, flowId).size()).isEqualTo(2);
assertThat(flowRepository.findRevisions(tenant, TEST_NAMESPACE, flowId).size()).isEqualTo(2);
}
@Test
protected void shouldIncrementRevisionOnCreateAfterDelete() {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
final List<Flow> toDelete = new ArrayList<>();
final String flowId = IdUtils.create();
try {
// Given
flowRepository.delete(
flowRepository.create(createTestingLogFlow(flowId, "first"))
flowRepository.create(createTestingLogFlow(tenant, flowId, "first"))
);
// When
toDelete.add(flowRepository.create(createTestingLogFlow(flowId, "second")));
toDelete.add(flowRepository.create(createTestingLogFlow(tenant, flowId, "second")));
// Then
assertThat(flowRepository.findRevisions(TEST_TENANT_ID, TEST_NAMESPACE, flowId).size()).isEqualTo(3);
assertThat(flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, flowId)).isEqualTo(3);
assertThat(flowRepository.findRevisions(tenant, TEST_NAMESPACE, flowId).size()).isEqualTo(3);
assertThat(flowRepository.lastRevision(tenant, TEST_NAMESPACE, flowId)).isEqualTo(3);
} finally {
toDelete.forEach(this::deleteFlow);
}
@@ -685,22 +472,23 @@ public abstract class AbstractFlowRepositoryTest {
@Test
protected void shouldReturnNullForLastRevisionAfterDelete() {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
final List<Flow> toDelete = new ArrayList<>();
final String flowId = IdUtils.create();
try {
// Given
FlowWithSource created = flowRepository.create(createTestingLogFlow(flowId, "first"));
FlowWithSource created = flowRepository.create(createTestingLogFlow(tenant, flowId, "first"));
toDelete.add(created);
FlowWithSource updated = flowRepository.update(createTestingLogFlow(flowId, "second"), created);
FlowWithSource updated = flowRepository.update(createTestingLogFlow(tenant, flowId, "second"), created);
toDelete.add(updated);
// When
flowRepository.delete(updated);
// Then
assertThat(flowRepository.findById(TEST_TENANT_ID, TEST_NAMESPACE, flowId, Optional.empty())).isEqualTo(Optional.empty());
assertThat(flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, flowId)).isNull();
assertThat(flowRepository.findById(tenant, TEST_NAMESPACE, flowId, Optional.empty())).isEqualTo(Optional.empty());
assertThat(flowRepository.lastRevision(tenant, TEST_NAMESPACE, flowId)).isNull();
} finally {
toDelete.forEach(this::deleteFlow);
}
@@ -709,22 +497,23 @@ public abstract class AbstractFlowRepositoryTest {
@Test
protected void shouldFindAllRevisionsAfterDelete() {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
final List<Flow> toDelete = new ArrayList<>();
final String flowId = IdUtils.create();
try {
// Given
FlowWithSource created = flowRepository.create(createTestingLogFlow(flowId, "first"));
FlowWithSource created = flowRepository.create(createTestingLogFlow(tenant, flowId, "first"));
toDelete.add(created);
FlowWithSource updated = flowRepository.update(createTestingLogFlow(flowId, "second"), created);
FlowWithSource updated = flowRepository.update(createTestingLogFlow(tenant, flowId, "second"), created);
toDelete.add(updated);
// When
flowRepository.delete(updated);
// Then
assertThat(flowRepository.findById(TEST_TENANT_ID, TEST_NAMESPACE, flowId, Optional.empty())).isEqualTo(Optional.empty());
assertThat(flowRepository.findRevisions(TEST_TENANT_ID, TEST_NAMESPACE, flowId).size()).isEqualTo(3);
assertThat(flowRepository.findById(tenant, TEST_NAMESPACE, flowId, Optional.empty())).isEqualTo(Optional.empty());
assertThat(flowRepository.findRevisions(tenant, TEST_NAMESPACE, flowId).size()).isEqualTo(3);
} finally {
toDelete.forEach(this::deleteFlow);
}
@@ -732,21 +521,22 @@ public abstract class AbstractFlowRepositoryTest {
@Test
protected void shouldIncrementRevisionOnUpdateGivenNotEqualSource() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
final List<Flow> toDelete = new ArrayList<>();
final String flowId = IdUtils.create();
try {
// Given
FlowWithSource created = flowRepository.create(createTestingLogFlow(flowId, "first"));
FlowWithSource created = flowRepository.create(createTestingLogFlow(tenant, flowId, "first"));
toDelete.add(created);
// When
FlowWithSource updated = flowRepository.update(createTestingLogFlow(flowId, "second"), created);
FlowWithSource updated = flowRepository.update(createTestingLogFlow(tenant, flowId, "second"), created);
toDelete.add(updated);
// Then
assertThat(updated.getRevision()).isEqualTo(2);
assertThat(flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, flowId)).isEqualTo(2);
assertThat(flowRepository.lastRevision(tenant, TEST_NAMESPACE, flowId)).isEqualTo(2);
} finally {
toDelete.forEach(this::deleteFlow);
@@ -755,48 +545,39 @@ public abstract class AbstractFlowRepositoryTest {
@Test
protected void shouldNotIncrementRevisionOnUpdateGivenEqualSource() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
final List<Flow> toDelete = new ArrayList<>();
final String flowId = IdUtils.create();
try {
// Given
FlowWithSource created = flowRepository.create(createTestingLogFlow(flowId, "first"));
FlowWithSource created = flowRepository.create(createTestingLogFlow(tenant, flowId, "first"));
toDelete.add(created);
// When
FlowWithSource updated = flowRepository.update(createTestingLogFlow(flowId, "first"), created);
FlowWithSource updated = flowRepository.update(createTestingLogFlow(tenant, flowId, "first"), created);
toDelete.add(updated);
// Then
assertThat(updated.getRevision()).isEqualTo(1);
assertThat(flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, flowId)).isEqualTo(1);
assertThat(flowRepository.lastRevision(tenant, TEST_NAMESPACE, flowId)).isEqualTo(1);
} finally {
toDelete.forEach(this::deleteFlow);
}
}
@Test
void shouldReturnForGivenQueryWildCardFilters() {
List<QueryFilter> filters = List.of(
QueryFilter.builder().field(QueryFilter.Field.QUERY).operation(QueryFilter.Op.EQUALS).value("*").build()
);
ArrayListTotal<Flow> flows = flowRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(flows.size()).isEqualTo(10);
assertThat(flows.getTotal()).isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void findByExecution() {
Flow flow = builder()
.tenantId(MAIN_TENANT)
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Flow flow = builder(tenant)
.revision(1)
.build();
flowRepository.create(GenericFlow.of(flow));
Execution execution = Execution.builder()
.id(IdUtils.create())
.namespace(flow.getNamespace())
.tenantId(MAIN_TENANT)
.tenantId(tenant)
.flowId(flow.getId())
.flowRevision(flow.getRevision())
.state(new State())
@@ -821,11 +602,13 @@ public abstract class AbstractFlowRepositoryTest {
@Test
void findByExecutionNoRevision() {
Flow flow = builder()
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Flow flow = builder(tenant)
.revision(3)
.build();
flowRepository.create(GenericFlow.of(flow));
Execution execution = Execution.builder()
.tenantId(tenant)
.id(IdUtils.create())
.namespace(flow.getNamespace())
.flowId(flow.getId())
@@ -851,13 +634,14 @@ public abstract class AbstractFlowRepositoryTest {
@Test
void shouldCountForNullTenant() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource toDelete = null;
try {
// Given
Flow flow = createTestFlowForNamespace(TEST_NAMESPACE);
Flow flow = createTestFlowForNamespace(tenant, TEST_NAMESPACE);
toDelete = flowRepository.create(GenericFlow.of(flow));
// When
int count = flowRepository.count(MAIN_TENANT);
int count = flowRepository.count(tenant);
// Then
Assertions.assertTrue(count > 0);
@@ -868,11 +652,11 @@ public abstract class AbstractFlowRepositoryTest {
}
}
private static Flow createTestFlowForNamespace(String namespace) {
private static Flow createTestFlowForNamespace(String tenantId, String namespace) {
return Flow.builder()
.id(IdUtils.create())
.namespace(namespace)
.tenantId(MAIN_TENANT)
.tenantId(tenantId)
.tasks(List.of(Return.builder()
.id(IdUtils.create())
.type(Return.class.getName())
@@ -891,21 +675,31 @@ public abstract class AbstractFlowRepositoryTest {
}
@Singleton
public static class FlowListener implements ApplicationEventListener<CrudEvent<Flow>> {
@Getter
private static List<CrudEvent<Flow>> emits = new ArrayList<>();
public static class FlowListener implements ApplicationEventListener<CrudEvent<AbstractFlow>> {
private static List<CrudEvent<AbstractFlow>> emits = new CopyOnWriteArrayList<>();
@Override
public void onApplicationEvent(CrudEvent<Flow> event) {
emits.add(event);
public void onApplicationEvent(CrudEvent<AbstractFlow> event) {
// This check is needed because Micronaut may also deliver unrelated events (for example CrudEvent<Setting>), and we don't want to record those.
if ((event.getModel() != null && event.getModel() instanceof AbstractFlow)||
(event.getPreviousModel() != null && event.getPreviousModel() instanceof AbstractFlow)) {
emits.add(event);
}
}
public static void reset() {
emits = new ArrayList<>();
emits = new CopyOnWriteArrayList<>();
}
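// Keeps only the CRUD events whose new or previous model belongs to the given tenant,
// so tests running in parallel on different random tenants never see each other's events.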
public static List<CrudEvent<AbstractFlow>> filterByTenant(String tenantId){
return emits.stream()
.filter(e -> (e.getPreviousModel() != null && e.getPreviousModel().getTenantId().equals(tenantId)) ||
(e.getModel() != null && e.getModel().getTenantId().equals(tenantId)))
.toList();
}
}
private static GenericFlow createTestingLogFlow(String id, String logMessage) {
private static GenericFlow createTestingLogFlow(String tenantId, String id, String logMessage) {
String source = """
id: %s
namespace: %s
@@ -914,7 +708,7 @@ public abstract class AbstractFlowRepositoryTest {
type: io.kestra.plugin.core.log.Log
message: %s
""".formatted(id, TEST_NAMESPACE, logMessage);
return GenericFlow.fromYaml(TEST_TENANT_ID, source);
return GenericFlow.fromYaml(tenantId, source);
}
protected static int COUNTER = 0;

View File

@@ -4,7 +4,7 @@ import io.kestra.core.models.topologies.FlowNode;
import io.kestra.core.models.topologies.FlowRelation;
import io.kestra.core.models.topologies.FlowTopology;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.tenant.TenantService;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;
@@ -17,21 +17,21 @@ public abstract class AbstractFlowTopologyRepositoryTest {
@Inject
private FlowTopologyRepositoryInterface flowTopologyRepository;
protected FlowTopology createSimpleFlowTopology(String flowA, String flowB, String namespace) {
protected FlowTopology createSimpleFlowTopology(String tenantId, String flowA, String flowB, String namespace) {
return FlowTopology.builder()
.relation(FlowRelation.FLOW_TASK)
.source(FlowNode.builder()
.id(flowA)
.namespace(namespace)
.tenantId(TenantService.MAIN_TENANT)
.uid(flowA)
.tenantId(tenantId)
.uid(tenantId + flowA)
.build()
)
.destination(FlowNode.builder()
.id(flowB)
.namespace(namespace)
.tenantId(TenantService.MAIN_TENANT)
.uid(flowB)
.tenantId(tenantId)
.uid(tenantId + flowB)
.build()
)
.build();
@@ -39,42 +39,45 @@ public abstract class AbstractFlowTopologyRepositoryTest {
@Test
void findByFlow() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
flowTopologyRepository.save(
createSimpleFlowTopology("flow-a", "flow-b", "io.kestra.tests")
createSimpleFlowTopology(tenant, "flow-a", "flow-b", "io.kestra.tests")
);
List<FlowTopology> list = flowTopologyRepository.findByFlow(TenantService.MAIN_TENANT, "io.kestra.tests", "flow-a", false);
List<FlowTopology> list = flowTopologyRepository.findByFlow(tenant, "io.kestra.tests", "flow-a", false);
assertThat(list.size()).isEqualTo(1);
}
@Test
void findByNamespace() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
flowTopologyRepository.save(
createSimpleFlowTopology("flow-a", "flow-b", "io.kestra.tests")
createSimpleFlowTopology(tenant, "flow-a", "flow-b", "io.kestra.tests")
);
flowTopologyRepository.save(
createSimpleFlowTopology("flow-c", "flow-d", "io.kestra.tests")
createSimpleFlowTopology(tenant, "flow-c", "flow-d", "io.kestra.tests")
);
List<FlowTopology> list = flowTopologyRepository.findByNamespace(TenantService.MAIN_TENANT, "io.kestra.tests");
List<FlowTopology> list = flowTopologyRepository.findByNamespace(tenant, "io.kestra.tests");
assertThat(list.size()).isEqualTo(2);
}
@Test
void findAll() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
flowTopologyRepository.save(
createSimpleFlowTopology("flow-a", "flow-b", "io.kestra.tests")
createSimpleFlowTopology(tenant, "flow-a", "flow-b", "io.kestra.tests")
);
flowTopologyRepository.save(
createSimpleFlowTopology("flow-c", "flow-d", "io.kestra.tests")
createSimpleFlowTopology(tenant, "flow-c", "flow-d", "io.kestra.tests")
);
flowTopologyRepository.save(
createSimpleFlowTopology("flow-e", "flow-f", "io.kestra.tests.2")
createSimpleFlowTopology(tenant, "flow-e", "flow-f", "io.kestra.tests.2")
);
List<FlowTopology> list = flowTopologyRepository.findAll(TenantService.MAIN_TENANT);
List<FlowTopology> list = flowTopologyRepository.findAll(tenant);
assertThat(list.size()).isEqualTo(3);
}
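The node uid now embeds the tenant id (tenantId + flowId), so two random test tenants that both save a topology for "flow-a" keep distinct nodes. A trivial, self-contained illustration of that design choice (class name is illustrative only):

// Illustrative only: shows why prefixing the uid with the tenant id avoids collisions across tenants.
public class UidIsolationExample {
    public static void main(String[] args) {
        String flowId = "flow-a";
        String uidTenantA = "tenant-a" + flowId; // "tenant-aflow-a"
        String uidTenantB = "tenant-b" + flowId; // "tenant-bflow-a"
        System.out.println(uidTenantA.equals(uidTenantB)); // prints false
    }
}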

View File

@@ -0,0 +1,281 @@
package io.kestra.core.repositories;
import static org.assertj.core.api.Assertions.assertThat;
import io.kestra.core.Helpers;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.QueryFilter;
import io.kestra.core.models.SearchResult;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.FlowWithSource;
import io.kestra.core.utils.TestsUtils;
import io.micronaut.data.model.Pageable;
import io.micronaut.data.model.Sort;
import jakarta.inject.Inject;
import java.io.IOException;
import java.net.URISyntaxException;
import java.util.List;
import java.util.Map;
import java.util.concurrent.atomic.AtomicBoolean;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
@KestraTest
public abstract class AbstractLoadedFlowRepositoryTest {
@Inject
protected FlowRepositoryInterface flowRepository;
@Inject
protected ExecutionRepositoryInterface executionRepository;
@Inject
private LocalFlowRepositoryLoader repositoryLoader;
protected static final String TENANT = TestsUtils.randomTenant(AbstractLoadedFlowRepositoryTest.class.getSimpleName());
private static final AtomicBoolean IS_INIT = new AtomicBoolean();
@BeforeEach
protected synchronized void init() throws IOException, URISyntaxException {
initFlows(repositoryLoader);
}
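// Loads the bundled test flows only once into the shared random TENANT; the IS_INIT guard
// makes later @BeforeEach invocations no-ops so the per-class counts stay stable.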
protected static synchronized void initFlows(LocalFlowRepositoryLoader repo) throws IOException, URISyntaxException {
if (!IS_INIT.get()){
TestsUtils.loads(TENANT, repo);
IS_INIT.set(true);
}
}
@Test
void findAll() {
List<Flow> save = flowRepository.findAll(TENANT);
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void findAllWithSource() {
List<FlowWithSource> save = flowRepository.findAllWithSource(TENANT);
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void findAllForAllTenants() {
List<Flow> save = flowRepository.findAllForAllTenants();
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void findAllWithSourceForAllTenants() {
List<FlowWithSource> save = flowRepository.findAllWithSourceForAllTenants();
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void findByNamespace() {
List<Flow> save = flowRepository.findByNamespace(TENANT, "io.kestra.tests");
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT - 24);
save = flowRepository.findByNamespace(TENANT, "io.kestra.tests2");
assertThat((long) save.size()).isEqualTo(1L);
save = flowRepository.findByNamespace(TENANT, "io.kestra.tests.minimal.bis");
assertThat((long) save.size()).isEqualTo(1L);
}
@Test
void findByNamespacePrefix() {
List<Flow> save = flowRepository.findByNamespacePrefix(TENANT, "io.kestra.tests");
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT - 1);
save = flowRepository.findByNamespace(TENANT, "io.kestra.tests2");
assertThat((long) save.size()).isEqualTo(1L);
save = flowRepository.findByNamespace(TENANT, "io.kestra.tests.minimal.bis");
assertThat((long) save.size()).isEqualTo(1L);
}
@Test
void findByNamespacePrefixWithSource() {
List<FlowWithSource> save = flowRepository.findByNamespacePrefixWithSource(TENANT, "io.kestra.tests");
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT - 1);
}
@Test
void find_paginationPartial() {
assertThat(flowRepository.find(Pageable.from(1, (int) Helpers.FLOWS_COUNT - 1, Sort.UNSORTED), TENANT, null)
.size())
.describedAs("When paginating at MAX-1, it should return MAX-1")
.isEqualTo(Helpers.FLOWS_COUNT - 1);
assertThat(flowRepository.findWithSource(Pageable.from(1, (int) Helpers.FLOWS_COUNT - 1, Sort.UNSORTED), TENANT, null)
.size())
.describedAs("When paginating at MAX-1, it should return MAX-1")
.isEqualTo(Helpers.FLOWS_COUNT - 1);
}
@Test
void find_paginationGreaterThanExisting() {
assertThat(flowRepository.find(Pageable.from(1, (int) Helpers.FLOWS_COUNT + 1, Sort.UNSORTED), TENANT, null)
.size())
.describedAs("When paginating requesting a larger amount than existing, it should return existing MAX")
.isEqualTo(Helpers.FLOWS_COUNT);
assertThat(flowRepository.findWithSource(Pageable.from(1, (int) Helpers.FLOWS_COUNT + 1, Sort.UNSORTED), TENANT, null)
.size())
.describedAs("When paginating requesting a larger amount than existing, it should return existing MAX")
.isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void find_prefixMatchingAllNamespaces() {
assertThat(flowRepository.find(
Pageable.UNPAGED,
TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.STARTS_WITH).value("io.kestra.tests").build()
)
).size())
.describedAs("When filtering on NAMESPACE START_WITH a pattern that match all, it should return all")
.isEqualTo(Helpers.FLOWS_COUNT);
assertThat(flowRepository.findWithSource(
Pageable.UNPAGED,
TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.STARTS_WITH).value("io.kestra.tests").build()
)
).size())
.describedAs("When filtering on NAMESPACE START_WITH a pattern that match all, it should return all")
.isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void find_aSpecifiedNamespace() {
assertThat(flowRepository.find(
Pageable.UNPAGED,
TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests2").build()
)
).size()).isEqualTo(1L);
assertThat(flowRepository.findWithSource(
Pageable.UNPAGED,
TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests2").build()
)
).size()).isEqualTo(1L);
}
@Test
void find_aSpecificSubNamespace() {
assertThat(flowRepository.find(
Pageable.UNPAGED,
TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests.minimal.bis").build()
)
).size())
.isEqualTo(1L);
assertThat(flowRepository.findWithSource(
Pageable.UNPAGED,
TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests.minimal.bis").build()
)
).size())
.isEqualTo(1L);
}
@Test
void find_aSpecificLabel() {
assertThat(
flowRepository.find(Pageable.UNPAGED, TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("country", "FR")).build()
)
).size())
.isEqualTo(1);
assertThat(
flowRepository.findWithSource(Pageable.UNPAGED, TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("country", "FR")).build()
)
).size())
.isEqualTo(1);
}
@Test
void find_aSpecificFlowByNamespaceAndLabel() {
assertThat(
flowRepository.find(Pageable.UNPAGED, TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests").build(),
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("key2", "value2")).build()
)
).size())
.isEqualTo(1);
assertThat(
flowRepository.findWithSource(Pageable.UNPAGED, TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests").build(),
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("key2", "value2")).build()
)
).size())
.isEqualTo(1);
}
@Test
void find_noResult_forAnUnknownNamespace() {
assertThat(
flowRepository.find(Pageable.UNPAGED, TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests").build(),
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("key1", "value2")).build()
)
).size())
.isEqualTo(0);
assertThat(
flowRepository.findWithSource(Pageable.UNPAGED, TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests").build(),
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("key1", "value2")).build()
)
).size())
.isEqualTo(0);
}
@Test
protected void findSpecialChars() {
ArrayListTotal<SearchResult<Flow>> save = flowRepository.findSourceCode(Pageable.unpaged(), "https://api.chucknorris.io", TENANT, null);
assertThat((long) save.size()).isEqualTo(2L);
}
@Test
void findDistinctNamespace() {
List<String> distinctNamespace = flowRepository.findDistinctNamespace(TENANT);
assertThat((long) distinctNamespace.size()).isEqualTo(9L);
}
@Test
void shouldReturnForGivenQueryWildCardFilters() {
List<QueryFilter> filters = List.of(
QueryFilter.builder().field(QueryFilter.Field.QUERY).operation(QueryFilter.Op.EQUALS).value("*").build()
);
ArrayListTotal<Flow> flows = flowRepository.find(Pageable.from(1, 10), TENANT, filters);
assertThat(flows.size()).isEqualTo(10);
assertThat(flows.getTotal()).isEqualTo(Helpers.FLOWS_COUNT);
}
}
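All of the flow repository tests above query the same way: compose one or more QueryFilter instances and pass them to the repository together with a Pageable and the tenant. As a minimal sketch of that pattern, using only the fields, operations, and the find(...) signature that appear in the diff (combining the two filters in a single call is purely illustrative):

// Sketch only: flows in one tenant whose namespace starts with "io.kestra.tests"
// and that carry the label country=FR.
List<QueryFilter> filters = List.of(
    QueryFilter.builder()
        .field(QueryFilter.Field.NAMESPACE)
        .operation(QueryFilter.Op.STARTS_WITH)
        .value("io.kestra.tests")
        .build(),
    QueryFilter.builder()
        .field(QueryFilter.Field.LABELS)
        .operation(QueryFilter.Op.EQUALS)
        .value(Map.of("country", "FR"))
        .build()
);
ArrayListTotal<Flow> flows = flowRepository.find(Pageable.UNPAGED, TENANT, filters);
// flows.getTotal() carries the overall match count, as asserted in shouldReturnForGivenQueryWildCardFilters above.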

View File

@@ -13,6 +13,7 @@ import io.kestra.core.models.executions.LogEntry;
import io.kestra.core.models.flows.State;
import io.kestra.core.repositories.ExecutionRepositoryInterface.ChildFilter;
import io.kestra.core.utils.IdUtils;
import io.kestra.core.utils.TestsUtils;
import io.kestra.plugin.core.dashboard.data.Logs;
import io.micronaut.data.model.Pageable;
import jakarta.inject.Inject;
@@ -32,9 +33,7 @@ import java.util.stream.Stream;
import static io.kestra.core.models.flows.FlowScope.SYSTEM;
import static io.kestra.core.models.flows.FlowScope.USER;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.assertj.core.api.Assertions.assertThatReflectiveOperationException;
import static org.junit.jupiter.api.Assertions.assertThrows;
@KestraTest
@@ -42,11 +41,11 @@ public abstract class AbstractLogRepositoryTest {
@Inject
protected LogRepositoryInterface logRepository;
protected static LogEntry.LogEntryBuilder logEntry(Level level) {
return logEntry(level, IdUtils.create());
protected static LogEntry.LogEntryBuilder logEntry(String tenantId, Level level) {
return logEntry(tenantId, level, IdUtils.create());
}
protected static LogEntry.LogEntryBuilder logEntry(Level level, String executionId) {
protected static LogEntry.LogEntryBuilder logEntry(String tenantId, Level level, String executionId) {
return LogEntry.builder()
.flowId("flowId")
.namespace("io.kestra.unittest")
@@ -57,7 +56,7 @@ public abstract class AbstractLogRepositoryTest {
.timestamp(Instant.now())
.level(level)
.thread("")
.tenantId(MAIN_TENANT)
.tenantId(tenantId)
.triggerId("triggerId")
.message("john doe");
}
@@ -65,9 +64,10 @@ public abstract class AbstractLogRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_all(QueryFilter filter){
logRepository.save(logEntry(Level.INFO, "executionId").build());
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
logRepository.save(logEntry(tenant, Level.INFO, "executionId").build());
ArrayListTotal<LogEntry> entries = logRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter));
ArrayListTotal<LogEntry> entries = logRepository.find(Pageable.UNPAGED, tenant, List.of(filter));
assertThat(entries).hasSize(1);
}
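These log repository tests isolate themselves by generating a fresh tenant per test through TestsUtils.randomTenant(this.getClass().getSimpleName()), so data written by one test (or by a concurrently running subclass) stays invisible to the others. The helper's implementation is not part of this diff; a hypothetical sketch of what such a helper could look like, reusing IdUtils.create() which these tests already use for unique ids:

// Hypothetical sketch only -- the real TestsUtils.randomTenant(...) is not shown in this diff.
public static String randomTenant(String prefix) {
    // Combine the caller-supplied prefix (typically the test class name) with a unique id
    // so concurrently running tests never share a tenant, and therefore never share data.
    return prefix.toLowerCase() + "-" + IdUtils.create().toLowerCase();
}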
@@ -75,9 +75,10 @@ public abstract class AbstractLogRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_async(QueryFilter filter){
logRepository.save(logEntry(Level.INFO, "executionId").build());
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
logRepository.save(logEntry(tenant, Level.INFO, "executionId").build());
Flux<LogEntry> find = logRepository.findAsync(MAIN_TENANT, List.of(filter));
Flux<LogEntry> find = logRepository.findAsync(tenant, List.of(filter));
List<LogEntry> logEntries = find.collectList().block();
assertThat(logEntries).hasSize(1);
@@ -86,11 +87,12 @@ public abstract class AbstractLogRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_delete_with_filter(QueryFilter filter){
logRepository.save(logEntry(Level.INFO, "executionId").build());
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
logRepository.save(logEntry(tenant, Level.INFO, "executionId").build());
logRepository.deleteByFilters(MAIN_TENANT, List.of(filter));
logRepository.deleteByFilters(tenant, List.of(filter));
assertThat(logRepository.findAllAsync(MAIN_TENANT).collectList().block()).isEmpty();
assertThat(logRepository.findAllAsync(tenant).collectList().block()).isEmpty();
}
@@ -150,7 +152,10 @@ public abstract class AbstractLogRepositoryTest {
void should_fail_to_find_all(QueryFilter filter){
assertThrows(
InvalidQueryFiltersException.class,
() -> logRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter)));
() -> logRepository.find(
Pageable.UNPAGED,
TestsUtils.randomTenant(this.getClass().getSimpleName()),
List.of(filter)));
}
@@ -168,16 +173,17 @@ public abstract class AbstractLogRepositoryTest {
@Test
void all() {
LogEntry.LogEntryBuilder builder = logEntry(Level.INFO);
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
LogEntry.LogEntryBuilder builder = logEntry(tenant, Level.INFO);
ArrayListTotal<LogEntry> find = logRepository.find(Pageable.UNPAGED, MAIN_TENANT, null);
ArrayListTotal<LogEntry> find = logRepository.find(Pageable.UNPAGED, tenant, null);
assertThat(find.size()).isZero();
LogEntry save = logRepository.save(builder.build());
logRepository.save(builder.executionKind(ExecutionKind.TEST).build()); // should only be loaded by execution id
find = logRepository.find(Pageable.UNPAGED, MAIN_TENANT, null);
find = logRepository.find(Pageable.UNPAGED, tenant, null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
var filters = List.of(QueryFilter.builder()
@@ -193,7 +199,7 @@ public abstract class AbstractLogRepositoryTest {
find = logRepository.find(Pageable.UNPAGED, "doe", filters);
assertThat(find.size()).isZero();
find = logRepository.find(Pageable.UNPAGED, MAIN_TENANT, null);
find = logRepository.find(Pageable.UNPAGED, tenant, null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
@@ -201,141 +207,146 @@ public abstract class AbstractLogRepositoryTest {
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
List<LogEntry> list = logRepository.findByExecutionId(MAIN_TENANT, save.getExecutionId(), null);
List<LogEntry> list = logRepository.findByExecutionId(tenant, save.getExecutionId(), null);
assertThat(list.size()).isEqualTo(2);
assertThat(list.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
list = logRepository.findByExecutionId(MAIN_TENANT, "io.kestra.unittest", "flowId", save.getExecutionId(), null);
list = logRepository.findByExecutionId(tenant, "io.kestra.unittest", "flowId", save.getExecutionId(), null);
assertThat(list.size()).isEqualTo(2);
assertThat(list.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
list = logRepository.findByExecutionIdAndTaskId(MAIN_TENANT, save.getExecutionId(), save.getTaskId(), null);
list = logRepository.findByExecutionIdAndTaskId(tenant, save.getExecutionId(), save.getTaskId(), null);
assertThat(list.size()).isEqualTo(2);
assertThat(list.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
list = logRepository.findByExecutionIdAndTaskId(MAIN_TENANT, "io.kestra.unittest", "flowId", save.getExecutionId(), save.getTaskId(), null);
list = logRepository.findByExecutionIdAndTaskId(tenant, "io.kestra.unittest", "flowId", save.getExecutionId(), save.getTaskId(), null);
assertThat(list.size()).isEqualTo(2);
assertThat(list.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
list = logRepository.findByExecutionIdAndTaskRunId(MAIN_TENANT, save.getExecutionId(), save.getTaskRunId(), null);
list = logRepository.findByExecutionIdAndTaskRunId(tenant, save.getExecutionId(), save.getTaskRunId(), null);
assertThat(list.size()).isEqualTo(2);
assertThat(list.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
list = logRepository.findByExecutionIdAndTaskRunIdAndAttempt(MAIN_TENANT, save.getExecutionId(), save.getTaskRunId(), null, 0);
list = logRepository.findByExecutionIdAndTaskRunIdAndAttempt(tenant, save.getExecutionId(), save.getTaskRunId(), null, 0);
assertThat(list.size()).isEqualTo(2);
assertThat(list.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
Integer countDeleted = logRepository.purge(Execution.builder().id(save.getExecutionId()).build());
assertThat(countDeleted).isEqualTo(2);
list = logRepository.findByExecutionIdAndTaskId(MAIN_TENANT, save.getExecutionId(), save.getTaskId(), null);
list = logRepository.findByExecutionIdAndTaskId(tenant, save.getExecutionId(), save.getTaskId(), null);
assertThat(list.size()).isZero();
}
@Test
void pageable() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
String executionId = "123";
LogEntry.LogEntryBuilder builder = logEntry(Level.INFO);
LogEntry.LogEntryBuilder builder = logEntry(tenant, Level.INFO);
builder.executionId(executionId);
for (int i = 0; i < 80; i++) {
logRepository.save(builder.build());
}
builder = logEntry(Level.INFO).executionId(executionId).taskId("taskId2").taskRunId("taskRunId2");
builder = logEntry(tenant, Level.INFO).executionId(executionId).taskId("taskId2").taskRunId("taskRunId2");
LogEntry logEntry2 = logRepository.save(builder.build());
for (int i = 0; i < 20; i++) {
logRepository.save(builder.build());
}
ArrayListTotal<LogEntry> find = logRepository.findByExecutionId(MAIN_TENANT, executionId, null, Pageable.from(1, 50));
ArrayListTotal<LogEntry> find = logRepository.findByExecutionId(tenant, executionId, null, Pageable.from(1, 50));
assertThat(find.size()).isEqualTo(50);
assertThat(find.getTotal()).isEqualTo(101L);
find = logRepository.findByExecutionId(MAIN_TENANT, executionId, null, Pageable.from(3, 50));
find = logRepository.findByExecutionId(tenant, executionId, null, Pageable.from(3, 50));
assertThat(find.size()).isEqualTo(1);
assertThat(find.getTotal()).isEqualTo(101L);
find = logRepository.findByExecutionIdAndTaskId(MAIN_TENANT, executionId, logEntry2.getTaskId(), null, Pageable.from(1, 50));
find = logRepository.findByExecutionIdAndTaskId(tenant, executionId, logEntry2.getTaskId(), null, Pageable.from(1, 50));
assertThat(find.size()).isEqualTo(21);
assertThat(find.getTotal()).isEqualTo(21L);
find = logRepository.findByExecutionIdAndTaskRunId(MAIN_TENANT, executionId, logEntry2.getTaskRunId(), null, Pageable.from(1, 10));
find = logRepository.findByExecutionIdAndTaskRunId(tenant, executionId, logEntry2.getTaskRunId(), null, Pageable.from(1, 10));
assertThat(find.size()).isEqualTo(10);
assertThat(find.getTotal()).isEqualTo(21L);
find = logRepository.findByExecutionIdAndTaskRunIdAndAttempt(MAIN_TENANT, executionId, logEntry2.getTaskRunId(), null, 0, Pageable.from(1, 10));
find = logRepository.findByExecutionIdAndTaskRunIdAndAttempt(tenant, executionId, logEntry2.getTaskRunId(), null, 0, Pageable.from(1, 10));
assertThat(find.size()).isEqualTo(10);
assertThat(find.getTotal()).isEqualTo(21L);
find = logRepository.findByExecutionIdAndTaskRunId(MAIN_TENANT, executionId, logEntry2.getTaskRunId(), null, Pageable.from(10, 10));
find = logRepository.findByExecutionIdAndTaskRunId(tenant, executionId, logEntry2.getTaskRunId(), null, Pageable.from(10, 10));
assertThat(find.size()).isZero();
}
@Test
void shouldFindByExecutionIdTestLogs() {
var builder = logEntry(Level.INFO).executionId("123").executionKind(ExecutionKind.TEST).build();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
var builder = logEntry(tenant, Level.INFO).executionId("123").executionKind(ExecutionKind.TEST).build();
logRepository.save(builder);
List<LogEntry> logs = logRepository.findByExecutionId(MAIN_TENANT, builder.getExecutionId(), null);
List<LogEntry> logs = logRepository.findByExecutionId(tenant, builder.getExecutionId(), null);
assertThat(logs).hasSize(1);
}
@Test
void deleteByQuery() {
LogEntry log1 = logEntry(Level.INFO).build();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
LogEntry log1 = logEntry(tenant, Level.INFO).build();
logRepository.save(log1);
logRepository.deleteByQuery(MAIN_TENANT, log1.getExecutionId(), null, null, null, null);
logRepository.deleteByQuery(tenant, log1.getExecutionId(), null, null, null, null);
ArrayListTotal<LogEntry> find = logRepository.findByExecutionId(MAIN_TENANT, log1.getExecutionId(), null, Pageable.from(1, 50));
ArrayListTotal<LogEntry> find = logRepository.findByExecutionId(tenant, log1.getExecutionId(), null, Pageable.from(1, 50));
assertThat(find.size()).isZero();
logRepository.save(log1);
logRepository.deleteByQuery(MAIN_TENANT, "io.kestra.unittest", "flowId", null, List.of(Level.TRACE, Level.DEBUG, Level.INFO), null, ZonedDateTime.now().plusMinutes(1));
logRepository.deleteByQuery(tenant, "io.kestra.unittest", "flowId", null, List.of(Level.TRACE, Level.DEBUG, Level.INFO), null, ZonedDateTime.now().plusMinutes(1));
find = logRepository.findByExecutionId(MAIN_TENANT, log1.getExecutionId(), null, Pageable.from(1, 50));
find = logRepository.findByExecutionId(tenant, log1.getExecutionId(), null, Pageable.from(1, 50));
assertThat(find.size()).isZero();
logRepository.save(log1);
logRepository.deleteByQuery(MAIN_TENANT, "io.kestra.unittest", "flowId", null);
logRepository.deleteByQuery(tenant, "io.kestra.unittest", "flowId", null);
find = logRepository.findByExecutionId(MAIN_TENANT, log1.getExecutionId(), null, Pageable.from(1, 50));
find = logRepository.findByExecutionId(tenant, log1.getExecutionId(), null, Pageable.from(1, 50));
assertThat(find.size()).isZero();
logRepository.save(log1);
logRepository.deleteByQuery(MAIN_TENANT, null, null, log1.getExecutionId(), List.of(Level.TRACE, Level.DEBUG, Level.INFO), null, ZonedDateTime.now().plusMinutes(1));
logRepository.deleteByQuery(tenant, null, null, log1.getExecutionId(), List.of(Level.TRACE, Level.DEBUG, Level.INFO), null, ZonedDateTime.now().plusMinutes(1));
find = logRepository.findByExecutionId(MAIN_TENANT, log1.getExecutionId(), null, Pageable.from(1, 50));
find = logRepository.findByExecutionId(tenant, log1.getExecutionId(), null, Pageable.from(1, 50));
assertThat(find.size()).isZero();
}
@Test
void findAllAsync() {
logRepository.save(logEntry(Level.INFO).build());
logRepository.save(logEntry(Level.INFO).executionKind(ExecutionKind.TEST).build()); // should be present as it's used for backup
logRepository.save(logEntry(Level.ERROR).build());
logRepository.save(logEntry(Level.WARN).build());
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
logRepository.save(logEntry(tenant, Level.INFO).build());
logRepository.save(logEntry(tenant, Level.INFO).executionKind(ExecutionKind.TEST).build()); // should be present as it's used for backup
logRepository.save(logEntry(tenant, Level.ERROR).build());
logRepository.save(logEntry(tenant, Level.WARN).build());
Flux<LogEntry> find = logRepository.findAllAsync(MAIN_TENANT);
Flux<LogEntry> find = logRepository.findAllAsync(tenant);
List<LogEntry> logEntries = find.collectList().block();
assertThat(logEntries).hasSize(4);
}
@Test
void fetchData() throws IOException {
logRepository.save(logEntry(Level.INFO).build());
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
logRepository.save(logEntry(tenant, Level.INFO).build());
var results = logRepository.fetchData(MAIN_TENANT,
var results = logRepository.fetchData(tenant,
Logs.builder()
.type(Logs.class.getName())
.columns(Map.of(

View File

@@ -7,6 +7,7 @@ import io.kestra.core.models.executions.TaskRun;
import io.kestra.core.models.executions.metrics.Counter;
import io.kestra.core.models.executions.metrics.MetricAggregations;
import io.kestra.core.models.executions.metrics.Timer;
import io.kestra.core.utils.TestsUtils;
import io.micronaut.data.model.Pageable;
import io.kestra.core.junit.annotations.KestraTest;
import jakarta.inject.Inject;
@@ -25,27 +26,28 @@ public abstract class AbstractMetricRepositoryTest {
@Test
void all() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
String executionId = FriendlyId.createFriendlyId();
TaskRun taskRun1 = taskRun(executionId, "task");
TaskRun taskRun1 = taskRun(tenant, executionId, "task");
MetricEntry counter = MetricEntry.of(taskRun1, counter("counter"), null);
MetricEntry testCounter = MetricEntry.of(taskRun1, counter("test"), ExecutionKind.TEST);
TaskRun taskRun2 = taskRun(executionId, "task");
TaskRun taskRun2 = taskRun(tenant, executionId, "task");
MetricEntry timer = MetricEntry.of(taskRun2, timer(), null);
metricRepository.save(counter);
metricRepository.save(testCounter); // should only be retrieved by execution id
metricRepository.save(timer);
List<MetricEntry> results = metricRepository.findByExecutionId(null, executionId, Pageable.from(1, 10));
List<MetricEntry> results = metricRepository.findByExecutionId(tenant, executionId, Pageable.from(1, 10));
assertThat(results.size()).isEqualTo(3);
results = metricRepository.findByExecutionIdAndTaskId(null, executionId, taskRun1.getTaskId(), Pageable.from(1, 10));
results = metricRepository.findByExecutionIdAndTaskId(tenant, executionId, taskRun1.getTaskId(), Pageable.from(1, 10));
assertThat(results.size()).isEqualTo(3);
results = metricRepository.findByExecutionIdAndTaskRunId(null, executionId, taskRun1.getId(), Pageable.from(1, 10));
results = metricRepository.findByExecutionIdAndTaskRunId(tenant, executionId, taskRun1.getId(), Pageable.from(1, 10));
assertThat(results.size()).isEqualTo(2);
MetricAggregations aggregationResults = metricRepository.aggregateByFlowId(
null,
tenant,
"namespace",
"flow",
null,
@@ -59,7 +61,7 @@ public abstract class AbstractMetricRepositoryTest {
assertThat(aggregationResults.getGroupBy()).isEqualTo("day");
aggregationResults = metricRepository.aggregateByFlowId(
null,
tenant,
"namespace",
"flow",
null,
@@ -76,11 +78,12 @@ public abstract class AbstractMetricRepositoryTest {
@Test
void names() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
String executionId = FriendlyId.createFriendlyId();
TaskRun taskRun1 = taskRun(executionId, "task");
TaskRun taskRun1 = taskRun(tenant, executionId, "task");
MetricEntry counter = MetricEntry.of(taskRun1, counter("counter"), null);
TaskRun taskRun2 = taskRun(executionId, "task2");
TaskRun taskRun2 = taskRun(tenant, executionId, "task2");
MetricEntry counter2 = MetricEntry.of(taskRun2, counter("counter2"), null);
MetricEntry test = MetricEntry.of(taskRun2, counter("test"), ExecutionKind.TEST);
@@ -90,9 +93,9 @@ public abstract class AbstractMetricRepositoryTest {
metricRepository.save(test); // should only be retrieved by execution id
List<String> flowMetricsNames = metricRepository.flowMetrics(null, "namespace", "flow");
List<String> taskMetricsNames = metricRepository.taskMetrics(null, "namespace", "flow", "task");
List<String> tasksWithMetrics = metricRepository.tasksWithMetrics(null, "namespace", "flow");
List<String> flowMetricsNames = metricRepository.flowMetrics(tenant, "namespace", "flow");
List<String> taskMetricsNames = metricRepository.taskMetrics(tenant, "namespace", "flow", "task");
List<String> tasksWithMetrics = metricRepository.tasksWithMetrics(tenant, "namespace", "flow");
assertThat(flowMetricsNames.size()).isEqualTo(2);
assertThat(taskMetricsNames.size()).isEqualTo(1);
@@ -101,17 +104,18 @@ public abstract class AbstractMetricRepositoryTest {
@Test
void findAllAsync() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
String executionId = FriendlyId.createFriendlyId();
TaskRun taskRun1 = taskRun(executionId, "task");
TaskRun taskRun1 = taskRun(tenant, executionId, "task");
MetricEntry counter = MetricEntry.of(taskRun1, counter("counter"), null);
TaskRun taskRun2 = taskRun(executionId, "task");
TaskRun taskRun2 = taskRun(tenant, executionId, "task");
MetricEntry timer = MetricEntry.of(taskRun2, timer(), null);
MetricEntry test = MetricEntry.of(taskRun2, counter("test"), ExecutionKind.TEST);
metricRepository.save(counter);
metricRepository.save(timer);
metricRepository.save(test); // should be retrieved as findAllAsync is used for backup
List<MetricEntry> results = metricRepository.findAllAsync(null).collectList().block();
List<MetricEntry> results = metricRepository.findAllAsync(tenant).collectList().block();
assertThat(results).hasSize(3);
}
@@ -123,8 +127,9 @@ public abstract class AbstractMetricRepositoryTest {
return Timer.of("counter", Duration.ofSeconds(5));
}
private TaskRun taskRun(String executionId, String taskId) {
private TaskRun taskRun(String tenantId, String executionId, String taskId) {
return TaskRun.builder()
.tenantId(tenantId)
.flowId("flow")
.namespace("namespace")
.executionId(executionId)

View File

@@ -4,6 +4,8 @@ import io.kestra.core.events.CrudEvent;
import io.kestra.core.events.CrudEventType;
import io.kestra.core.models.property.Property;
import io.kestra.core.models.templates.Template;
import io.kestra.core.utils.Await;
import io.kestra.core.utils.TestsUtils;
import io.kestra.plugin.core.debug.Return;
import io.kestra.core.utils.IdUtils;
import io.micronaut.context.event.ApplicationEventListener;
@@ -11,7 +13,10 @@ import io.micronaut.data.model.Pageable;
import io.kestra.core.junit.annotations.KestraTest;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;
import org.junit.jupiter.api.BeforeEach;
import java.time.Duration;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.TimeoutException;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
import java.io.IOException;
@@ -20,6 +25,8 @@ import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Optional;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import static org.assertj.core.api.Assertions.assertThat;
@@ -28,55 +35,60 @@ public abstract class AbstractTemplateRepositoryTest {
@Inject
protected TemplateRepositoryInterface templateRepository;
@BeforeEach
protected void init() throws IOException, URISyntaxException {
@BeforeAll
protected static void init() throws IOException, URISyntaxException {
TemplateListener.reset();
}
protected static Template.TemplateBuilder<?, ?> builder() {
return builder(null);
protected static Template.TemplateBuilder<?, ?> builder(String tenantId) {
return builder(tenantId, null);
}
protected static Template.TemplateBuilder<?, ?> builder(String namespace) {
protected static Template.TemplateBuilder<?, ?> builder(String tenantId, String namespace) {
return Template.builder()
.id(IdUtils.create())
.namespace(namespace == null ? "kestra.test" : namespace)
.tenantId(tenantId)
.tasks(Collections.singletonList(Return.builder().id("test").type(Return.class.getName()).format(Property.ofValue("test")).build()));
}
@Test
void findById() {
Template template = builder().build();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Template template = builder(tenant).build();
templateRepository.create(template);
Optional<Template> full = templateRepository.findById(null, template.getNamespace(), template.getId());
Optional<Template> full = templateRepository.findById(tenant, template.getNamespace(), template.getId());
assertThat(full.isPresent()).isTrue();
assertThat(full.get().getId()).isEqualTo(template.getId());
full = templateRepository.findById(null, template.getNamespace(), template.getId());
full = templateRepository.findById(tenant, template.getNamespace(), template.getId());
assertThat(full.isPresent()).isTrue();
assertThat(full.get().getId()).isEqualTo(template.getId());
}
@Test
void findByNamespace() {
Template template1 = builder().build();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Template template1 = builder(tenant).build();
Template template2 = Template.builder()
.id(IdUtils.create())
.tenantId(tenant)
.namespace("kestra.test.template").build();
templateRepository.create(template1);
templateRepository.create(template2);
List<Template> templates = templateRepository.findByNamespace(null, template1.getNamespace());
List<Template> templates = templateRepository.findByNamespace(tenant, template1.getNamespace());
assertThat(templates.size()).isGreaterThanOrEqualTo(1);
templates = templateRepository.findByNamespace(null, template2.getNamespace());
templates = templateRepository.findByNamespace(tenant, template2.getNamespace());
assertThat(templates.size()).isEqualTo(1);
}
@Test
void save() {
Template template = builder().build();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Template template = builder(tenant).build();
Template save = templateRepository.create(template);
assertThat(save.getId()).isEqualTo(template.getId());
@@ -84,41 +96,42 @@ public abstract class AbstractTemplateRepositoryTest {
@Test
void findAll() {
long saveCount = templateRepository.findAll(null).size();
Template template = builder().build();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
long saveCount = templateRepository.findAll(tenant).size();
Template template = builder(tenant).build();
templateRepository.create(template);
long size = templateRepository.findAll(null).size();
long size = templateRepository.findAll(tenant).size();
assertThat(size).isGreaterThan(saveCount);
templateRepository.delete(template);
assertThat((long) templateRepository.findAll(null).size()).isEqualTo(saveCount);
assertThat((long) templateRepository.findAll(tenant).size()).isEqualTo(saveCount);
}
@Test
void findAllForAllTenants() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
long saveCount = templateRepository.findAllForAllTenants().size();
Template template = builder().build();
Template template = builder(tenant).build();
templateRepository.create(template);
long size = templateRepository.findAllForAllTenants().size();
assertThat(size).isGreaterThan(saveCount);
templateRepository.delete(template);
assertThat((long) templateRepository.findAllForAllTenants().size()).isEqualTo(saveCount);
}
@Test
void find() {
Template template1 = builder().build();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Template template1 = builder(tenant).build();
templateRepository.create(template1);
Template template2 = builder().build();
Template template2 = builder(tenant).build();
templateRepository.create(template2);
Template template3 = builder().build();
Template template3 = builder(tenant).build();
templateRepository.create(template3);
// with pageable
List<Template> save = templateRepository.find(Pageable.from(1, 10),null, null, "kestra.test");
List<Template> save = templateRepository.find(Pageable.from(1, 10),null, tenant, "kestra.test");
assertThat((long) save.size()).isGreaterThanOrEqualTo(3L);
// without pageable
save = templateRepository.find(null, null, "kestra.test");
save = templateRepository.find(null, tenant, "kestra.test");
assertThat((long) save.size()).isGreaterThanOrEqualTo(3L);
templateRepository.delete(template1);
@@ -126,31 +139,45 @@ public abstract class AbstractTemplateRepositoryTest {
templateRepository.delete(template3);
}
private static final Logger LOG = LoggerFactory.getLogger(AbstractTemplateRepositoryTest.class);
@Test
void delete() {
Template template = builder().build();
protected void delete() throws TimeoutException {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Template template = builder(tenant).build();
Template save = templateRepository.create(template);
templateRepository.delete(save);
assertThat(templateRepository.findById(null, template.getNamespace(), template.getId()).isPresent()).isFalse();
assertThat(templateRepository.findById(tenant, template.getNamespace(), template.getId()).isPresent()).isFalse();
assertThat(TemplateListener.getEmits().size()).isEqualTo(2);
assertThat(TemplateListener.getEmits().stream().filter(r -> r.getType() == CrudEventType.CREATE).count()).isEqualTo(1L);
assertThat(TemplateListener.getEmits().stream().filter(r -> r.getType() == CrudEventType.DELETE).count()).isEqualTo(1L);
Await.until(() -> {
LOG.info("-------------> number of event: {}", TemplateListener.getEmits(tenant).size());
return TemplateListener.getEmits(tenant).size() == 2;
}, Duration.ofMillis(100), Duration.ofSeconds(5));
assertThat(TemplateListener.getEmits(tenant).stream().filter(r -> r.getType() == CrudEventType.CREATE).count()).isEqualTo(1L);
assertThat(TemplateListener.getEmits(tenant).stream().filter(r -> r.getType() == CrudEventType.DELETE).count()).isEqualTo(1L);
}
@Singleton
public static class TemplateListener implements ApplicationEventListener<CrudEvent<Template>> {
private static List<CrudEvent<Template>> emits = new ArrayList<>();
private static List<CrudEvent<Template>> emits = new CopyOnWriteArrayList<>();
@Override
public void onApplicationEvent(CrudEvent<Template> event) {
emits.add(event);
// The instanceof check is required because Micronaut may send non-Template events via this method
if ((event.getModel() != null && event.getModel() instanceof Template) ||
(event.getPreviousModel() != null && event.getPreviousModel() instanceof Template)) {
emits.add(event);
}
}
public static List<CrudEvent<Template>> getEmits() {
return emits;
public static List<CrudEvent<Template>> getEmits(String tenantId){
return emits.stream()
.filter(e -> (e.getModel() != null && e.getModel().getTenantId().equals(tenantId)) ||
(e.getPreviousModel() != null && e.getPreviousModel().getTenantId().equals(tenantId)))
.toList();
}
public static void reset() {

View File

@@ -9,6 +9,7 @@ import io.kestra.core.models.flows.State;
import io.kestra.core.models.triggers.Trigger;
import io.kestra.core.repositories.ExecutionRepositoryInterface.ChildFilter;
import io.kestra.core.utils.IdUtils;
import io.kestra.core.utils.TestsUtils;
import io.micronaut.data.model.Pageable;
import io.micronaut.data.model.Sort;
import jakarta.inject.Inject;
@@ -24,7 +25,6 @@ import java.util.Optional;
import java.util.stream.Stream;
import static io.kestra.core.models.flows.FlowScope.USER;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertThrows;
@@ -35,8 +35,9 @@ public abstract class AbstractTriggerRepositoryTest {
@Inject
protected TriggerRepositoryInterface triggerRepository;
private static Trigger.TriggerBuilder<?, ?> trigger() {
private static Trigger.TriggerBuilder<?, ?> trigger(String tenantId) {
return Trigger.builder()
.tenantId(tenantId)
.flowId(IdUtils.create())
.namespace(TEST_NAMESPACE)
.triggerId(IdUtils.create())
@@ -44,9 +45,9 @@ public abstract class AbstractTriggerRepositoryTest {
.date(ZonedDateTime.now());
}
protected static Trigger generateDefaultTrigger(){
protected static Trigger generateDefaultTrigger(String tenantId){
Trigger trigger = Trigger.builder()
.tenantId(MAIN_TENANT)
.tenantId(tenantId)
.triggerId("triggerId")
.namespace("trigger.namespace")
.flowId("flowId")
@@ -59,9 +60,10 @@ public abstract class AbstractTriggerRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_all(QueryFilter filter){
triggerRepository.save(generateDefaultTrigger());
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
triggerRepository.save(generateDefaultTrigger(tenant));
ArrayListTotal<Trigger> entries = triggerRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter));
ArrayListTotal<Trigger> entries = triggerRepository.find(Pageable.UNPAGED, tenant, List.of(filter));
assertThat(entries).hasSize(1);
}
@@ -69,9 +71,10 @@ public abstract class AbstractTriggerRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_all_async(QueryFilter filter){
triggerRepository.save(generateDefaultTrigger());
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
triggerRepository.save(generateDefaultTrigger(tenant));
List<Trigger> entries = triggerRepository.find(MAIN_TENANT, List.of(filter)).collectList().block();
List<Trigger> entries = triggerRepository.find(tenant, List.of(filter)).collectList().block();
assertThat(entries).hasSize(1);
}
@@ -92,7 +95,7 @@ public abstract class AbstractTriggerRepositoryTest {
@ParameterizedTest
@MethodSource("errorFilterCombinations")
void should_fail_to_find_all(QueryFilter filter){
assertThrows(InvalidQueryFiltersException.class, () -> triggerRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter)));
assertThrows(InvalidQueryFiltersException.class, () -> triggerRepository.find(Pageable.UNPAGED, TestsUtils.randomTenant(this.getClass().getSimpleName()), List.of(filter)));
}
static Stream<QueryFilter> errorFilterCombinations() {
@@ -110,7 +113,8 @@ public abstract class AbstractTriggerRepositoryTest {
@Test
void all() {
Trigger.TriggerBuilder<?, ?> builder = trigger();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Trigger.TriggerBuilder<?, ?> builder = trigger(tenant);
Optional<Trigger> findLast = triggerRepository.findLast(builder.build());
assertThat(findLast.isPresent()).isFalse();
@@ -130,47 +134,47 @@ public abstract class AbstractTriggerRepositoryTest {
assertThat(findLast.get().getExecutionId()).isEqualTo(save.getExecutionId());
triggerRepository.save(trigger().build());
triggerRepository.save(trigger().build());
Trigger searchedTrigger = trigger().build();
triggerRepository.save(trigger(tenant).build());
triggerRepository.save(trigger(tenant).build());
Trigger searchedTrigger = trigger(tenant).build();
triggerRepository.save(searchedTrigger);
List<Trigger> all = triggerRepository.findAllForAllTenants();
assertThat(all.size()).isEqualTo(4);
assertThat(all.size()).isGreaterThanOrEqualTo(4);
all = triggerRepository.findAll(null);
all = triggerRepository.findAll(tenant);
assertThat(all.size()).isEqualTo(4);
String namespacePrefix = "io.kestra.another";
String namespace = namespacePrefix + ".ns";
Trigger trigger = trigger().namespace(namespace).build();
Trigger trigger = trigger(tenant).namespace(namespace).build();
triggerRepository.save(trigger);
List<Trigger> find = triggerRepository.find(Pageable.from(1, 4, Sort.of(Sort.Order.asc("namespace"))), null, null, null, null, null);
List<Trigger> find = triggerRepository.find(Pageable.from(1, 4, Sort.of(Sort.Order.asc("namespace"))), null, tenant, null, null, null);
assertThat(find.size()).isEqualTo(4);
assertThat(find.getFirst().getNamespace()).isEqualTo(namespace);
find = triggerRepository.find(Pageable.from(1, 4, Sort.of(Sort.Order.asc("namespace"))), null, null, null, searchedTrigger.getFlowId(), null);
find = triggerRepository.find(Pageable.from(1, 4, Sort.of(Sort.Order.asc("namespace"))), null, tenant, null, searchedTrigger.getFlowId(), null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getFlowId()).isEqualTo(searchedTrigger.getFlowId());
find = triggerRepository.find(Pageable.from(1, 100, Sort.of(Sort.Order.asc(triggerRepository.sortMapping().apply("triggerId")))), null, null, namespacePrefix, null, null);
find = triggerRepository.find(Pageable.from(1, 100, Sort.of(Sort.Order.asc(triggerRepository.sortMapping().apply("triggerId")))), null, tenant, namespacePrefix, null, null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getTriggerId()).isEqualTo(trigger.getTriggerId());
// Full text search is on namespace, flowId, triggerId, executionId
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), trigger.getNamespace(), null, null, null, null);
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), trigger.getNamespace(), tenant, null, null, null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getTriggerId()).isEqualTo(trigger.getTriggerId());
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), searchedTrigger.getFlowId(), null, null, null, null);
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), searchedTrigger.getFlowId(), tenant, null, null, null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getTriggerId()).isEqualTo(searchedTrigger.getTriggerId());
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), searchedTrigger.getTriggerId(), null, null, null, null);
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), searchedTrigger.getTriggerId(), tenant, null, null, null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getTriggerId()).isEqualTo(searchedTrigger.getTriggerId());
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), searchedTrigger.getExecutionId(), null, null, null, null);
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), searchedTrigger.getExecutionId(), tenant, null, null, null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getTriggerId()).isEqualTo(searchedTrigger.getTriggerId());
}
@@ -178,15 +182,17 @@ public abstract class AbstractTriggerRepositoryTest {
@Test
void shouldCountForNullTenant() {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
triggerRepository.save(Trigger
.builder()
.tenantId(tenant)
.triggerId(IdUtils.create())
.flowId(IdUtils.create())
.namespace("io.kestra.unittest")
.build()
);
// When
int count = triggerRepository.count(null);
int count = triggerRepository.count(tenant);
// Then
assertThat(count).isEqualTo(1);
}

View File

@@ -1,88 +1,92 @@
package io.kestra.core.repositories;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import com.google.common.collect.ImmutableMap;
import io.kestra.core.models.executions.*;
import io.kestra.core.models.flows.State;
import io.kestra.core.utils.IdUtils;
import java.time.Duration;
import java.util.Collections;
class ExecutionFixture {
public static final Execution EXECUTION_1 = Execution.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.tenantId(MAIN_TENANT)
.flowId("full")
.flowRevision(1)
.state(new State())
.inputs(ImmutableMap.of("test", "value"))
.taskRunList(Collections.singletonList(
TaskRun.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.flowId("full")
.state(new State())
.attempts(Collections.singletonList(
TaskRunAttempt.builder()
.build()
))
.outputs(Variables.inMemory(ImmutableMap.of(
"out", "value"
)))
.build()
))
.build();
public static Execution EXECUTION_1(String tenant) {
return Execution.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.tenantId(tenant)
.flowId("full")
.flowRevision(1)
.state(new State())
.inputs(ImmutableMap.of("test", "value"))
.taskRunList(Collections.singletonList(
TaskRun.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.flowId("full")
.state(new State())
.attempts(Collections.singletonList(
TaskRunAttempt.builder()
.build()
))
.outputs(Variables.inMemory(ImmutableMap.of(
"out", "value"
)))
.build()
))
.build();
}
public static final Execution EXECUTION_2 = Execution.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.tenantId(MAIN_TENANT)
.flowId("full")
.flowRevision(1)
.state(new State())
.inputs(ImmutableMap.of("test", 1))
.taskRunList(Collections.singletonList(
TaskRun.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.flowId("full")
.state(new State())
.attempts(Collections.singletonList(
TaskRunAttempt.builder()
.build()
))
.outputs(Variables.inMemory(ImmutableMap.of(
"out", 1
)))
.build()
))
.build();
public static Execution EXECUTION_2(String tenant) {
return Execution.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.tenantId(tenant)
.flowId("full")
.flowRevision(1)
.state(new State())
.inputs(ImmutableMap.of("test", 1))
.taskRunList(Collections.singletonList(
TaskRun.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.flowId("full")
.state(new State())
.attempts(Collections.singletonList(
TaskRunAttempt.builder()
.build()
))
.outputs(Variables.inMemory(ImmutableMap.of(
"out", 1
)))
.build()
))
.build();
}
public static final Execution EXECUTION_TEST = Execution.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.flowId("full")
.flowRevision(1)
.state(new State())
.inputs(ImmutableMap.of("test", 1))
.kind(ExecutionKind.TEST)
.taskRunList(Collections.singletonList(
TaskRun.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.flowId("full")
.state(new State())
.attempts(Collections.singletonList(
TaskRunAttempt.builder()
.build()
))
.outputs(Variables.inMemory(ImmutableMap.of(
"out", 1
)))
.build()
))
.build();
}
public static Execution EXECUTION_TEST(String tenant) {
return Execution.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.tenantId(tenant)
.flowId("full")
.flowRevision(1)
.state(new State())
.inputs(ImmutableMap.of("test", 1))
.kind(ExecutionKind.TEST)
.taskRunList(Collections.singletonList(
TaskRun.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.flowId("full")
.state(new State())
.attempts(Collections.singletonList(
TaskRunAttempt.builder()
.build()
))
.outputs(Variables.inMemory(ImmutableMap.of(
"out", 1
)))
.build()
))
.build();
}
}
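With the fixtures turned from shared constants into factory methods, each caller now supplies its own tenant. A minimal usage sketch (the surrounding test body is an assumption; EXECUTION_1(tenant), getTenantId(), and TestsUtils.randomTenant(...) are all taken from this diff):

// Sketch only: build a per-tenant fixture instead of reusing a shared static one.
String tenant = TestsUtils.randomTenant(getClass().getSimpleName());
Execution execution = ExecutionFixture.EXECUTION_1(tenant);
assertThat(execution.getTenantId()).isEqualTo(tenant);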

View File

@@ -1,8 +1,5 @@
package io.kestra.core.runners;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import io.kestra.core.junit.annotations.ExecuteFlow;
import io.kestra.core.junit.annotations.FlakyTest;
import io.kestra.core.junit.annotations.KestraTest;
@@ -13,24 +10,22 @@ import io.kestra.core.models.flows.State;
import io.kestra.core.queues.QueueException;
import io.kestra.core.queues.QueueFactoryInterface;
import io.kestra.core.queues.QueueInterface;
import io.kestra.plugin.core.flow.EachSequentialTest;
import io.kestra.plugin.core.flow.FlowCaseTest;
import io.kestra.plugin.core.flow.ForEachItemCaseTest;
import io.kestra.plugin.core.flow.PauseTest;
import io.kestra.plugin.core.flow.LoopUntilCaseTest;
import io.kestra.plugin.core.flow.WorkingDirectoryTest;
import io.kestra.plugin.core.flow.*;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import java.util.Map;
import java.util.concurrent.TimeoutException;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestInstance;
import org.junitpioneer.jupiter.RetryingTest;
import java.util.Map;
import java.util.concurrent.TimeoutException;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest(startRunner = true)
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
//@org.junit.jupiter.api.parallel.Execution(org.junit.jupiter.api.parallel.ExecutionMode.CONCURRENT)
// must be per-class so that init(), which takes a long time, is called only once
public abstract class AbstractRunnerTest {
@@ -115,7 +110,7 @@ public abstract class AbstractRunnerTest {
assertThat(execution.getTaskRunList()).hasSize(8);
}
@RetryingTest(5)
@Test
@ExecuteFlow("flows/valids/parallel-nested.yaml")
void parallelNested(Execution execution) {
assertThat(execution.getTaskRunList()).hasSize(11);
@@ -157,25 +152,25 @@ public abstract class AbstractRunnerTest {
restartCaseTest.restartFailedThenSuccess();
}
@RetryingTest(5)
@Test
@LoadFlows({"flows/valids/restart-each.yaml"})
void replay() throws Exception {
restartCaseTest.replay();
}
@RetryingTest(5)
@Test
@LoadFlows({"flows/valids/failed-first.yaml"})
void restartMultiple() throws Exception {
restartCaseTest.restartMultiple();
}
@RetryingTest(5) // Flaky on CI but never locally even with 100 repetitions
@Test
@LoadFlows({"flows/valids/restart_always_failed.yaml"})
void restartFailedThenFailureWithGlobalErrors() throws Exception {
restartCaseTest.restartFailedThenFailureWithGlobalErrors();
}
@RetryingTest(5)
@Test
@LoadFlows({"flows/valids/restart_local_errors.yaml"})
void restartFailedThenFailureWithLocalErrors() throws Exception {
restartCaseTest.restartFailedThenFailureWithLocalErrors();
@@ -199,7 +194,7 @@ public abstract class AbstractRunnerTest {
restartCaseTest.restartFailedWithAfterExecution();
}
@RetryingTest(5)
@Test
@LoadFlows({"flows/valids/trigger-flow-listener-no-inputs.yaml",
"flows/valids/trigger-flow-listener.yaml",
"flows/valids/trigger-flow-listener-namespace-condition.yaml",
@@ -208,7 +203,7 @@ public abstract class AbstractRunnerTest {
flowTriggerCaseTest.trigger();
}
@RetryingTest(5) // flaky on CI but never fails locally
@Test // flaky on CI but never fails locally
@LoadFlows({"flows/valids/trigger-flow-listener-with-pause.yaml",
"flows/valids/trigger-flow-with-pause.yaml"})
void flowTriggerWithPause() throws Exception {
@@ -232,7 +227,7 @@ public abstract class AbstractRunnerTest {
multipleConditionTriggerCaseTest.trigger();
}
@RetryingTest(5) // Flaky on CI but never locally even with 100 repetitions
@Test // Flaky on CI but never locally even with 100 repetitions
@LoadFlows({"flows/valids/trigger-flow-listener-namespace-condition.yaml",
"flows/valids/trigger-multiplecondition-flow-c.yaml",
"flows/valids/trigger-multiplecondition-flow-d.yaml"})
@@ -248,6 +243,7 @@ public abstract class AbstractRunnerTest {
multipleConditionTriggerCaseTest.flowTriggerPreconditions();
}
@Disabled
@Test
@LoadFlows({"flows/valids/flow-trigger-preconditions-flow-listen.yaml",
"flows/valids/flow-trigger-preconditions-flow-a.yaml",
@@ -262,7 +258,7 @@ public abstract class AbstractRunnerTest {
multipleConditionTriggerCaseTest.flowTriggerOnPaused();
}
@RetryingTest(5)
@Test
@LoadFlows({"flows/valids/each-null.yaml"})
void eachWithNull() throws Exception {
EachSequentialTest.eachNullTest(runnerUtils, logsQueue);
@@ -274,7 +270,7 @@ public abstract class AbstractRunnerTest {
pluginDefaultsCaseTest.taskDefaults();
}
@RetryingTest(5)
@Test
@LoadFlows({"flows/valids/switch.yaml",
"flows/valids/task-flow.yaml",
"flows/valids/task-flow-inherited-labels.yaml"})
@@ -305,9 +301,9 @@ public abstract class AbstractRunnerTest {
}
@Test
@LoadFlows({"flows/valids/working-directory.yaml"})
@LoadFlows(value = {"flows/valids/working-directory.yaml"}, tenantId = "tenant1")
public void workerFailed() throws Exception {
workingDirectoryTest.failed(runnerUtils);
workingDirectoryTest.failed("tenant1", runnerUtils);
}
@Test
@@ -322,7 +318,7 @@ public abstract class AbstractRunnerTest {
workingDirectoryTest.cache(runnerUtils);
}
@RetryingTest(5) // flaky on MySQL
@Test // flaky on MySQL
@LoadFlows({"flows/valids/pause-test.yaml"})
public void pauseRun() throws Exception {
pauseTest.run(runnerUtils);
@@ -358,40 +354,42 @@ public abstract class AbstractRunnerTest {
skipExecutionCaseTest.skipExecution();
}
@RetryingTest(5)
@Disabled
@Test
@LoadFlows({"flows/valids/for-each-item-subflow.yaml",
"flows/valids/for-each-item.yaml"})
protected void forEachItem() throws Exception {
forEachItemCaseTest.forEachItem();
}
@RetryingTest(5)
@Test
@LoadFlows({"flows/valids/for-each-item.yaml"})
protected void forEachItemEmptyItems() throws Exception {
forEachItemCaseTest.forEachItemEmptyItems();
}
@RetryingTest(5)
@Disabled
@Test
@LoadFlows({"flows/valids/for-each-item-subflow-failed.yaml",
"flows/valids/for-each-item-failed.yaml"})
protected void forEachItemFailed() throws Exception {
forEachItemCaseTest.forEachItemFailed();
}
@RetryingTest(5)
@Test
@LoadFlows({"flows/valids/for-each-item-outputs-subflow.yaml",
"flows/valids/for-each-item-outputs.yaml"})
protected void forEachItemSubflowOutputs() throws Exception {
forEachItemCaseTest.forEachItemWithSubflowOutputs();
}
@RetryingTest(5) // flaky on CI but always passes locally even with 100 iterations
@Test // flaky on CI but always passes locally even with 100 iterations
@LoadFlows({"flows/valids/restart-for-each-item.yaml", "flows/valids/restart-child.yaml"})
void restartForEachItem() throws Exception {
forEachItemCaseTest.restartForEachItem();
}
@RetryingTest(5)
@Test
@LoadFlows({"flows/valids/for-each-item-subflow.yaml",
"flows/valids/for-each-item-in-if.yaml"})
protected void forEachItemInIf() throws Exception {
@@ -441,6 +439,7 @@ public abstract class AbstractRunnerTest {
flowConcurrencyCaseTest.flowConcurrencyWithForEachItem();
}
@Disabled
@Test
@LoadFlows({"flows/valids/flow-concurrency-queue-fail.yml"})
protected void concurrencyQueueRestarted() throws Exception {

View File

@@ -20,6 +20,7 @@ import io.kestra.plugin.core.debug.Return;
import io.kestra.plugin.core.flow.Pause;
import jakarta.inject.Inject;
import lombok.extern.slf4j.Slf4j;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
import org.junitpioneer.jupiter.RetryingTest;
import org.slf4j.event.Level;
@@ -40,6 +41,10 @@ import static org.junit.jupiter.api.Assertions.assertThrows;
@Slf4j
@KestraTest(startRunner = true)
class ExecutionServiceTest {
public static final String TENANT_1 = "tenant1";
public static final String TENANT_2 = "tenant2";
public static final String TENANT_3 = "tenant3";
@Inject
ExecutionService executionService;
@@ -75,13 +80,13 @@ class ExecutionServiceTest {
}
@Test
@LoadFlows({"flows/valids/restart_last_failed.yaml"})
@LoadFlows(value = {"flows/valids/restart_last_failed.yaml"}, tenantId = TENANT_1)
void restartSimpleRevision() throws Exception {
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "restart_last_failed");
Execution execution = runnerUtils.runOne(TENANT_1, "io.kestra.tests", "restart_last_failed");
assertThat(execution.getTaskRunList()).hasSize(3);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.FAILED);
FlowWithSource flow = flowRepository.findByIdWithSource(MAIN_TENANT, "io.kestra.tests", "restart_last_failed").orElseThrow();
FlowWithSource flow = flowRepository.findByIdWithSource(TENANT_1, "io.kestra.tests", "restart_last_failed").orElseThrow();
flowRepository.update(
GenericFlow.of(flow),
flow.updateTask(
@@ -124,9 +129,9 @@ class ExecutionServiceTest {
}
@RetryingTest(5)
@LoadFlows({"flows/valids/restart-each.yaml"})
@LoadFlows(value = {"flows/valids/restart-each.yaml"}, tenantId = TENANT_1)
void restartFlowable2() throws Exception {
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "restart-each", null, (f, e) -> ImmutableMap.of("failed", "SECOND"));
Execution execution = runnerUtils.runOne(TENANT_1, "io.kestra.tests", "restart-each", null, (f, e) -> ImmutableMap.of("failed", "SECOND"));
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.FAILED);
Execution restart = executionService.restart(execution, null);
@@ -177,9 +182,9 @@ class ExecutionServiceTest {
}
@Test
@LoadFlows({"flows/valids/logs.yaml"})
@LoadFlows(value = {"flows/valids/logs.yaml"}, tenantId = TENANT_1)
void replaySimple() throws Exception {
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "logs");
Execution execution = runnerUtils.runOne(TENANT_1, "io.kestra.tests", "logs");
assertThat(execution.getTaskRunList()).hasSize(5);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
@@ -196,9 +201,9 @@ class ExecutionServiceTest {
}
@Test
@LoadFlows({"flows/valids/restart-each.yaml"})
@LoadFlows(value = {"flows/valids/restart-each.yaml"}, tenantId = TENANT_2)
void replayFlowable() throws Exception {
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "restart-each", null, (f, e) -> ImmutableMap.of("failed", "NO"));
Execution execution = runnerUtils.runOne(TENANT_2, "io.kestra.tests", "restart-each", null, (f, e) -> ImmutableMap.of("failed", "NO"));
assertThat(execution.getTaskRunList()).hasSize(20);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
@@ -213,6 +218,7 @@ class ExecutionServiceTest {
assertThat(restart.getLabels()).contains(new Label(Label.REPLAY, "true"));
}
@Disabled
@Test
@LoadFlows({"flows/valids/parallel-nested.yaml"})
void replayParallel() throws Exception {
@@ -234,7 +240,7 @@ class ExecutionServiceTest {
}
@Test
@ExecuteFlow("flows/valids/each-sequential-nested.yaml")
@ExecuteFlow(value = "flows/valids/each-sequential-nested.yaml", tenantId = TENANT_2)
void replayEachSeq(Execution execution) throws Exception {
assertThat(execution.getTaskRunList()).hasSize(23);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
@@ -253,7 +259,7 @@ class ExecutionServiceTest {
}
@Test
@ExecuteFlow("flows/valids/each-sequential-nested.yaml")
@ExecuteFlow(value = "flows/valids/each-sequential-nested.yaml", tenantId = TENANT_1)
void replayEachSeq2(Execution execution) throws Exception {
assertThat(execution.getTaskRunList()).hasSize(23);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
@@ -312,9 +318,9 @@ class ExecutionServiceTest {
}
@Test
@LoadFlows({"flows/valids/each-parallel-nested.yaml"})
@LoadFlows(value = {"flows/valids/each-parallel-nested.yaml"}, tenantId = TENANT_1)
void markAsEachPara() throws Exception {
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "each-parallel-nested");
Execution execution = runnerUtils.runOne(TENANT_1, "io.kestra.tests", "each-parallel-nested");
Flow flow = flowRepository.findByExecution(execution);
assertThat(execution.getTaskRunList()).hasSize(11);
@@ -364,9 +370,9 @@ class ExecutionServiceTest {
}
@Test
@LoadFlows({"flows/valids/pause-test.yaml"})
@LoadFlows(value = {"flows/valids/pause-test.yaml"}, tenantId = TENANT_1)
void resumePausedToKilling() throws Exception {
Execution execution = runnerUtils.runOneUntilPaused(MAIN_TENANT, "io.kestra.tests", "pause-test");
Execution execution = runnerUtils.runOneUntilPaused(TENANT_1, "io.kestra.tests", "pause-test");
Flow flow = flowRepository.findByExecution(execution);
assertThat(execution.getTaskRunList()).hasSize(1);
@@ -379,7 +385,7 @@ class ExecutionServiceTest {
}
@Test
@ExecuteFlow("flows/valids/logs.yaml")
@ExecuteFlow(value = "flows/valids/logs.yaml", tenantId = TENANT_2)
void deleteExecution(Execution execution) throws IOException, TimeoutException {
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
Await.until(() -> logRepository.findByExecutionId(execution.getTenantId(), execution.getId(), Level.TRACE).size() == 5, Duration.ofMillis(10), Duration.ofSeconds(5));
@@ -391,7 +397,7 @@ class ExecutionServiceTest {
}
@Test
@ExecuteFlow("flows/valids/logs.yaml")
@ExecuteFlow(value = "flows/valids/logs.yaml", tenantId = TENANT_3)
void deleteExecutionKeepLogs(Execution execution) throws IOException, TimeoutException {
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
Await.until(() -> logRepository.findByExecutionId(execution.getTenantId(), execution.getId(), Level.TRACE).size() == 5, Duration.ofMillis(10), Duration.ofSeconds(5));
@@ -431,9 +437,9 @@ class ExecutionServiceTest {
}
@Test
@LoadFlows({"flows/valids/pause_no_tasks.yaml"})
@LoadFlows(value = {"flows/valids/pause_no_tasks.yaml"}, tenantId = TENANT_1)
void killToState() throws Exception {
Execution execution = runnerUtils.runOneUntilPaused(MAIN_TENANT, "io.kestra.tests", "pause_no_tasks");
Execution execution = runnerUtils.runOneUntilPaused(TENANT_1, "io.kestra.tests", "pause_no_tasks");
Flow flow = flowRepository.findByExecution(execution);
Execution killed = executionService.kill(execution, flow, Optional.of(State.Type.CANCELLED));

View File

@@ -18,11 +18,14 @@ import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.Map;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest(rebuildContext = true)
@Execution(ExecutionMode.SAME_THREAD)
class FilesServiceTest {
@Inject
private TestRunContextFactory runContextFactory;


@@ -17,6 +17,9 @@ import io.kestra.core.storages.StorageInterface;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;
import org.junit.jupiter.api.Test;
import jakarta.validation.ConstraintViolationException;
@@ -36,6 +39,7 @@ import java.util.concurrent.TimeoutException;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertThrows;
import static org.junit.jupiter.api.Assertions.assertTrue;
@KestraTest(startRunner = true)
public class InputsTest {
@@ -90,8 +94,8 @@ public class InputsTest {
@Inject
private FlowInputOutput flowInputOutput;
private Map<String, Object> typedInputs(Map<String, Object> map) {
return typedInputs(map, flowRepository.findById(MAIN_TENANT, "io.kestra.tests", "inputs").get());
private Map<String, Object> typedInputs(Map<String, Object> map, String tenantId) {
return typedInputs(map, flowRepository.findById(tenantId, "io.kestra.tests", "inputs").get());
}
private Map<String, Object> typedInputs(Map<String, Object> map, Flow flow) {
@@ -100,7 +104,7 @@ public class InputsTest {
Execution.builder()
.id("test")
.namespace(flow.getNamespace())
.tenantId(MAIN_TENANT)
.tenantId(flow.getTenantId())
.flowRevision(1)
.flowId(flow.getId())
.build(),
@@ -113,25 +117,25 @@ public class InputsTest {
void missingRequired() {
HashMap<String, Object> inputs = new HashMap<>(InputsTest.inputs);
inputs.put("string", null);
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(inputs));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(inputs, MAIN_TENANT));
assertThat(e.getMessage()).contains("Invalid input for `string`, missing required input, but received `null`");
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant")
void nonRequiredNoDefaultNoValueIsNull() {
HashMap<String, Object> inputsWithMissingOptionalInput = new HashMap<>(inputs);
inputsWithMissingOptionalInput.remove("bool");
assertThat(typedInputs(inputsWithMissingOptionalInput).containsKey("bool")).isTrue();
assertThat(typedInputs(inputsWithMissingOptionalInput).get("bool")).isNull();
assertThat(typedInputs(inputsWithMissingOptionalInput, "tenant").containsKey("bool")).isTrue();
assertThat(typedInputs(inputsWithMissingOptionalInput, "tenant").get("bool")).isNull();
}
@SuppressWarnings("unchecked")
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant1")
void allValidInputs() throws URISyntaxException, IOException {
Map<String, Object> typeds = typedInputs(inputs);
Map<String, Object> typeds = typedInputs(inputs, "tenant1");
assertThat(typeds.get("string")).isEqualTo("myString");
assertThat(typeds.get("int")).isEqualTo(42);
@@ -143,7 +147,7 @@ public class InputsTest {
assertThat(typeds.get("time")).isEqualTo(LocalTime.parse("18:27:49"));
assertThat(typeds.get("duration")).isEqualTo(Duration.parse("PT5M6S"));
assertThat((URI) typeds.get("file")).isEqualTo(new URI("kestra:///io/kestra/tests/inputs/executions/test/inputs/file/application-test.yml"));
assertThat(CharStreams.toString(new InputStreamReader(storageInterface.get(MAIN_TENANT, null, (URI) typeds.get("file"))))).isEqualTo(CharStreams.toString(new InputStreamReader(new FileInputStream((String) inputs.get("file")))));
assertThat(CharStreams.toString(new InputStreamReader(storageInterface.get("tenant1", null, (URI) typeds.get("file"))))).isEqualTo(CharStreams.toString(new InputStreamReader(new FileInputStream((String) inputs.get("file")))));
assertThat(typeds.get("json")).isEqualTo(Map.of("a", "b"));
assertThat(typeds.get("uri")).isEqualTo("https://www.google.com");
assertThat(((Map<String, Object>) typeds.get("nested")).get("string")).isEqualTo("a string");
@@ -166,9 +170,9 @@ public class InputsTest {
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant2")
void allValidTypedInputs() {
Map<String, Object> typeds = typedInputs(inputs);
Map<String, Object> typeds = typedInputs(inputs, "tenant2");
typeds.put("int", 42);
typeds.put("float", 42.42F);
typeds.put("bool", false);
@@ -181,10 +185,10 @@ public class InputsTest {
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant3")
void inputFlow() throws TimeoutException, QueueException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
"tenant3",
"io.kestra.tests",
"inputs",
null,
@@ -201,165 +205,165 @@ public class InputsTest {
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant4")
void inputValidatedStringBadValue() {
HashMap<String, Object> map = new HashMap<>(inputs);
map.put("validatedString", "foo");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map, "tenant4"));
assertThat(e.getMessage()).contains("Invalid input for `validatedString`, it must match the pattern");
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant5")
void inputValidatedIntegerBadValue() {
HashMap<String, Object> mapMin = new HashMap<>(inputs);
mapMin.put("validatedInt", "9");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin, "tenant5"));
assertThat(e.getMessage()).contains("Invalid input for `validatedInt`, it must be more than `10`, but received `9`");
HashMap<String, Object> mapMax = new HashMap<>(inputs);
mapMax.put("validatedInt", "21");
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax));
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax, "tenant5"));
assertThat(e.getMessage()).contains("Invalid input for `validatedInt`, it must be less than `20`, but received `21`");
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant6")
void inputValidatedDateBadValue() {
HashMap<String, Object> mapMin = new HashMap<>(inputs);
mapMin.put("validatedDate", "2022-01-01");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin, "tenant6"));
assertThat(e.getMessage()).contains("Invalid input for `validatedDate`, it must be after `2023-01-01`, but received `2022-01-01`");
HashMap<String, Object> mapMax = new HashMap<>(inputs);
mapMax.put("validatedDate", "2024-01-01");
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax));
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax, "tenant6"));
assertThat(e.getMessage()).contains("Invalid input for `validatedDate`, it must be before `2023-12-31`, but received `2024-01-01`");
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant7")
void inputValidatedDateTimeBadValue() {
HashMap<String, Object> mapMin = new HashMap<>(inputs);
mapMin.put("validatedDateTime", "2022-01-01T00:00:00Z");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin, "tenant7"));
assertThat(e.getMessage()).contains("Invalid input for `validatedDateTime`, it must be after `2023-01-01T00:00:00Z`, but received `2022-01-01T00:00:00Z`");
HashMap<String, Object> mapMax = new HashMap<>(inputs);
mapMax.put("validatedDateTime", "2024-01-01T00:00:00Z");
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax));
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax, "tenant7"));
assertThat(e.getMessage()).contains("Invalid input for `validatedDateTime`, it must be before `2023-12-31T23:59:59Z`");
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant8")
void inputValidatedDurationBadValue() {
HashMap<String, Object> mapMin = new HashMap<>(inputs);
mapMin.put("validatedDuration", "PT1S");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin, "tenant8"));
assertThat(e.getMessage()).contains("Invalid input for `validatedDuration`, It must be more than `PT10S`, but received `PT1S`");
HashMap<String, Object> mapMax = new HashMap<>(inputs);
mapMax.put("validatedDuration", "PT30S");
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax));
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax, "tenant8"));
assertThat(e.getMessage()).contains("Invalid input for `validatedDuration`, It must be less than `PT20S`, but received `PT30S`");
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant9")
void inputValidatedFloatBadValue() {
HashMap<String, Object> mapMin = new HashMap<>(inputs);
mapMin.put("validatedFloat", "0.01");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin, "tenant9"));
assertThat(e.getMessage()).contains("Invalid input for `validatedFloat`, it must be more than `0.1`, but received `0.01`");
HashMap<String, Object> mapMax = new HashMap<>(inputs);
mapMax.put("validatedFloat", "1.01");
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax));
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax, "tenant9"));
assertThat(e.getMessage()).contains("Invalid input for `validatedFloat`, it must be less than `0.5`, but received `1.01`");
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant10")
void inputValidatedTimeBadValue() {
HashMap<String, Object> mapMin = new HashMap<>(inputs);
mapMin.put("validatedTime", "00:00:01");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin, "tenant10"));
assertThat(e.getMessage()).contains("Invalid input for `validatedTime`, it must be after `01:00`, but received `00:00:01`");
HashMap<String, Object> mapMax = new HashMap<>(inputs);
mapMax.put("validatedTime", "14:00:00");
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax));
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax, "tenant10"));
assertThat(e.getMessage()).contains("Invalid input for `validatedTime`, it must be before `11:59:59`, but received `14:00:00`");
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant11")
void inputFailed() {
HashMap<String, Object> map = new HashMap<>(inputs);
map.put("uri", "http:/bla");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map, "tenant11"));
assertThat(e.getMessage()).contains("Invalid input for `uri`, Expected `URI` but received `http:/bla`, but received `http:/bla`");
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant12")
void inputEnumFailed() {
HashMap<String, Object> map = new HashMap<>(inputs);
map.put("enum", "INVALID");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map, "tenant12"));
assertThat(e.getMessage()).isEqualTo("enum: Invalid input for `enum`, it must match the values `[ENUM_VALUE, OTHER_ONE]`, but received `INVALID`");
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant13")
void inputArrayFailed() {
HashMap<String, Object> map = new HashMap<>(inputs);
map.put("array", "[\"s1\", \"s2\"]");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map, "tenant13"));
assertThat(e.getMessage()).contains("Invalid input for `array`, Unable to parse array element as `INT` on `s1`, but received `[\"s1\", \"s2\"]`");
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant14")
void inputEmptyJson() {
HashMap<String, Object> map = new HashMap<>(inputs);
map.put("json", "{}");
Map<String, Object> typeds = typedInputs(map);
Map<String, Object> typeds = typedInputs(map, "tenant14");
assertThat(typeds.get("json")).isInstanceOf(Map.class);
assertThat(((Map<?, ?>) typeds.get("json")).size()).isZero();
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant15")
void inputEmptyJsonFlow() throws TimeoutException, QueueException {
HashMap<String, Object> map = new HashMap<>(inputs);
map.put("json", "{}");
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
"tenant15",
"io.kestra.tests",
"inputs",
null,
@@ -375,12 +379,20 @@ public class InputsTest {
}
@Test
@LoadFlows({"flows/valids/input-log-secret.yaml"})
void shouldNotLogSecretInput() throws TimeoutException, QueueException {
Flux<LogEntry> receive = TestsUtils.receive(logQueue, l -> {});
@LoadFlows(value = {"flows/valids/input-log-secret.yaml"}, tenantId = "tenant16")
void shouldNotLogSecretInput() throws TimeoutException, QueueException, InterruptedException {
AtomicReference<LogEntry> logEntry = new AtomicReference<>();
CountDownLatch countDownLatch = new CountDownLatch(1);
Flux<LogEntry> receive = TestsUtils.receive(logQueue, l -> {
LogEntry left = l.getLeft();
if (left.getTenantId().equals("tenant16")){
logEntry.set(left);
countDownLatch.countDown();
}
});
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
"tenant16",
"io.kestra.tests",
"input-log-secret",
null,
@@ -390,20 +402,21 @@ public class InputsTest {
assertThat(execution.getTaskRunList()).hasSize(1);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
var logEntry = receive.blockLast();
assertThat(logEntry).isNotNull();
assertThat(logEntry.getMessage()).isEqualTo("These are my secrets: ****** - ******");
receive.blockLast();
assertTrue(countDownLatch.await(10, TimeUnit.SECONDS));
assertThat(logEntry.get()).isNotNull();
assertThat(logEntry.get().getMessage()).isEqualTo("These are my secrets: ****** - ******");
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant17")
void fileInputWithFileDefault() throws IOException, QueueException, TimeoutException {
HashMap<String, Object> newInputs = new HashMap<>(InputsTest.inputs);
URI file = createFile();
newInputs.put("file", file);
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
"tenant17",
"io.kestra.tests",
"inputs",
null,
@@ -415,14 +428,14 @@ public class InputsTest {
}
@Test
@LoadFlows({"flows/valids/inputs.yaml"})
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant18")
void fileInputWithNsfile() throws IOException, QueueException, TimeoutException {
HashMap<String, Object> inputs = new HashMap<>(InputsTest.inputs);
URI file = createNsFile(false);
URI file = createNsFile(false, "tenant18");
inputs.put("file", file);
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
"tenant18",
"io.kestra.tests",
"inputs",
null,
@@ -439,11 +452,11 @@ public class InputsTest {
return tempFile.toPath().toUri();
}
private URI createNsFile(boolean nsInAuthority) throws IOException {
private URI createNsFile(boolean nsInAuthority, String tenantId) throws IOException {
String namespace = "io.kestra.tests";
String filePath = "file.txt";
storageInterface.createDirectory(MAIN_TENANT, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace)));
storageInterface.put(MAIN_TENANT, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace) + "/" + filePath), new ByteArrayInputStream("Hello World".getBytes()));
storageInterface.createDirectory(tenantId, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace)));
storageInterface.put(tenantId, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace) + "/" + filePath), new ByteArrayInputStream("Hello World".getBytes()));
return URI.create("nsfile://" + (nsInAuthority ? namespace : "") + "/" + filePath);
}
}
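The reworked shouldNotLogSecretInput above no longer asserts on the last entry emitted to the shared log queue; it captures the entry for its own tenant through an AtomicReference and blocks on a CountDownLatch, so executions from other tenants running in parallel cannot race the assertion. A minimal, self-contained sketch of that capture pattern, using only JDK classes and a hypothetical Event record standing in for Kestra's LogEntry:

import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Consumer;

class TenantScopedLogCapture {

    // Hypothetical stand-in for a queued log entry; not Kestra's LogEntry.
    record Event(String tenantId, String message) {}

    public static void main(String[] args) throws InterruptedException {
        AtomicReference<Event> captured = new AtomicReference<>();
        CountDownLatch latch = new CountDownLatch(1);

        // Listener on a shared queue: keep only the entry emitted for our tenant.
        Consumer<Event> listener = event -> {
            if ("tenant16".equals(event.tenantId())) {
                captured.set(event);
                latch.countDown();
            }
        };

        // Other tenants may publish on the same queue concurrently; only ours is captured.
        new Thread(() -> listener.accept(new Event("other-tenant", "unrelated noise"))).start();
        new Thread(() -> listener.accept(new Event("tenant16", "These are my secrets: ****** - ******"))).start();

        // Block until the tenant-scoped entry arrives instead of asserting on the last emitted one.
        if (!latch.await(10, TimeUnit.SECONDS)) {
            throw new AssertionError("no log entry received for tenant16");
        }
        System.out.println(captured.get().message());
    }
}

The latch timeout plays the same role as the blocking wait in the test: fail fast if the expected entry never shows up.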


@@ -14,10 +14,12 @@ import java.io.IOException;
import java.net.URISyntaxException;
import java.util.Objects;
import java.util.concurrent.TimeoutException;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
@org.junit.jupiter.api.parallel.Execution(ExecutionMode.SAME_THREAD)
@KestraTest(startRunner = true)
class ListenersTest {


@@ -11,6 +11,7 @@ import jakarta.inject.Inject;
import jakarta.inject.Named;
import org.junit.jupiter.api.RepeatedTest;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.parallel.ExecutionMode;
import org.slf4j.Logger;
import org.slf4j.event.Level;
import reactor.core.publisher.Flux;
@@ -25,6 +26,7 @@ import java.util.concurrent.CopyOnWriteArrayList;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
@org.junit.jupiter.api.parallel.Execution(ExecutionMode.SAME_THREAD)
class RunContextLoggerTest {
@Inject
@Named(QueueFactoryInterface.WORKERTASKLOG_NAMED)


@@ -1,13 +1,24 @@
package io.kestra.core.runners;
import io.kestra.core.exceptions.IllegalVariableEvaluationException;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.DependsOn;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.Type;
import io.kestra.core.models.flows.input.BoolInput;
import io.kestra.core.models.property.Property;
import io.kestra.core.models.property.PropertyContext;
import io.kestra.core.models.tasks.Task;
import io.kestra.core.models.triggers.AbstractTrigger;
import io.kestra.core.runners.pebble.functions.SecretFunction;
import io.kestra.core.utils.IdUtils;
import io.micronaut.context.ApplicationContext;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import org.mockito.Mockito;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import static org.assertj.core.api.Assertions.assertThat;
@@ -112,4 +123,25 @@ class RunVariablesTest {
assertThat(kestra.get("environment")).isEqualTo("test");
assertThat(kestra.get("url")).isEqualTo("http://localhost:8080");
}
}
@Test
void nonResolvableDynamicInputsShouldBeSkipped() throws IllegalVariableEvaluationException {
Map<String, Object> variables = new RunVariables.DefaultBuilder()
.withFlow(Flow
.builder()
.namespace("a.b")
.id("c")
.inputs(List.of(
BoolInput.builder().id("a").type(Type.BOOL).defaults(Property.ofValue(true)).build(),
BoolInput.builder().id("b").type(Type.BOOL).dependsOn(new DependsOn(List.of("a"), null)).defaults(Property.ofExpression("{{inputs.a == true}}")).build()
))
.build()
)
.withExecution(Execution.builder().id(IdUtils.create()).build())
.build(new RunContextLogger(), PropertyContext.create(new VariableRenderer(Mockito.mock(ApplicationContext.class), Mockito.mock(VariableRenderer.VariableConfiguration.class), Collections.emptyList())));
Assertions.assertEquals(Map.of(
"a", true
), variables.get("inputs"));
}
}


@@ -44,10 +44,10 @@ public class TaskWithAllowFailureTest {
}
@Test
@LoadFlows({"flows/valids/task-allow-failure-executable-flow.yml",
"flows/valids/for-each-item-subflow-failed.yaml"})
@LoadFlows(value = {"flows/valids/task-allow-failure-executable-flow.yml",
"flows/valids/for-each-item-subflow-failed.yaml"}, tenantId = "tenant1")
void executableTask_Flow() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "task-allow-failure-executable-flow");
Execution execution = runnerUtils.runOne("tenant1", "io.kestra.tests", "task-allow-failure-executable-flow");
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.WARNING);
assertThat(execution.getTaskRunList()).hasSize(2);
}


@@ -50,7 +50,7 @@ class TestSuiteTest {
void withoutAnyTaskFixture() throws QueueException, TimeoutException {
var fixtures = List.<TaskFixture>of();
var executionResult = runReturnFlow(fixtures);
var executionResult = runReturnFlow(fixtures, MAIN_TENANT);
assertThat(executionResult.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertOutputForTask(executionResult, "task-id")
@@ -65,7 +65,7 @@ class TestSuiteTest {
}
@Test
@LoadFlows({"flows/valids/return.yaml"})
@LoadFlows(value = {"flows/valids/return.yaml"}, tenantId = "tenant1")
void taskFixture() throws TimeoutException, QueueException {
var fixtures = List.of(
TaskFixture.builder()
@@ -73,7 +73,7 @@ class TestSuiteTest {
.build()
);
var executionResult = runReturnFlow(fixtures);
var executionResult = runReturnFlow(fixtures, "tenant1");
assertThat(executionResult.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertOutputForTask(executionResult, "task-id")
@@ -85,7 +85,7 @@ class TestSuiteTest {
}
@Test
@LoadFlows({"flows/valids/return.yaml"})
@LoadFlows(value = {"flows/valids/return.yaml"}, tenantId = "tenant2")
void twoTaskFixturesOverridingOutput() throws QueueException, TimeoutException {
var fixtures = List.of(
TaskFixture.builder()
@@ -98,7 +98,7 @@ class TestSuiteTest {
.build()
);
var executionResult = runReturnFlow(fixtures);
var executionResult = runReturnFlow(fixtures, "tenant2");
assertThat(executionResult.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertOutputForTask(executionResult, "task-id")
@@ -110,7 +110,7 @@ class TestSuiteTest {
}
@Test
@LoadFlows({"flows/valids/return.yaml"})
@LoadFlows(value = {"flows/valids/return.yaml"}, tenantId = "tenant3")
void taskFixturesWithWarningState() throws QueueException, TimeoutException {
var fixtures = List.of(
TaskFixture.builder()
@@ -119,7 +119,7 @@ class TestSuiteTest {
.build()
);
var executionResult = runReturnFlow(fixtures);
var executionResult = runReturnFlow(fixtures, "tenant3");
assertThat(executionResult.getState().getCurrent()).isEqualTo(State.Type.WARNING);
assertTask(executionResult, "task-id")
@@ -133,8 +133,8 @@ class TestSuiteTest {
.isEqualTo(State.Type.WARNING);
}
private Execution runReturnFlow(List<TaskFixture> fixtures) throws TimeoutException, QueueException {
var flow = flowRepository.findById(MAIN_TENANT, "io.kestra.tests", "return", Optional.empty()).orElseThrow();
private Execution runReturnFlow(List<TaskFixture> fixtures, String tenantId) throws TimeoutException, QueueException {
var flow = flowRepository.findById(tenantId, "io.kestra.tests", "return", Optional.empty()).orElseThrow();
var execution = Execution.builder()
.id(IdUtils.create())


@@ -9,6 +9,8 @@ import io.micronaut.context.annotation.Property;
import jakarta.inject.Inject;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import org.slf4j.event.Level;
import java.time.Instant;
@@ -18,6 +20,7 @@ import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
@Property(name = "kestra.server-type", value = "WORKER")
@Execution(ExecutionMode.SAME_THREAD)
class ErrorLogsFunctionTest {
@Inject
private LogRepositoryInterface logRepository;


@@ -17,11 +17,14 @@ import java.io.IOException;
import java.net.URI;
import java.nio.file.Files;
import java.util.Map;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.*;
@Execution(ExecutionMode.SAME_THREAD)
@KestraTest(rebuildContext = true)
class FileExistsFunctionTest {
@@ -191,7 +194,7 @@ class FileExistsFunctionTest {
}
private URI createFile() throws IOException {
File tempFile = File.createTempFile("file", ".txt");
File tempFile = File.createTempFile("%s-file".formatted(IdUtils.create()), ".txt");
Files.write(tempFile.toPath(), "Hello World".getBytes());
return tempFile.toPath().toUri();
}


@@ -17,6 +17,8 @@ import java.io.IOException;
import java.net.URI;
import java.nio.file.Files;
import java.util.Map;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
@@ -24,6 +26,7 @@ import static org.hibernate.validator.internal.util.Contracts.assertTrue;
import static org.junit.jupiter.api.Assertions.assertThrows;
@KestraTest(rebuildContext = true)
@Execution(ExecutionMode.SAME_THREAD)
public class FileSizeFunctionTest {
private static final String NAMESPACE = "my.namespace";
@@ -275,14 +278,14 @@ public class FileSizeFunctionTest {
private URI createNsFile(boolean nsInAuthority) throws IOException {
String namespace = "io.kestra.tests";
String filePath = "file.txt";
String filePath = "%sfile.txt".formatted(IdUtils.create());
storageInterface.createDirectory(MAIN_TENANT, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace)));
storageInterface.put(MAIN_TENANT, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace) + "/" + filePath), new ByteArrayInputStream("Hello World".getBytes()));
return URI.create("nsfile://" + (nsInAuthority ? namespace : "") + "/" + filePath);
}
private URI createFile() throws IOException {
File tempFile = File.createTempFile("file", ".txt");
File tempFile = File.createTempFile("%sfile".formatted(IdUtils.create()), ".txt");
Files.write(tempFile.toPath(), "Hello World".getBytes());
return tempFile.toPath().toUri();
}


@@ -8,6 +8,7 @@ import io.kestra.core.exceptions.IllegalVariableEvaluationException;
import io.kestra.core.runners.VariableRenderer;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.serializers.JacksonMapper;
import io.pebbletemplates.pebble.error.PebbleException;
import jakarta.inject.Inject;
import org.apache.hc.client5.http.utils.Base64;
import org.junit.jupiter.api.Assertions;
@@ -16,12 +17,17 @@ import org.junit.jupiter.api.Test;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.Map;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static com.github.tomakehurst.wiremock.client.WireMock.*;
import static com.github.tomakehurst.wiremock.client.WireMock.aResponse;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.Assert.assertThrows;
@KestraTest
@WireMockTest(httpPort = 28182)
@Execution(ExecutionMode.SAME_THREAD)
class HttpFunctionTest {
@Inject
private VariableRenderer variableRenderer;
@@ -45,6 +51,13 @@ class HttpFunctionTest {
Assertions.assertTrue(rendered.contains("\"todo\":\"New todo\""));
}
@Test
void wrongMethod() {
var exception = assertThrows(IllegalVariableEvaluationException.class, () -> variableRenderer.render("{{ http(url) }}", Map.of("url", "https://dummyjson.com/todos/add")));
assertThat(exception.getCause()).isInstanceOf(PebbleException.class);
assertThat(exception.getCause().getMessage()).isEqualTo("Failed to execute HTTP Request, server respond with status 404 : Not Found ({{ http(url) }}:1)");
}
@Test
void getWithQueryHttpCall() throws IllegalVariableEvaluationException, JsonProcessingException {
String rendered = variableRenderer.render("""


@@ -17,12 +17,15 @@ import java.io.IOException;
import java.net.URI;
import java.nio.file.Files;
import java.util.Map;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.*;
@KestraTest(rebuildContext = true)
@Execution(ExecutionMode.SAME_THREAD)
class IsFileEmptyFunctionTest {
private static final String NAMESPACE = "my.namespace";


@@ -19,6 +19,8 @@ import java.io.IOException;
import java.net.URI;
import java.nio.file.Files;
import java.util.Map;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static io.kestra.core.runners.pebble.functions.FunctionTestUtils.NAMESPACE;
import static io.kestra.core.runners.pebble.functions.FunctionTestUtils.getVariables;
@@ -29,6 +31,7 @@ import static org.junit.jupiter.api.Assertions.assertThrows;
@KestraTest(rebuildContext = true)
@Property(name="kestra.server-type", value="WORKER")
@Execution(ExecutionMode.SAME_THREAD)
class ReadFileFunctionTest {
@Inject
VariableRenderer variableRenderer;


@@ -1,13 +1,14 @@
package io.kestra.core.serializers;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import org.apache.commons.lang3.tuple.Pair;
import org.junit.jupiter.api.Test;
import org.junitpioneer.jupiter.DefaultTimeZone;
import org.junitpioneer.jupiter.RetryingTest;
import java.io.IOException;
import java.time.Instant;
@@ -86,6 +87,36 @@ class JacksonMapperTest {
assertThat(deserialize.getZonedDateTime().toEpochSecond()).isEqualTo(original.getZonedDateTime().toEpochSecond());
assertThat(deserialize.getZonedDateTime().getOffset()).isEqualTo(original.getZonedDateTime().getOffset());
}
@Test
void shouldComputeDiffGivenCreatedObject() {
Pair<JsonNode, JsonNode> value = JacksonMapper.getBiDirectionalDiffs(null, new DummyObject("value"));
// patch
assertThat(value.getLeft().toString()).isEqualTo("[{\"op\":\"replace\",\"path\":\"\",\"value\":{\"value\":\"value\"}}]");
// Revert
assertThat(value.getRight().toString()).isEqualTo("[{\"op\":\"replace\",\"path\":\"\",\"value\":null}]");
}
@Test
void shouldComputeDiffGivenUpdatedObject() {
Pair<JsonNode, JsonNode> value = JacksonMapper.getBiDirectionalDiffs(new DummyObject("before"), new DummyObject("after"));
// patch
assertThat(value.getLeft().toString()).isEqualTo("[{\"op\":\"replace\",\"path\":\"/value\",\"value\":\"after\"}]");
// Revert
assertThat(value.getRight().toString()).isEqualTo("[{\"op\":\"replace\",\"path\":\"/value\",\"value\":\"before\"}]");
}
@Test
void shouldComputeDiffGivenDeletedObject() {
Pair<JsonNode, JsonNode> value = JacksonMapper.getBiDirectionalDiffs(new DummyObject("value"), null);
// Patch
assertThat(value.getLeft().toString()).isEqualTo("[{\"op\":\"replace\",\"path\":\"\",\"value\":null}]");
// Revert
assertThat(value.getRight().toString()).isEqualTo("[{\"op\":\"replace\",\"path\":\"\",\"value\":{\"value\":\"value\"}}]");
}
private record DummyObject(String value){}
@Getter
@NoArgsConstructor


@@ -1,6 +1,7 @@
package io.kestra.core.server;
import io.kestra.core.contexts.KestraContext;
import io.kestra.core.models.ServerType;
import io.kestra.core.utils.IdUtils;
import io.kestra.core.utils.Network;
import org.junit.jupiter.api.Assertions;
@@ -25,6 +26,7 @@ import java.util.Set;
import static io.kestra.core.server.ServiceStateTransition.Result.ABORTED;
import static io.kestra.core.server.ServiceStateTransition.Result.FAILED;
import static io.kestra.core.server.ServiceStateTransition.Result.SUCCEEDED;
import static org.mockito.Mockito.when;
@ExtendWith({MockitoExtension.class})
@MockitoSettings(strictness = Strictness.LENIENT)
@@ -59,6 +61,8 @@ public class ServiceLivenessManagerTest {
);
KestraContext context = Mockito.mock(KestraContext.class);
KestraContext.setContext(context);
when(context.getServerType()).thenReturn(ServerType.INDEXER);
this.serviceLivenessManager = new ServiceLivenessManager(
config,
new ServiceRegistry(),
@@ -100,8 +104,7 @@ public class ServiceLivenessManagerTest {
);
// mock the state transition result
Mockito
.when(serviceLivenessUpdater.update(Mockito.any(ServiceInstance.class), Mockito.any(Service.ServiceState.class)))
when(serviceLivenessUpdater.update(Mockito.any(ServiceInstance.class), Mockito.any(Service.ServiceState.class)))
.thenReturn(response);
// When
@@ -127,8 +130,7 @@ public class ServiceLivenessManagerTest {
);
// mock the state transition result
Mockito
.when(serviceLivenessUpdater.update(Mockito.any(ServiceInstance.class), Mockito.any(Service.ServiceState.class)))
when(serviceLivenessUpdater.update(Mockito.any(ServiceInstance.class), Mockito.any(Service.ServiceState.class)))
.thenReturn(response);
// When
@@ -147,8 +149,7 @@ public class ServiceLivenessManagerTest {
serviceLivenessManager.updateServiceInstance(running, serviceInstanceFor(running));
// mock the state transition result
Mockito
.when(serviceLivenessUpdater.update(Mockito.any(ServiceInstance.class), Mockito.any(Service.ServiceState.class)))
when(serviceLivenessUpdater.update(Mockito.any(ServiceInstance.class), Mockito.any(Service.ServiceState.class)))
.thenReturn(new ServiceStateTransition.Response(ABORTED));
// When


@@ -10,6 +10,10 @@ import io.kestra.core.runners.RunContextFactory;
import io.kestra.plugin.core.trigger.Schedule;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;
import io.kestra.plugin.core.execution.Labels;
import io.kestra.core.models.executions.Execution;
import static org.assertj.core.api.Assertions.assertThatThrownBy;
import java.util.Collections;
import java.util.List;
@@ -79,4 +83,24 @@ class LabelServiceTest {
assertTrue(LabelService.containsAll(List.of(new Label("key1", "value1")), List.of(new Label("key1", "value1"))));
assertTrue(LabelService.containsAll(List.of(new Label("key1", "value1"), new Label("key2", "value2")), List.of(new Label("key1", "value1"))));
}
@Test
void shouldThrowExceptionOnEmptyLabelValueInLabelsTask() throws Exception {
Labels task = Labels.builder()
.id("test")
.type(Labels.class.getName())
.labels(Map.of("invalidLabel", "")) // empty value
.build();
RunContext runContext = runContextFactory.of();
Execution execution = Execution.builder()
.id("execId")
.namespace("test.ns")
.build();
assertThatThrownBy(() -> task.update(execution, runContext))
.isInstanceOf(IllegalArgumentException.class)
.hasMessageContaining("Label values cannot be empty");
}
}


@@ -0,0 +1,79 @@
package io.kestra.core.services;
import com.google.common.collect.ImmutableMap;
import io.kestra.core.exceptions.FlowProcessingException;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.PluginDefault;
import io.kestra.core.services.PluginDefaultServiceTest.DefaultPrecedenceTester;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import lombok.extern.slf4j.Slf4j;
import org.junit.jupiter.api.parallel.ExecutionMode;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.Arguments;
import org.junit.jupiter.params.provider.MethodSource;
import java.util.Collections;
import java.util.List;
import java.util.stream.Stream;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.is;
@Slf4j
@KestraTest
class PluginDefaultServiceOverrideTest {
@Inject
private PluginDefaultService pluginDefaultService;
@org.junit.jupiter.api.parallel.Execution(ExecutionMode.SAME_THREAD)
@ParameterizedTest
@MethodSource
void flowDefaultsOverrideGlobalDefaults(boolean flowDefaultForced, boolean globalDefaultForced, String fooValue, String barValue, String bazValue) throws FlowProcessingException {
final DefaultPrecedenceTester task = DefaultPrecedenceTester.builder()
.id("test")
.type(DefaultPrecedenceTester.class.getName())
.propBaz("taskValue")
.build();
final PluginDefault flowDefault = new PluginDefault(DefaultPrecedenceTester.class.getName(), flowDefaultForced, ImmutableMap.of(
"propBar", "flowValue",
"propBaz", "flowValue"
));
final PluginDefault globalDefault = new PluginDefault(DefaultPrecedenceTester.class.getName(), globalDefaultForced, ImmutableMap.of(
"propFoo", "globalValue",
"propBar", "globalValue",
"propBaz", "globalValue"
));
var tenant = TestsUtils.randomTenant(PluginDefaultServiceOverrideTest.class.getSimpleName());
final Flow flowWithPluginDefault = Flow.builder()
.tenantId(tenant)
.tasks(Collections.singletonList(task))
.pluginDefaults(List.of(flowDefault))
.build();
final PluginGlobalDefaultConfiguration pluginGlobalDefaultConfiguration = new PluginGlobalDefaultConfiguration();
pluginGlobalDefaultConfiguration.defaults = List.of(globalDefault);
var previousGlobalDefault = pluginDefaultService.pluginGlobalDefault;
pluginDefaultService.pluginGlobalDefault = pluginGlobalDefaultConfiguration;
final Flow injected = pluginDefaultService.injectAllDefaults(flowWithPluginDefault, true);
pluginDefaultService.pluginGlobalDefault = previousGlobalDefault;
assertThat(((DefaultPrecedenceTester) injected.getTasks().getFirst()).getPropFoo(), is(fooValue));
assertThat(((DefaultPrecedenceTester) injected.getTasks().getFirst()).getPropBar(), is(barValue));
assertThat(((DefaultPrecedenceTester) injected.getTasks().getFirst()).getPropBaz(), is(bazValue));
}
private static Stream<Arguments> flowDefaultsOverrideGlobalDefaults() {
return Stream.of(
Arguments.of(false, false, "globalValue", "flowValue", "taskValue"),
Arguments.of(false, true, "globalValue", "globalValue", "globalValue"),
Arguments.of(true, false, "globalValue", "flowValue", "flowValue"),
Arguments.of(true, true, "globalValue", "flowValue", "flowValue")
);
}
}
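Read as a precedence rule, the argument matrix above says: a forced flow default wins first, then a forced global default, then the value set on the task itself, then a non-forced flow default, and finally a non-forced global default. A rough illustration of that ordering over plain maps (this is only a reading of what the parameterized test pins down, not Kestra's actual default-injection code):

import java.util.Map;
import java.util.Optional;

class DefaultPrecedenceSketch {
    // Resolve one property the way the parameterized test above expects it to be resolved.
    static String resolve(String property,
                          String taskValue,
                          Map<String, String> flowDefaults, boolean flowForced,
                          Map<String, String> globalDefaults, boolean globalForced) {
        if (flowForced && flowDefaults.containsKey(property)) return flowDefaults.get(property);
        if (globalForced && globalDefaults.containsKey(property)) return globalDefaults.get(property);
        if (taskValue != null) return taskValue;
        return Optional.ofNullable(flowDefaults.get(property)).orElse(globalDefaults.get(property));
    }

    public static void main(String[] args) {
        Map<String, String> flow = Map.of("propBar", "flowValue", "propBaz", "flowValue");
        Map<String, String> global = Map.of("propFoo", "globalValue", "propBar", "globalValue", "propBaz", "globalValue");

        // Matches the (false, false, "globalValue", "flowValue", "taskValue") row of the matrix.
        System.out.println(resolve("propFoo", null, flow, false, global, false));        // globalValue
        System.out.println(resolve("propBar", null, flow, false, global, false));        // flowValue
        System.out.println(resolve("propBaz", "taskValue", flow, false, global, false)); // taskValue
    }
}

Running it prints globalValue, flowValue, taskValue, matching the first row; the same resolution order also reproduces the three forced rows.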


@@ -1,12 +1,11 @@
package io.kestra.core.services;
import com.google.common.collect.ImmutableMap;
import com.fasterxml.jackson.core.JsonProcessingException;
import io.kestra.core.exceptions.FlowProcessingException;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.annotations.Plugin;
import io.kestra.core.models.conditions.ConditionContext;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.FlowInterface;
import io.kestra.core.models.flows.FlowWithSource;
import io.kestra.core.models.flows.GenericFlow;
@@ -19,6 +18,7 @@ import io.kestra.core.models.triggers.PollingTriggerInterface;
import io.kestra.core.models.triggers.TriggerContext;
import io.kestra.core.models.triggers.TriggerOutput;
import io.kestra.core.runners.RunContext;
import io.kestra.core.utils.TestsUtils;
import io.kestra.plugin.core.condition.Expression;
import io.kestra.plugin.core.log.Log;
import io.kestra.plugin.core.trigger.Schedule;
@@ -31,19 +31,13 @@ import lombok.ToString;
import lombok.experimental.SuperBuilder;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.Arguments;
import org.junit.jupiter.params.provider.MethodSource;
import org.slf4j.event.Level;
import java.time.Duration;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.stream.Stream;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.containsInAnyOrder;
import static org.hamcrest.Matchers.is;
@@ -71,7 +65,8 @@ class PluginDefaultServiceTest {
@Test
void shouldInjectGivenFlowWithNullSource() throws FlowProcessingException {
// Given
FlowInterface flow = GenericFlow.fromYaml(MAIN_TENANT, TEST_LOG_FLOW_SOURCE);
var tenant = TestsUtils.randomTenant(PluginDefaultServiceTest.class.getSimpleName());
FlowInterface flow = GenericFlow.fromYaml(tenant, TEST_LOG_FLOW_SOURCE);
// When
FlowWithSource result = pluginDefaultService.injectAllDefaults(flow, true);
@@ -131,55 +126,8 @@ class PluginDefaultServiceTest {
), result);
}
@ParameterizedTest
@MethodSource
void flowDefaultsOverrideGlobalDefaults(boolean flowDefaultForced, boolean globalDefaultForced, String fooValue, String barValue, String bazValue) throws FlowProcessingException {
final DefaultPrecedenceTester task = DefaultPrecedenceTester.builder()
.id("test")
.type(DefaultPrecedenceTester.class.getName())
.propBaz("taskValue")
.build();
final PluginDefault flowDefault = new PluginDefault(DefaultPrecedenceTester.class.getName(), flowDefaultForced, ImmutableMap.of(
"propBar", "flowValue",
"propBaz", "flowValue"
));
final PluginDefault globalDefault = new PluginDefault(DefaultPrecedenceTester.class.getName(), globalDefaultForced, ImmutableMap.of(
"propFoo", "globalValue",
"propBar", "globalValue",
"propBaz", "globalValue"
));
final Flow flowWithPluginDefault = Flow.builder()
.tasks(Collections.singletonList(task))
.pluginDefaults(List.of(flowDefault))
.build();
final PluginGlobalDefaultConfiguration pluginGlobalDefaultConfiguration = new PluginGlobalDefaultConfiguration();
pluginGlobalDefaultConfiguration.defaults = List.of(globalDefault);
var previousGlobalDefault = pluginDefaultService.pluginGlobalDefault;
pluginDefaultService.pluginGlobalDefault = pluginGlobalDefaultConfiguration;
final Flow injected = pluginDefaultService.injectAllDefaults(flowWithPluginDefault, true);
pluginDefaultService.pluginGlobalDefault = previousGlobalDefault;
assertThat(((DefaultPrecedenceTester) injected.getTasks().getFirst()).getPropFoo(), is(fooValue));
assertThat(((DefaultPrecedenceTester) injected.getTasks().getFirst()).getPropBar(), is(barValue));
assertThat(((DefaultPrecedenceTester) injected.getTasks().getFirst()).getPropBaz(), is(bazValue));
}
private static Stream<Arguments> flowDefaultsOverrideGlobalDefaults() {
return Stream.of(
Arguments.of(false, false, "globalValue", "flowValue", "taskValue"),
Arguments.of(false, true, "globalValue", "globalValue", "globalValue"),
Arguments.of(true, false, "globalValue", "flowValue", "flowValue"),
Arguments.of(true, true, "globalValue", "flowValue", "flowValue")
);
}
@Test
public void injectFlowAndGlobals() throws FlowProcessingException {
public void injectFlowAndGlobals() throws FlowProcessingException, JsonProcessingException {
String source = String.format("""
id: default-test
namespace: io.kestra.tests
@@ -215,8 +163,8 @@ class PluginDefaultServiceTest {
DefaultTriggerTester.class.getName(),
Expression.class.getName()
);
FlowWithSource injected = pluginDefaultService.parseFlowWithAllDefaults(null, source, false);
var tenant = TestsUtils.randomTenant(PluginDefaultServiceTest.class.getSimpleName());
FlowWithSource injected = pluginDefaultService.parseFlowWithAllDefaults(tenant, source, false);
assertThat(((DefaultTester) injected.getTasks().getFirst()).getValue(), is(1));
assertThat(((DefaultTester) injected.getTasks().getFirst()).getSet(), is(666));
@@ -261,7 +209,8 @@ class PluginDefaultServiceTest {
""";
// When
FlowWithSource injected = pluginDefaultService.parseFlowWithAllDefaults(null, source, false);
var tenant = TestsUtils.randomTenant(PluginDefaultServiceTest.class.getSimpleName());
FlowWithSource injected = pluginDefaultService.parseFlowWithAllDefaults(tenant, source, false);
// Then
assertThat(((DefaultTester) injected.getTasks().getFirst()).getSet(), is(2));
@@ -299,7 +248,8 @@ class PluginDefaultServiceTest {
""";
// When
FlowWithSource injected = pluginDefaultService.parseFlowWithAllDefaults(null, source, false);
var tenant = TestsUtils.randomTenant(PluginDefaultServiceTest.class.getSimpleName());
FlowWithSource injected = pluginDefaultService.parseFlowWithAllDefaults(tenant, source, false);
// Then
assertThat(((DefaultTester) injected.getTasks().getFirst()).getSet(), is(666));
@@ -309,7 +259,8 @@ class PluginDefaultServiceTest {
@Test
void shouldInjectFlowDefaultsGivenAlias() throws FlowProcessingException {
// Given
GenericFlow flow = GenericFlow.fromYaml(MAIN_TENANT, """
var tenant = TestsUtils.randomTenant(PluginDefaultServiceTest.class.getSimpleName());
GenericFlow flow = GenericFlow.fromYaml(tenant, """
id: default-test
namespace: io.kestra.tests
@@ -333,7 +284,8 @@ class PluginDefaultServiceTest {
@Test
void shouldInjectFlowDefaultsGivenType() throws FlowProcessingException {
GenericFlow flow = GenericFlow.fromYaml(MAIN_TENANT, """
var tenant = TestsUtils.randomTenant(PluginDefaultServiceTest.class.getSimpleName());
GenericFlow flow = GenericFlow.fromYaml(tenant, """
id: default-test
namespace: io.kestra.tests
@@ -356,7 +308,8 @@ class PluginDefaultServiceTest {
@Test
public void shouldNotInjectDefaultsGivenExistingTaskValue() throws FlowProcessingException {
// Given
GenericFlow flow = GenericFlow.fromYaml(MAIN_TENANT, """
var tenant = TestsUtils.randomTenant(PluginDefaultServiceTest.class.getSimpleName());
GenericFlow flow = GenericFlow.fromYaml(tenant, """
id: default-test
namespace: io.kestra.tests


@@ -5,6 +5,7 @@ import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.executions.TaskRun;
import io.kestra.core.models.flows.State;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
import static org.assertj.core.api.Assertions.assertThat;
@@ -32,6 +33,7 @@ class SanityCheckTest {
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
}
@Disabled
@Test
@ExecuteFlow("sanity-checks/kv.yaml")
void qaKv(Execution execution) {
@@ -111,4 +113,11 @@ class SanityCheckTest {
assertThat(execution.getTaskRunList()).hasSize(6);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
}
@Test
@ExecuteFlow("sanity-checks/output_values.yaml")
void qaOutputValues(Execution execution) {
assertThat(execution.getTaskRunList()).hasSize(2);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
}
}


@@ -204,6 +204,10 @@ class FlowTopologyServiceTest {
io.kestra.plugin.core.trigger.Flow.UpstreamFlow.builder().namespace("io.kestra.ee").flowId("parent").build(),
io.kestra.plugin.core.trigger.Flow.UpstreamFlow.builder().namespace("io.kestra.others").flowId("invalid").build()
))
// add an always-true condition to validate that 'flows' and 'where' are combined with an AND
.where(List.of(io.kestra.plugin.core.trigger.Flow.ExecutionFilter.builder()
.filters(List.of(io.kestra.plugin.core.trigger.Flow.Filter.builder().field(io.kestra.plugin.core.trigger.Flow.Field.EXPRESSION).type(io.kestra.plugin.core.trigger.Flow.Type.IS_NOT_NULL).value("something").build()))
.build()))
.build()
)
.build()
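The always-true IS_NOT_NULL filter added above is only meaningful because the upstream-flow list and the where filters must both match, i.e. they are combined with AND. A small sketch of that composition with java.util.function predicates (illustrative names only, not the trigger's real evaluation code):

import java.util.List;
import java.util.function.Predicate;

class AndPreconditionSketch {
    record ExecutionLike(String namespace, String flowId, boolean expressionNotNull) {}

    public static void main(String[] args) {
        // 'flows': the execution must come from one of the declared upstream flows.
        Predicate<ExecutionLike> matchesUpstreamFlows = e ->
            List.of("parent", "invalid").contains(e.flowId());

        // 'where': every filter must also hold (here, the always-true IS_NOT_NULL check).
        Predicate<ExecutionLike> matchesWhere = ExecutionLike::expressionNotNull;

        // The dependency only exists when BOTH sides match: flows AND where.
        Predicate<ExecutionLike> precondition = matchesUpstreamFlows.and(matchesWhere);

        System.out.println(precondition.test(new ExecutionLike("io.kestra.ee", "parent", true))); // true
        System.out.println(precondition.test(new ExecutionLike("io.kestra.ee", "other", true)));  // false
    }
}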


@@ -0,0 +1,304 @@
package io.kestra.core.topologies;
import io.kestra.core.exceptions.FlowProcessingException;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.flows.FlowWithSource;
import io.kestra.core.models.topologies.FlowTopology;
import io.kestra.core.repositories.FlowRepositoryInterface;
import io.kestra.core.repositories.FlowTopologyRepositoryInterface;
import io.kestra.core.services.FlowService;
import io.kestra.core.utils.IdUtils;
import jakarta.inject.Inject;
import org.jetbrains.annotations.NotNull;
import org.junit.jupiter.api.Test;
import java.util.List;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
public class FlowTopologyTest {
@Inject
private FlowService flowService;
@Inject
private FlowTopologyService flowTopologyService;
@Inject
private FlowTopologyRepositoryInterface flowTopologyRepository;
@Test
void should_findDependencies_simpleCase() throws FlowProcessingException {
// Given
var tenantId = randomTenantId();
var child = flowService.importFlow(tenantId,
"""
id: child
namespace: io.kestra.unittest
tasks:
- id: download
type: io.kestra.plugin.core.http.Download
""");
var parent = flowService.importFlow(tenantId, """
id: parent
namespace: io.kestra.unittest
tasks:
- id: subflow
type: io.kestra.core.tasks.flows.Flow
flowId: child
namespace: io.kestra.unittest
""");
var unrelatedFlow = flowService.importFlow(tenantId, """
id: unrelated_flow
namespace: io.kestra.unittest
tasks:
- id: download
type: io.kestra.plugin.core.http.Download
""");
// When
computeAndSaveTopologies(List.of(child, parent, unrelatedFlow));
System.out.println();
flowTopologyRepository.findAll(tenantId).forEach(topology -> {
System.out.println(FlowTopologyTestData.of(topology));
});
var dependencies = flowService.findDependencies(tenantId, "io.kestra.unittest", parent.getId(), false, true);
flowTopologyRepository.findAll(tenantId).forEach(topology -> {
System.out.println(FlowTopologyTestData.of(topology));
});
// Then
assertThat(dependencies.map(FlowTopologyTestData::of))
.containsExactlyInAnyOrder(
new FlowTopologyTestData(parent, child)
);
}
@Test
void should_findDependencies_subchildAndSuperParent() throws FlowProcessingException {
// Given
var tenantId = randomTenantId();
var subChild = flowService.importFlow(tenantId,
"""
id: sub_child
namespace: io.kestra.unittest
tasks:
- id: download
type: io.kestra.plugin.core.http.Download
""");
var child = flowService.importFlow(tenantId,
"""
id: child
namespace: io.kestra.unittest
tasks:
- id: subflow
type: io.kestra.core.tasks.flows.Flow
flowId: sub_child
namespace: io.kestra.unittest
""");
var superParent = flowService.importFlow(tenantId, """
id: super_parent
namespace: io.kestra.unittest
tasks:
- id: subflow
type: io.kestra.core.tasks.flows.Flow
flowId: parent
namespace: io.kestra.unittest
""");
var parent = flowService.importFlow(tenantId, """
id: parent
namespace: io.kestra.unittest
tasks:
- id: subflow
type: io.kestra.core.tasks.flows.Flow
flowId: child
namespace: io.kestra.unittest
""");
var unrelatedFlow = flowService.importFlow(tenantId, """
id: unrelated_flow
namespace: io.kestra.unittest
tasks:
- id: download
type: io.kestra.plugin.core.http.Download
""");
// When
computeAndSaveTopologies(List.of(subChild, child, superParent, parent, unrelatedFlow));
System.out.println();
flowTopologyRepository.findAll(tenantId).forEach(topology -> {
System.out.println(FlowTopologyTestData.of(topology));
});
System.out.println();
var dependencies = flowService.findDependencies(tenantId, "io.kestra.unittest", parent.getId(), false, true);
flowTopologyRepository.findAll(tenantId).forEach(topology -> {
System.out.println(FlowTopologyTestData.of(topology));
});
// Then
assertThat(dependencies.map(FlowTopologyTestData::of))
.containsExactlyInAnyOrder(
new FlowTopologyTestData(superParent, parent),
new FlowTopologyTestData(parent, child),
new FlowTopologyTestData(child, subChild)
);
}
@Test
void should_findDependencies_cyclicTriggers() throws FlowProcessingException {
// Given
var tenantId = randomTenantId();
var triggeredFlowOne = flowService.importFlow(tenantId,
"""
id: triggered_flow_one
namespace: io.kestra.unittest
tasks:
- id: download
type: io.kestra.plugin.core.http.Download
triggers:
- id: listen
type: io.kestra.plugin.core.trigger.Flow
conditions:
- type: io.kestra.plugin.core.condition.ExecutionStatus
in:
- FAILED
""");
var triggeredFlowTwo = flowService.importFlow(tenantId, """
id: triggered_flow_two
namespace: io.kestra.unittest
tasks:
- id: download
type: io.kestra.plugin.core.http.Download
triggers:
- id: listen
type: io.kestra.plugin.core.trigger.Flow
conditions:
- type: io.kestra.plugin.core.condition.ExecutionStatus
in:
- FAILED
""");
// When
computeAndSaveTopologies(List.of(triggeredFlowOne, triggeredFlowTwo));
flowTopologyRepository.findAll(tenantId).forEach(topology -> {
System.out.println(FlowTopologyTestData.of(topology));
});
var dependencies = flowService.findDependencies(tenantId, "io.kestra.unittest", triggeredFlowTwo.getId(), false, true).toList();
flowTopologyRepository.findAll(tenantId).forEach(topology -> {
System.out.println(FlowTopologyTestData.of(topology));
});
// Then
assertThat(dependencies.stream().map(FlowTopologyTestData::of))
.containsExactlyInAnyOrder(
new FlowTopologyTestData(triggeredFlowTwo, triggeredFlowOne),
new FlowTopologyTestData(triggeredFlowOne, triggeredFlowTwo)
);
}
@Test
void flowTriggerWithTargetFlow() throws FlowProcessingException {
// Given
var tenantId = randomTenantId();
var parent = flowService.importFlow(tenantId,
"""
id: parent
namespace: io.kestra.unittest
inputs:
- id: a
type: BOOL
defaults: true
- id: b
type: BOOL
defaults: "{{ inputs.a == true }}"
dependsOn:
inputs:
- a
tasks:
- id: helloA
type: io.kestra.plugin.core.log.Log
message: Hello A
""");
var child = flowService.importFlow(tenantId, """
id: child
namespace: io.kestra.unittest
tasks:
- id: helloB
type: io.kestra.plugin.core.log.Log
message: Hello B
triggers:
- id: release
type: io.kestra.plugin.core.trigger.Flow
states:
- SUCCESS
preconditions:
id: flows
flows:
- namespace: io.kestra.unittest
flowId: parent
""");
var unrelatedFlow = flowService.importFlow(tenantId, """
id: unrelated_flow
namespace: io.kestra.unittest
tasks:
- id: download
type: io.kestra.plugin.core.http.Download
""");
// When
computeAndSaveTopologies(List.of(child, parent, unrelatedFlow));
System.out.println();
flowTopologyRepository.findAll(tenantId).forEach(topology -> {
System.out.println(FlowTopologyTestData.of(topology));
});
var dependencies = flowService.findDependencies(tenantId, "io.kestra.unittest", parent.getId(), false, true);
flowTopologyRepository.findAll(tenantId).forEach(topology -> {
System.out.println(FlowTopologyTestData.of(topology));
});
// Then
assertThat(dependencies.map(FlowTopologyTestData::of))
.containsExactlyInAnyOrder(
new FlowTopologyTestData(parent, child)
);
}
/**
* This function mimics the production behaviour: each flow's topology is computed against the full list of flows and then saved.
*/
private void computeAndSaveTopologies(List<@NotNull FlowWithSource> flows) {
flows.forEach(flow ->
flowTopologyService
.topology(
flow,
flows
).distinct()
.forEach(topology -> flowTopologyRepository.save(topology))
);
}
private static String randomTenantId() {
return FlowTopologyTest.class + IdUtils.create();
}
record FlowTopologyTestData(String sourceUid, String destinationUid) {
public FlowTopologyTestData(FlowWithSource parent, FlowWithSource child) {
this(parent.uidWithoutRevision(), child.uidWithoutRevision());
}
public static FlowTopologyTestData of(FlowTopology flowTopology) {
return new FlowTopologyTestData(flowTopology.getSource().getUid(), flowTopology.getDestination().getUid());
}
@Override
public String toString() {
return sourceUid + " -> " + destinationUid;
}
}
}


@@ -9,7 +9,6 @@ import java.util.List;
import java.util.Map;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertThrows;
class MapUtilsTest {
@SuppressWarnings("unchecked")
@@ -208,10 +207,13 @@ class MapUtilsTest {
}
@Test
void shouldThrowIfNestedMapContainsMultipleEntries() {
var exception = assertThrows(IllegalArgumentException.class,
() -> MapUtils.nestedToFlattenMap(Map.of("k1", Map.of("k2", Map.of("k3", "v1"), "k4", "v2")))
);
assertThat(exception.getMessage()).isEqualTo("You cannot flatten a map with an entry that is a map of more than one element, conflicting key: k1");
void shouldFlattenANestedMapWithDuplicateKeys() {
Map<String, Object> results = MapUtils.nestedToFlattenMap(Map.of("k1", Map.of("k2", Map.of("k3", "v1"), "k4", "v2")));
assertThat(results).hasSize(2);
assertThat(results).containsAllEntriesOf(Map.of(
"k1.k2.k3", "v1",
"k1.k4", "v2"
));
}
}
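For reference, the flattening the new assertion expects joins nested keys with dots and no longer rejects a parent key that holds several children. A generic recursive sketch of that behaviour over plain maps (not Kestra's MapUtils implementation):

import java.util.LinkedHashMap;
import java.util.Map;

class FlattenSketch {
    // Flattens {"k1": {"k2": {"k3": "v1"}, "k4": "v2"}} into {"k1.k2.k3": "v1", "k1.k4": "v2"}.
    @SuppressWarnings("unchecked")
    static Map<String, Object> flatten(String prefix, Map<String, Object> nested, Map<String, Object> out) {
        nested.forEach((key, value) -> {
            String path = prefix.isEmpty() ? key : prefix + "." + key;
            if (value instanceof Map<?, ?> child) {
                flatten(path, (Map<String, Object>) child, out);
            } else {
                out.put(path, value);
            }
        });
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> nested = Map.of("k1", Map.of("k2", Map.of("k3", "v1"), "k4", "v2"));
        System.out.println(flatten("", nested, new LinkedHashMap<>()));
        // {k1.k2.k3=v1, k1.k4=v2} (iteration order of Map.of is unspecified)
    }
}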


@@ -17,6 +17,8 @@ import jakarta.inject.Named;
import java.nio.file.Path;
import java.util.stream.Collectors;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import reactor.core.publisher.Flux;
import java.io.ByteArrayInputStream;
@@ -31,6 +33,7 @@ import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
@Execution(ExecutionMode.SAME_THREAD)
class NamespaceFilesUtilsTest {
@Inject
RunContextFactory runContextFactory;

View File

@@ -6,7 +6,13 @@ import org.junit.jupiter.api.Test;
import java.util.List;
class VersionTest {
@Test
void shouldCreateVersionFromIntegerGivenMajorVersion() {
Version version = Version.of(1);
Assertions.assertEquals(1, version.majorVersion());
}
@Test
void shouldCreateVersionFromStringGivenMajorVersion() {
Version version = Version.of("1");
@@ -135,14 +141,24 @@ class VersionTest {
}
@Test
public void shouldGetStableVersionGivenMajorMinorVersions() {
Version result = Version.getStable(Version.of("1.2.0"), List.of(Version.of("1.2.1"), Version.of("1.2.2"), Version.of("0.99.0")));
Assertions.assertEquals(Version.of("1.2.2"), result);
public void shouldGetStableVersionGivenMajorMinorPatchVersion() {
Assertions.assertEquals(Version.of("1.2.1"), Version.getStable(Version.of("1.2.1"), List.of(Version.of("1.2.1"), Version.of("1.2.3"), Version.of("0.99.0"))));
Assertions.assertEquals(Version.of("1.2.3"), Version.getStable(Version.of("1.2.0"), List.of(Version.of("1.2.1"), Version.of("1.2.3"), Version.of("0.99.0"))));
}
@Test
public void shouldGetStableVersionGivenMajorMinorVersion() {
Assertions.assertEquals(Version.of("1.2.3"), Version.getStable(Version.of("1.2"), List.of(Version.of("1.2.1"), Version.of("1.2.3"), Version.of("0.99.0"))));
}
@Test
public void shouldGetStableVersionGivenMajorVersion() {
Assertions.assertEquals(Version.of("1.2.3"), Version.getStable(Version.of("1"), List.of(Version.of("1.2.1"), Version.of("1.2.3"), Version.of("0.99.0"))));
}
@Test
public void shouldGetNullForStableVersionGivenNoCompatibleVersions() {
Version result = Version.getStable(Version.of("1.2.0"), List.of(Version.of("1.3.0"), Version.of("2.0.0"), Version.of("0.99.0")));
Assertions.assertNull(result);
Assertions.assertNull(Version.getStable(Version.of("3.0"), List.of(Version.of("1.3.0"), Version.of("2.0.0"), Version.of("0.99.0"))));
Assertions.assertNull(Version.getStable(Version.of("0.1"), List.of(Version.of("1.3.0"), Version.of("2.0.0"), Version.of("0.99.0"))));
}
}
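Taken together, the new assertions above suggest Version.getStable's resolution rule: return the requested version itself when it is among the candidates, otherwise the newest candidate with the same major (and minor, when specified), or null when nothing compatible exists. A minimal sketch restating this, with the Version import path assumed since it is not shown in this diff:

import io.kestra.core.utils.Version; // assumed package; adjust to the actual location
import java.util.List;

class StableVersionSketch {
    public static void main(String[] args) {
        List<Version> candidates = List.of(Version.of("1.2.1"), Version.of("1.2.3"), Version.of("0.99.0"));
        // equals Version.of("1.2.3") per the assertions above: newest 1.2.x candidate
        System.out.println(Version.getStable(Version.of("1.2"), candidates));
        // equals Version.of("1.2.3"): newest 1.x candidate
        System.out.println(Version.getStable(Version.of("1"), candidates));
        // null: no candidate is compatible with major version 3
        System.out.println(Version.getStable(Version.of("3.0"), candidates));
    }
}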

View File

@@ -1,6 +1,6 @@
package io.kestra.plugin.core.execution;
import com.google.common.collect.ImmutableMap;
import io.kestra.core.context.TestRunContextFactory;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.executions.statistics.Flow;
import io.kestra.core.models.flows.State;
@@ -8,7 +8,6 @@ import io.kestra.core.models.property.Property;
import io.kestra.core.repositories.AbstractExecutionRepositoryTest;
import io.kestra.core.repositories.ExecutionRepositoryInterface;
import io.kestra.core.runners.RunContext;
import io.kestra.core.runners.RunContextFactory;
import io.kestra.core.utils.IdUtils;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
@@ -20,8 +19,10 @@ import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
class CountTest {
public static final String NAMESPACE = "io.kestra.unittest";
@Inject
RunContextFactory runContextFactory;
TestRunContextFactory runContextFactory;
@Inject
ExecutionRepositoryInterface executionRepository;
@@ -29,8 +30,10 @@ class CountTest {
@Test
void run() throws Exception {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
for (int i = 0; i < 28; i++) {
executionRepository.save(AbstractExecutionRepositoryTest.builder(
tenant,
i < 5 ? State.Type.RUNNING : (i < 8 ? State.Type.FAILED : State.Type.SUCCESS),
i < 4 ? "first" : (i < 10 ? "second" : "third")
).build());
@@ -49,7 +52,8 @@ class CountTest {
.endDate(new Property<>("{{ now() }}"))
.build();
RunContext runContext = TestsUtils.mockRunContext(runContextFactory, task, ImmutableMap.of("namespace", "io.kestra.unittest"));
RunContext runContext = runContextFactory.of("id", NAMESPACE, tenant);
Count.Output run = task.run(runContext);
assertThat(run.getResults().size()).isEqualTo(2);

View File

@@ -1,38 +1,15 @@
package io.kestra.plugin.core.execution;
import static org.assertj.core.api.Assertions.assertThat;
import io.kestra.core.junit.annotations.ExecuteFlow;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.junit.annotations.LoadFlows;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.State;
import io.kestra.core.queues.QueueException;
import io.kestra.core.queues.QueueFactoryInterface;
import io.kestra.core.queues.QueueInterface;
import io.kestra.core.repositories.FlowRepositoryInterface;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import org.junit.jupiter.api.Test;
import reactor.core.publisher.Flux;
import java.util.Optional;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertTrue;
@KestraTest(startRunner = true)
class ExitTest {
@Inject
@Named(QueueFactoryInterface.EXECUTION_NAMED)
private QueueInterface<Execution> executionQueue;
@Inject
private FlowRepositoryInterface flowRepository;
@Test
@ExecuteFlow("flows/valids/exit.yaml")
@@ -43,29 +20,12 @@ class ExitTest {
}
@Test
@LoadFlows("flows/valids/exit-killed.yaml")
void shouldExitAndKillTheExecution() throws QueueException, InterruptedException {
CountDownLatch countDownLatch = new CountDownLatch(2);// We need to wait for 3 execution modifications to be sure all tasks are passed to KILLED
AtomicReference<Execution> killedExecution = new AtomicReference<>();
Flux<Execution> receive = TestsUtils.receive(executionQueue, either -> {
Execution execution = either.getLeft();
if (execution.getFlowId().equals("exit-killed") && execution.getState().getCurrent().isKilled()) {
killedExecution.set(execution);
countDownLatch.countDown();
}
});
@ExecuteFlow("flows/valids/exit-killed.yaml")
void shouldExitAndKillTheExecution(Execution execution) {
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.KILLED);
assertThat(execution.getTaskRunList().size()).isEqualTo(2);
assertThat(execution.getTaskRunList().getFirst().getState().getCurrent()).isEqualTo(State.Type.KILLED);
assertThat(execution.getTaskRunList().get(1).getState().getCurrent()).isEqualTo(State.Type.KILLED);
// we cannot use the runnerUtils as it may not see the RUNNING state before the execution is killed
Flow flow = flowRepository.findById(MAIN_TENANT, "io.kestra.tests", "exit-killed", Optional.empty()).orElseThrow();
Execution execution = Execution.newExecution(flow, null, null, Optional.empty());
executionQueue.emit(execution);
assertTrue(countDownLatch.await(1, TimeUnit.MINUTES));
assertThat(killedExecution.get()).isNotNull();
assertThat(killedExecution.get().getState().getCurrent()).isEqualTo(State.Type.KILLED);
assertThat(killedExecution.get().getTaskRunList().size()).isEqualTo(2);
assertThat(killedExecution.get().getTaskRunList().getFirst().getState().getCurrent()).isEqualTo(State.Type.KILLED);
assertThat(killedExecution.get().getTaskRunList().get(1).getState().getCurrent()).isEqualTo(State.Type.KILLED);
receive.blockLast();
}
}

View File

@@ -33,9 +33,9 @@ public class FailTest {
}
@Test
@LoadFlows({"flows/valids/fail-on-condition.yaml"})
@LoadFlows(value = {"flows/valids/fail-on-condition.yaml"}, tenantId = "fail")
void failOnCondition() throws TimeoutException, QueueException{
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "fail-on-condition", null,
Execution execution = runnerUtils.runOne("fail", "io.kestra.tests", "fail-on-condition", null,
(f, e) -> Map.of("param", "fail") , Duration.ofSeconds(20));
assertThat(execution.getTaskRunList()).hasSize(2);
@@ -44,9 +44,9 @@ public class FailTest {
}
@Test
@LoadFlows({"flows/valids/fail-on-condition.yaml"})
@LoadFlows(value = {"flows/valids/fail-on-condition.yaml"}, tenantId = "success")
void dontFailOnCondition() throws TimeoutException, QueueException{
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "fail-on-condition", null,
Execution execution = runnerUtils.runOne("success", "io.kestra.tests", "fail-on-condition", null,
(f, e) -> Map.of("param", "success") , Duration.ofSeconds(20));
assertThat(execution.getTaskRunList()).hasSize(3);
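The recurring change in this and the following test files is the same parallelisation pattern: flows are loaded into a per-test tenant via @LoadFlows(value = ..., tenantId = ...) and the run is issued against that tenant instead of MAIN_TENANT, so tests sharing a flow no longer collide when executed concurrently. A minimal sketch of the pattern, with a hypothetical flow file, tenant id, and inputs, and with the RunnerUtils import path assumed:

import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.junit.annotations.LoadFlows;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.State;
import io.kestra.core.queues.QueueException;
import io.kestra.core.runners.RunnerUtils; // assumed package; not shown in this diff
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;

import java.time.Duration;
import java.util.Map;
import java.util.concurrent.TimeoutException;

import static org.assertj.core.api.Assertions.assertThat;

@KestraTest(startRunner = true)
class TenantIsolationSketchTest {
    @Inject
    private RunnerUtils runnerUtils;

    @Test
    // Hypothetical flow and tenant id, shown only to illustrate the isolation pattern.
    @LoadFlows(value = {"flows/valids/some-flow.yaml"}, tenantId = "some-flow-tenant")
    void runsInItsOwnTenant() throws TimeoutException, QueueException {
        Execution execution = runnerUtils.runOne("some-flow-tenant", "io.kestra.tests", "some-flow", null,
            (f, e) -> Map.of("param", "success"), Duration.ofSeconds(20));
        assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
    }
}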

View File

@@ -35,10 +35,10 @@ class AllowFailureTest {
}
@Test
@LoadFlows({"flows/valids/allow-failure.yaml"})
@LoadFlows(value = {"flows/valids/allow-failure.yaml"}, tenantId = "fail")
void failed() throws TimeoutException, QueueException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
"fail",
"io.kestra.tests",
"allow-failure",
null,

View File

@@ -22,6 +22,7 @@ import static org.assertj.core.api.Assertions.assertThat;
class FinallyTest {
public static final String NAMESPACE = "io.kestra.tests";
private static final String TENANT_ID = "tenant1";
@Inject
protected RunnerUtils runnerUtils;
@@ -46,10 +47,10 @@ class FinallyTest {
}
@Test
@LoadFlows({"flows/valids/finally-sequential.yaml"})
@LoadFlows(value = {"flows/valids/finally-sequential.yaml"}, tenantId = TENANT_ID)
void sequentialWithErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
TENANT_ID,
NAMESPACE, "finally-sequential", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", true)),
Duration.ofSeconds(60)
@@ -92,10 +93,10 @@ class FinallyTest {
}
@Test
@LoadFlows({"flows/valids/finally-sequential-error.yaml"})
@LoadFlows(value = {"flows/valids/finally-sequential-error.yaml"}, tenantId = TENANT_ID)
void sequentialErrorBlockWithErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
TENANT_ID,
NAMESPACE, "finally-sequential-error", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", true)),
Duration.ofSeconds(60)
@@ -128,10 +129,10 @@ class FinallyTest {
}
@Test
@LoadFlows({"flows/valids/finally-allowfailure.yaml"})
@LoadFlows(value = {"flows/valids/finally-allowfailure.yaml"}, tenantId = TENANT_ID)
void allowFailureWithErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
TENANT_ID,
NAMESPACE, "finally-allowfailure", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", true)),
Duration.ofSeconds(60)
@@ -164,10 +165,10 @@ class FinallyTest {
}
@Test
@LoadFlows({"flows/valids/finally-parallel.yaml"})
@LoadFlows(value = {"flows/valids/finally-parallel.yaml"}, tenantId = TENANT_ID)
void parallelWithErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
TENANT_ID,
NAMESPACE, "finally-parallel", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", true)),
Duration.ofSeconds(60)
@@ -183,10 +184,10 @@ class FinallyTest {
}
@Test
@LoadFlows({"flows/valids/finally-foreach.yaml"})
@LoadFlows(value = {"flows/valids/finally-foreach.yaml"}, tenantId = TENANT_ID)
void forEachWithoutErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
TENANT_ID,
NAMESPACE, "finally-foreach", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", false)),
Duration.ofSeconds(60)
@@ -236,10 +237,10 @@ class FinallyTest {
}
@Test
@LoadFlows({"flows/valids/finally-eachparallel.yaml"})
@LoadFlows(value = {"flows/valids/finally-eachparallel.yaml"}, tenantId = TENANT_ID)
void eachParallelWithErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
TENANT_ID,
NAMESPACE, "finally-eachparallel", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", true)),
Duration.ofSeconds(60)
@@ -255,10 +256,10 @@ class FinallyTest {
}
@Test
@LoadFlows({"flows/valids/finally-dag.yaml"})
@LoadFlows(value = {"flows/valids/finally-dag.yaml"}, tenantId = TENANT_ID)
void dagWithoutErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
TENANT_ID,
NAMESPACE, "finally-dag", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", false)),
Duration.ofSeconds(60)
@@ -308,10 +309,10 @@ class FinallyTest {
}
@Test
@LoadFlows({"flows/valids/finally-flow.yaml"})
@LoadFlows(value = {"flows/valids/finally-flow.yaml"}, tenantId = TENANT_ID)
void flowWithErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
TENANT_ID,
NAMESPACE, "finally-flow", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", true)),
Duration.ofSeconds(60)
@@ -342,10 +343,10 @@ class FinallyTest {
}
@Test
@LoadFlows({"flows/valids/finally-flow-error.yaml"})
@LoadFlows(value = {"flows/valids/finally-flow-error.yaml"}, tenantId = TENANT_ID)
void flowErrorBlockWithErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
TENANT_ID,
NAMESPACE, "finally-flow-error", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", true)),
Duration.ofSeconds(60)

View File

@@ -98,7 +98,7 @@ public class FlowCaseTest {
testInherited ? "task-flow" : "task-flow-inherited-labels",
null,
(f, e) -> ImmutableMap.of("string", input),
Duration.ofMinutes(1),
Duration.ofSeconds(15),
testInherited ? List.of(new Label("mainFlowExecutionLabel", "execFoo")) : List.of()
);

View File

@@ -5,7 +5,9 @@ import io.kestra.core.junit.annotations.LoadFlows;
import org.junit.jupiter.api.Test;
import jakarta.inject.Inject;
import org.junit.jupiter.api.TestInstance;
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
@KestraTest(startRunner = true)
class FlowTest {
@Inject
@@ -20,7 +22,7 @@ class FlowTest {
}
@Test
@LoadFlows({"flows/valids/task-flow.yaml",
@LoadFlows(value = {"flows/valids/task-flow.yaml",
"flows/valids/task-flow-inherited-labels.yaml",
"flows/valids/switch.yaml"})
void waitFailed() throws Exception {

View File

@@ -348,7 +348,7 @@ public class ForEachItemCaseTest {
Duration.ofSeconds(30));
// we should have triggered 26 subflows
assertThat(countDownLatch.await(1, TimeUnit.MINUTES)).isTrue();
assertTrue(countDownLatch.await(20, TimeUnit.SECONDS), "Remaining countdown: %s".formatted(countDownLatch.getCount()));
receive.blockLast();
// assert on the main flow execution

View File

@@ -20,27 +20,29 @@ import static org.assertj.core.api.Assertions.assertThat;
@KestraTest(startRunner = true)
class IfTest {
private static final String TENANT_ID = "true";
@Inject
private RunnerUtils runnerUtils;
@Test
@LoadFlows({"flows/valids/if-condition.yaml"})
@LoadFlows(value = {"flows/valids/if-condition.yaml"}, tenantId = TENANT_ID)
void ifTruthy() throws TimeoutException, QueueException {
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "if-condition", null,
Execution execution = runnerUtils.runOne(TENANT_ID, "io.kestra.tests", "if-condition", null,
(f, e) -> Map.of("param", true) , Duration.ofSeconds(120));
assertThat(execution.getTaskRunList()).hasSize(2);
assertThat(execution.findTaskRunsByTaskId("when-true").getFirst().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "if-condition", null,
execution = runnerUtils.runOne(TENANT_ID, "io.kestra.tests", "if-condition", null,
(f, e) -> Map.of("param", "true") , Duration.ofSeconds(120));
assertThat(execution.getTaskRunList()).hasSize(2);
assertThat(execution.findTaskRunsByTaskId("when-true").getFirst().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "if-condition", null,
execution = runnerUtils.runOne(TENANT_ID, "io.kestra.tests", "if-condition", null,
(f, e) -> Map.of("param", 1) , Duration.ofSeconds(120));
assertThat(execution.getTaskRunList()).hasSize(2);

View File

@@ -98,9 +98,9 @@ public class PauseTest {
}
@Test
@LoadFlows({"flows/valids/pause_on_resume.yaml"})
@LoadFlows(value = {"flows/valids/pause_on_resume.yaml"}, tenantId = "tenant1")
void runOnResumeMissingInputs() throws Exception {
suite.runOnResumeMissingInputs(runnerUtils);
suite.runOnResumeMissingInputs("tenant1", runnerUtils);
}
@Test
@@ -110,27 +110,27 @@ public class PauseTest {
}
@Test
@LoadFlows({"flows/valids/pause-behavior.yaml"})
@LoadFlows(value = {"flows/valids/pause-behavior.yaml"}, tenantId = "resume")
void runDurationWithCONTINUEBehavior() throws Exception {
suite.runDurationWithBehavior(runnerUtils, Pause.Behavior.RESUME);
suite.runDurationWithBehavior("resume", runnerUtils, Pause.Behavior.RESUME);
}
@Test
@LoadFlows({"flows/valids/pause-behavior.yaml"})
@LoadFlows(value = {"flows/valids/pause-behavior.yaml"}, tenantId = "fail")
void runDurationWithFAILBehavior() throws Exception {
suite.runDurationWithBehavior(runnerUtils, Pause.Behavior.FAIL);
suite.runDurationWithBehavior("fail", runnerUtils, Pause.Behavior.FAIL);
}
@Test
@LoadFlows({"flows/valids/pause-behavior.yaml"})
@LoadFlows(value = {"flows/valids/pause-behavior.yaml"}, tenantId = "warn")
void runDurationWithWARNBehavior() throws Exception {
suite.runDurationWithBehavior(runnerUtils, Pause.Behavior.WARN);
suite.runDurationWithBehavior("warn", runnerUtils, Pause.Behavior.WARN);
}
@Test
@LoadFlows({"flows/valids/pause-behavior.yaml"})
@LoadFlows(value = {"flows/valids/pause-behavior.yaml"}, tenantId = "cancel")
void runDurationWithCANCELBehavior() throws Exception {
suite.runDurationWithBehavior(runnerUtils, Pause.Behavior.CANCEL);
suite.runDurationWithBehavior("cancel", runnerUtils, Pause.Behavior.CANCEL);
}
@Test
@@ -329,8 +329,8 @@ public class PauseTest {
assertThat(CharStreams.toString(new InputStreamReader(storageInterface.get(MAIN_TENANT, null, URI.create((String) outputs.get("data")))))).isEqualTo(executionId);
}
public void runOnResumeMissingInputs(RunnerUtils runnerUtils) throws Exception {
Execution execution = runnerUtils.runOneUntilPaused(MAIN_TENANT, "io.kestra.tests", "pause_on_resume", null, null, Duration.ofSeconds(30));
public void runOnResumeMissingInputs(String tenantId, RunnerUtils runnerUtils) throws Exception {
Execution execution = runnerUtils.runOneUntilPaused(tenantId, "io.kestra.tests", "pause_on_resume", null, null, Duration.ofSeconds(30));
Flow flow = flowRepository.findByExecution(execution);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.PAUSED);
@@ -365,8 +365,8 @@ public class PauseTest {
assertThat(outputs.get("asked")).isEqualTo("MISSING");
}
public void runDurationWithBehavior(RunnerUtils runnerUtils, Pause.Behavior behavior) throws Exception {
Execution execution = runnerUtils.runOneUntilPaused(MAIN_TENANT, "io.kestra.tests", "pause-behavior", null, (unused, _unused) -> Map.of("behavior", behavior), Duration.ofSeconds(30));
public void runDurationWithBehavior(String tenantId, RunnerUtils runnerUtils, Pause.Behavior behavior) throws Exception {
Execution execution = runnerUtils.runOneUntilPaused(tenantId, "io.kestra.tests", "pause-behavior", null, (unused, _unused) -> Map.of("behavior", behavior), Duration.ofSeconds(30));
String executionId = execution.getId();
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.PAUSED);

View File

@@ -105,10 +105,10 @@ class RuntimeLabelsTest {
}
@Test
@LoadFlows({"flows/valids/primitive-labels-flow.yml"})
@LoadFlows(value = {"flows/valids/primitive-labels-flow.yml"}, tenantId = "tenant1")
void primitiveTypeLabelsOverrideExistingLabels() throws TimeoutException, QueueException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
"tenant1",
"io.kestra.tests",
"primitive-labels-flow",
null,

View File

@@ -21,6 +21,8 @@ import org.junit.jupiter.api.Test;
@KestraTest(startRunner = true)
class StateTest {
public static final String FLOW_ID = "state";
public static final String NAMESPACE = "io.kestra.tests";
@Inject
private RunnerUtils runnerUtils;
@@ -30,17 +32,20 @@ class StateTest {
void set() throws TimeoutException, QueueException {
String stateName = IdUtils.create();
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "state", null, (f, e) -> ImmutableMap.of("state", stateName));
Execution execution = runnerUtils.runOne(MAIN_TENANT, NAMESPACE,
FLOW_ID, null, (f, e) -> ImmutableMap.of(FLOW_ID, stateName));
assertThat(execution.getTaskRunList()).hasSize(5);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(((Map<String, Integer>) execution.findTaskRunsByTaskId("createGet").getFirst().getOutputs().get("data")).get("value")).isEqualTo(1);
execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "state", null, (f, e) -> ImmutableMap.of("state", stateName));
execution = runnerUtils.runOne(MAIN_TENANT, NAMESPACE,
FLOW_ID, null, (f, e) -> ImmutableMap.of(FLOW_ID, stateName));
assertThat(execution.getTaskRunList()).hasSize(5);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(((Map<String, Object>) execution.findTaskRunsByTaskId("updateGet").getFirst().getOutputs().get("data")).get("value")).isEqualTo("2");
execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "state", null, (f, e) -> ImmutableMap.of("state", stateName));
execution = runnerUtils.runOne(MAIN_TENANT, NAMESPACE,
FLOW_ID, null, (f, e) -> ImmutableMap.of(FLOW_ID, stateName));
assertThat(execution.getTaskRunList()).hasSize(5);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat((Integer) execution.findTaskRunsByTaskId("deleteGet").getFirst().getOutputs().get("count")).isZero();
@@ -48,11 +53,12 @@ class StateTest {
@SuppressWarnings("unchecked")
@Test
@LoadFlows({"flows/valids/state.yaml"})
@LoadFlows(value = {"flows/valids/state.yaml"}, tenantId = "tenant1")
void each() throws TimeoutException, InternalException, QueueException {
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "state", null, (f, e) -> ImmutableMap.of("state", "each"));
assertThat(execution.getTaskRunList()).hasSize(17);
Execution execution = runnerUtils.runOne("tenant1", NAMESPACE,
FLOW_ID, null, (f, e) -> ImmutableMap.of(FLOW_ID, "each"));
assertThat(execution.getTaskRunList()).hasSize(19);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(((Map<String, String>) execution.findTaskRunByTaskIdAndValue("regetEach1", List.of("b")).getOutputs().get("data")).get("value")).isEqualTo("null-b");
assertThat(((Map<String, String>) execution.findTaskRunByTaskIdAndValue("regetEach2", List.of("b")).getOutputs().get("data")).get("value")).isEqualTo("null-a-b");

View File

@@ -21,10 +21,10 @@ class SwitchTest {
private RunnerUtils runnerUtils;
@Test
@LoadFlows({"flows/valids/switch.yaml"})
@LoadFlows(value = {"flows/valids/switch.yaml"}, tenantId = "switch")
void switchFirst() throws TimeoutException, QueueException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
"switch",
"io.kestra.tests",
"switch",
null,
@@ -37,10 +37,10 @@ class SwitchTest {
}
@Test
@LoadFlows({"flows/valids/switch.yaml"})
@LoadFlows(value = {"flows/valids/switch.yaml"}, tenantId = "second")
void switchSecond() throws TimeoutException, QueueException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
"second",
"io.kestra.tests",
"switch",
null,
@@ -54,10 +54,10 @@ class SwitchTest {
}
@Test
@LoadFlows({"flows/valids/switch.yaml"})
@LoadFlows(value = {"flows/valids/switch.yaml"}, tenantId = "third")
void switchThird() throws TimeoutException, QueueException {
Execution execution = runnerUtils.runOne(
MAIN_TENANT,
"third",
"io.kestra.tests",
"switch",
null,

View File

@@ -52,9 +52,9 @@ public class WorkingDirectoryTest {
}
@Test
@LoadFlows({"flows/valids/working-directory.yaml"})
@LoadFlows(value = {"flows/valids/working-directory.yaml"}, tenantId = "tenant1")
void failed() throws TimeoutException, QueueException {
suite.failed(runnerUtils);
suite.failed("tenant1", runnerUtils);
}
@Test
@@ -100,9 +100,9 @@ public class WorkingDirectoryTest {
}
@Test
@LoadFlows({"flows/valids/working-directory-outputs.yml"})
@LoadFlows(value = {"flows/valids/working-directory-outputs.yml"}, tenantId = "output")
void outputFiles() throws Exception {
suite.outputFiles(runnerUtils);
suite.outputFiles("output", runnerUtils);
}
@Test
@@ -132,8 +132,8 @@ public class WorkingDirectoryTest {
assertThat((String) execution.getTaskRunList().get(3).getOutputs().get("value")).startsWith("kestra://");
}
public void failed(RunnerUtils runnerUtils) throws TimeoutException, QueueException {
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "working-directory", null,
public void failed(String tenantId, RunnerUtils runnerUtils) throws TimeoutException, QueueException {
Execution execution = runnerUtils.runOne(tenantId, "io.kestra.tests", "working-directory", null,
(f, e) -> ImmutableMap.of("failed", "true"), Duration.ofSeconds(60)
);
@@ -151,9 +151,9 @@ public class WorkingDirectoryTest {
}
@SuppressWarnings("unchecked")
public void outputFiles(RunnerUtils runnerUtils) throws TimeoutException, IOException, QueueException {
public void outputFiles(String tenantId, RunnerUtils runnerUtils) throws TimeoutException, IOException, QueueException {
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "working-directory-outputs");
Execution execution = runnerUtils.runOne(tenantId, "io.kestra.tests", "working-directory-outputs");
assertThat(execution.getTaskRunList()).hasSize(2);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);

View File

@@ -23,6 +23,8 @@ import jakarta.annotation.Nullable;
import jakarta.inject.Inject;
import org.apache.commons.io.IOUtils;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import org.reactivestreams.Publisher;
import reactor.core.publisher.Mono;
@@ -42,6 +44,7 @@ import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertThrows;
@KestraTest
@Execution(ExecutionMode.SAME_THREAD)
class RequestTest {
@Inject
private TestRunContextFactory runContextFactory;

View File

@@ -13,6 +13,7 @@ import io.kestra.core.models.flows.GenericFlow;
import io.kestra.core.models.property.Property;
import io.kestra.core.repositories.FlowRepositoryInterface;
import io.kestra.core.runners.RunContext;
import io.kestra.core.storages.kv.KVEntry;
import io.kestra.core.storages.kv.KVMetadata;
import io.kestra.core.storages.kv.KVStore;
import io.kestra.core.storages.kv.KVValueAndMetadata;
@@ -24,7 +25,10 @@ import java.time.Duration;
import java.util.List;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
@Execution(ExecutionMode.SAME_THREAD)
@KestraTest
public class PurgeKVTest {
@@ -187,9 +191,9 @@ public class PurgeKVTest {
Output output = purgeKV.run(runContext);
assertThat(output.getSize()).isEqualTo(2L);
assertThat(kvStore1.get("key_1")).isEmpty();
assertThat(kvStore1.get("key_2")).isEmpty();
assertThat(kvStore1.get("not_found")).isEmpty();
List<KVEntry> kvEntries = kvStore1.listAll();
assertThat(kvEntries.size()).isEqualTo(1);
assertThat(kvEntries.getFirst().key()).isEqualTo("not_found");
}
private void addNamespaces() {

View File

@@ -5,10 +5,12 @@ import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.kv.KVType;
import io.kestra.core.models.property.Property;
import io.kestra.core.runners.RunContext;
import io.kestra.core.storages.StorageInterface;
import io.kestra.core.storages.kv.KVEntry;
import io.kestra.core.storages.kv.KVMetadata;
import io.kestra.core.storages.kv.KVStore;
import io.kestra.core.storages.kv.KVStoreException;
import io.kestra.core.storages.kv.KVValue;
import io.kestra.core.storages.kv.KVValueAndMetadata;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Assertions;
@@ -25,10 +27,6 @@ import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
class SetTest {
static final String TEST_KEY = "test-key";
@Inject
StorageInterface storageInterface;
@Inject
TestRunContextFactory runContextFactory;
@@ -47,7 +45,7 @@ class SetTest {
var value = Map.of("date", Instant.now().truncatedTo(ChronoUnit.MILLIS), "int", 1, "string", "string");
String description = "myDescription";
final RunContext runContext = TestsUtils.mockRunContext(this.runContextFactory, set, Map.of(
"key", TEST_KEY,
"key", "no_ns_key",
"value", value,
"description", description
));
@@ -57,9 +55,13 @@ class SetTest {
// Then
final KVStore kv = runContext.namespaceKv(runContext.flowInfo().namespace());
assertThat(kv.getValue(TEST_KEY)).isEqualTo(Optional.of(new KVValue(value)));
assertThat(kv.list().getFirst().expirationDate()).isNull();
assertThat(kv.list().getFirst().description()).isEqualTo(description);
Optional<KVValue> kvValueOptional = kv.getValue("no_ns_key");
assertThat(kvValueOptional).isPresent().get().isEqualTo(new KVValue(value));
Optional<KVEntry> noNsKey = kv.get("no_ns_key");
assertThat(noNsKey).isPresent();
KVEntry kvEntry = noNsKey.get();
assertThat(kvEntry.expirationDate()).isNull();
assertThat(kvEntry.description()).isEqualTo(description);
}
@Test
@@ -67,7 +69,7 @@ class SetTest {
// Given
RunContext runContext = this.runContextFactory.of("io.kestra.test", Map.of(
"inputs", Map.of(
"key", TEST_KEY,
"key", "same_ns_key",
"value", "test-value"
)
));
@@ -85,7 +87,7 @@ class SetTest {
// Then
final KVStore kv = runContext.namespaceKv("io.kestra.test");
assertThat(kv.getValue(TEST_KEY)).isEqualTo(Optional.of(new KVValue("test-value")));
assertThat(kv.getValue("same_ns_key")).isEqualTo(Optional.of(new KVValue("test-value")));
assertThat(kv.list().getFirst().expirationDate()).isNull();
}
@@ -94,7 +96,7 @@ class SetTest {
// Given
RunContext runContext = this.runContextFactory.of("io.kestra.test", Map.of(
"inputs", Map.of(
"key", TEST_KEY,
"key", "child_ns_key",
"value", "test-value"
)
));
@@ -111,7 +113,7 @@ class SetTest {
// then
final KVStore kv = runContext.namespaceKv("io.kestra");
assertThat(kv.getValue(TEST_KEY)).isEqualTo(Optional.of(new KVValue("test-value")));
assertThat(kv.getValue("child_ns_key")).isEqualTo(Optional.of(new KVValue("test-value")));
assertThat(kv.list().getFirst().expirationDate()).isNull();
}
@@ -120,7 +122,7 @@ class SetTest {
// Given
RunContext runContext = this.runContextFactory.of("io.kestra.test", Map.of(
"inputs", Map.of(
"key", TEST_KEY,
"key", "non_existing_ns_key",
"value", "test-value"
)
));
@@ -150,7 +152,7 @@ class SetTest {
var value = Map.of("date", Instant.now().truncatedTo(ChronoUnit.MILLIS), "int", 1, "string", "string");
final RunContext runContext = TestsUtils.mockRunContext(this.runContextFactory, set, Map.of(
"key", TEST_KEY,
"key", "ttl_key",
"value", value
));
@@ -159,13 +161,13 @@ class SetTest {
// Then
final KVStore kv = runContext.namespaceKv(runContext.flowInfo().namespace());
assertThat(kv.getValue(TEST_KEY)).isEqualTo(Optional.of(new KVValue(value)));
Instant expirationDate = kv.get(TEST_KEY).get().expirationDate();
assertThat(kv.getValue("ttl_key")).isEqualTo(Optional.of(new KVValue(value)));
Instant expirationDate = kv.get("ttl_key").get().expirationDate();
assertThat(expirationDate.isAfter(Instant.now().plus(Duration.ofMinutes(4))) && expirationDate.isBefore(Instant.now().plus(Duration.ofMinutes(6)))).isTrue();
}
@Test
void shouldFailGivenExistingKeyAndOverwriteFalse() {
void shouldFailGivenExistingKeyAndOverwriteFalse() throws Exception {
// Given
Set set = Set.builder()
.id(Set.class.getSimpleName())
@@ -177,45 +179,49 @@ class SetTest {
var value = Map.of("date", Instant.now().truncatedTo(ChronoUnit.MILLIS), "int", 1, "string", "string");
final RunContext runContext = TestsUtils.mockRunContext(this.runContextFactory, set, Map.of(
"key", TEST_KEY,
"key", "existing_key",
"value", value
));
// When - Then
//set key a first:
runContext.namespaceKv(runContext.flowInfo().namespace()).put("existing_key", new KVValueAndMetadata(new KVMetadata("unused", null), value));
//fail because key is already set
KVStoreException exception = Assertions.assertThrows(KVStoreException.class, () -> set.run(runContext));
assertThat(exception.getMessage()).isEqualTo("Cannot set value for key '" + TEST_KEY + "'. Key already exists and `overwrite` is set to `false`.");
assertThat(exception.getMessage()).isEqualTo("Cannot set value for key 'existing_key'. Key already exists and `overwrite` is set to `false`.");
}
@Test
void typeSpecified() throws Exception {
KVStore kv = createAndPerformSetTask("123.45", KVType.NUMBER);
assertThat(kv.getValue(TEST_KEY).orElseThrow().value()).isEqualTo(123.45);
String key = "specified_key";
KVStore kv = createAndPerformSetTask(key, "123.45", KVType.NUMBER);
assertThat(kv.getValue(key).orElseThrow().value()).isEqualTo(123.45);
kv = createAndPerformSetTask("true", KVType.BOOLEAN);
assertThat((Boolean) kv.getValue(TEST_KEY).orElseThrow().value()).isTrue();
kv = createAndPerformSetTask(key, "true", KVType.BOOLEAN);
assertThat((Boolean) kv.getValue(key).orElseThrow().value()).isTrue();
kv = createAndPerformSetTask("2023-05-02T01:02:03Z", KVType.DATETIME);
assertThat(kv.getValue(TEST_KEY).orElseThrow().value()).isEqualTo(Instant.parse("2023-05-02T01:02:03Z"));
kv = createAndPerformSetTask(key, "2023-05-02T01:02:03Z", KVType.DATETIME);
assertThat(kv.getValue(key).orElseThrow().value()).isEqualTo(Instant.parse("2023-05-02T01:02:03Z"));
kv = createAndPerformSetTask("P1DT5S", KVType.DURATION);
kv = createAndPerformSetTask(key, "P1DT5S", KVType.DURATION);
// TODO: temporary workaround until duration serialization is handled properly; durations are currently stored as bigint.
assertThat((long) Double.parseDouble(kv.getValue(TEST_KEY).orElseThrow().value().toString())).isEqualTo(Duration.ofDays(1).plus(Duration.ofSeconds(5)).toSeconds());
assertThat((long) Double.parseDouble(kv.getValue(key).orElseThrow().value().toString())).isEqualTo(Duration.ofDays(1).plus(Duration.ofSeconds(5)).toSeconds());
kv = createAndPerformSetTask("[{\"some\":\"value\"},{\"another\":\"value\"}]", KVType.JSON);
assertThat(kv.getValue(TEST_KEY).orElseThrow().value()).isEqualTo(List.of(Map.of("some", "value"), Map.of("another", "value")));
kv = createAndPerformSetTask(key, "[{\"some\":\"value\"},{\"another\":\"value\"}]", KVType.JSON);
assertThat(kv.getValue(key).orElseThrow().value()).isEqualTo(List.of(Map.of("some", "value"), Map.of("another", "value")));
kv = createAndPerformSetTask("{{ 200 }}", KVType.STRING);
assertThat(kv.getValue(TEST_KEY).orElseThrow().value()).isEqualTo("200");
kv = createAndPerformSetTask(key, "{{ 200 }}", KVType.STRING);
assertThat(kv.getValue(key).orElseThrow().value()).isEqualTo("200");
kv = createAndPerformSetTask("{{ 200.1 }}", KVType.STRING);
assertThat(kv.getValue(TEST_KEY).orElseThrow().value()).isEqualTo("200.1");
kv = createAndPerformSetTask(key, "{{ 200.1 }}", KVType.STRING);
assertThat(kv.getValue(key).orElseThrow().value()).isEqualTo("200.1");
}
private KVStore createAndPerformSetTask(String value, KVType type) throws Exception {
private KVStore createAndPerformSetTask(String key, String value, KVType type) throws Exception {
Set set = Set.builder()
.id(Set.class.getSimpleName())
.type(Set.class.getName())
.key(Property.ofValue(TEST_KEY))
.key(Property.ofValue(key))
.value(value.contains("{{") ? Property.ofExpression(value) : Property.ofValue(value))
.kvType(Property.ofValue(type))
.build();

View File

@@ -10,6 +10,7 @@ import jakarta.inject.Inject;
import java.time.temporal.ChronoUnit;
import java.util.stream.Stream;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.parallel.ExecutionMode;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.Arguments;
import org.junit.jupiter.params.provider.MethodSource;
@@ -51,7 +52,7 @@ class PurgeLogsTest {
assertThat((int) execution.getTaskRunList().getFirst().getOutputs().get("count")).isPositive();
}
@org.junit.jupiter.api.parallel.Execution(ExecutionMode.SAME_THREAD)
@ParameterizedTest
@MethodSource("buildArguments")
@LoadFlows("flows/valids/purge_logs_full_arguments.yaml")

Some files were not shown because too many files have changed in this diff Show More