Compare commits

..

66 Commits

Author SHA1 Message Date
YannC.
baa07dd02b fix: disabled flaky test shouldGetReport 2025-09-30 13:16:48 +02:00
github-actions[bot]
260cb50651 chore(version): update to version '1.0.3' 2025-09-30 07:07:34 +00:00
YannC
0a45325c69 fix(ui): avoid having an authentication dialog open when credentials are wrong (#11576) 2025-09-30 09:00:55 +02:00
Florian Hussonnois
c2522e2544 fix(triggers): do not resolve recoverMissedSchedule when enabling back a trigger
Add some refactoring to allow some methods to be overridden
2025-09-29 20:43:35 +02:00
Florian Hussonnois
27476279ae fix(triggers): handle RecoverMissedSchedules on trigger batch update
* Fix and clean code in TriggerController
* Remove duplicate code in Trigger class
2025-09-29 20:43:34 +02:00
YannC.
3cc6372cb5 fix: missing import due to backport 2025-09-29 18:09:25 +02:00
YannC
5f6e9dbe06 fix(dashboard): show startDate instead of duration in defaults, and avoid formatting date in JDBC if there are no aggregations (#11467)
close #5867
2025-09-29 17:51:36 +02:00
yuri1969
5078ce741d fix(core): enable runIf at execution updating tasks 2025-09-25 14:46:08 +02:00
github-actions[bot]
b7e17b7114 chore(version): update to version '1.0.2' 2025-09-24 08:03:43 +00:00
nKwiatkowski
acaee34b0e chore(version): update to version '1.0.1' 2025-09-24 10:03:23 +02:00
github-actions[bot]
1d78332505 chore(version): update to version '1.0.2' 2025-09-24 08:02:25 +00:00
nKwiatkowski
7249632510 fix(tests): disable flaky test that prevents the release 2025-09-24 10:01:43 +02:00
Sanjay Ramsinghani
4a66a08c3b chore(core): align toggle icon in failed execution collapse element (#11430)
Closes https://github.com/kestra-io/kestra/issues/11406.

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-23 14:20:10 +02:00
Antoine Gauthier
22fd6e97ea chore(logs): display copy button only on row hover (#11254)
Closes https://github.com/kestra-io/kestra/issues/11220.

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-23 14:18:34 +02:00
Jaem Dessources
9afd86d32b fix(core): align copy logs button to each row’s right edge (#11216)
Closes https://github.com/kestra-io/kestra/issues/10898.

Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-23 14:18:28 +02:00
github-actions[bot]
797ea6c9e4 chore(version): update to version '1.0.2' 2025-09-23 12:10:01 +00:00
nKwiatkowski
07d5e815c4 chore(version): update to version '1.0.1' 2025-09-23 14:09:38 +02:00
github-actions[bot]
33ac9b1495 chore(version): update to version '1.0.2' 2025-09-23 09:22:01 +00:00
Bart Ledoux
4d5b95d040 chore: update package-lock 2025-09-23 11:17:48 +02:00
brian-mulier-p
667aca7345 fix(ai): avoid moving cursor twice after using AI Copilot (#11451)
closes #11314
2025-09-23 10:40:32 +02:00
brian.mulier
e05cc65202 fix(system): avoid trigger locking after scheduler restart
closes #11434
2025-09-22 18:40:22 +02:00
brian.mulier
71b606c27c fix(ci): same CI as develop 2025-09-22 18:40:19 +02:00
Florian Hussonnois
47f9f12ce8 chore(webserver): make kvStore method in KVController protected
Related-to: kestra-io/kestra-ee#5055
2025-09-22 13:57:59 +02:00
Florian Hussonnois
01acae5e97 feat(core): add new findMetadataAndValue to KVStore
Related-to: kestra-io/kestra-ee#5055
2025-09-22 13:57:58 +02:00
Florian Hussonnois
e5878f08b7 fix(core): fix NPE in JackMapping.applyPatchesOnJsonNode method 2025-09-22 13:57:57 +02:00
brian-mulier-p
0bcb6b4e0d fix(tests): enforce closing consumers after each test (#11399) 2025-09-19 16:35:23 +02:00
brian-mulier-p
3c2ecf4342 fix(core): avoid ClassCastException when doing secret decryption (#11393)
closes kestra-io/kestra-ee#5191
2025-09-19 11:32:27 +02:00
Piyush Bhaskar
3d4f66772e fix(core): webhook curl command needs tenant. 2025-09-19 14:17:00 +05:30
Sandip Mandal
e2afd4bcc3 fix(core): webhook curl command needs tenant. (#11391)
Co-authored-by: Piyush Bhaskar <102078527+Piyush-r-bhaskar@users.noreply.github.com>
Co-authored-by: Miloš Paunović <paun992@hotmail.com>
2025-09-19 14:10:36 +05:30
Loïc Mathieu
d143097f03 fix(executions): computing subflow outputs could fail when the execution is failing or killing
Fixes https://github.com/kestra-io/kestra/issues/11379
2025-09-18 17:42:15 +02:00
Loïc Mathieu
72c0d91c1a fix(executions): concurrency limit should update the execution
If it's not updated in the database, it would not be detected as changed, so terminal actions (like purge) would not be performed.

Fixes #11022
Fixes #11025
Fixes #8143
2025-09-18 12:10:36 +02:00
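
A minimal sketch of the idea behind this fix, using hypothetical types rather than Kestra's actual API: a terminal action such as purge only fires when the stored row is detected as changed, so an execution held by a concurrency limit must still be written back to the database.

```java
// Hypothetical illustration only — not Kestra's Execution/repository classes.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

class ConcurrencyLimitSketch {
    enum State { RUNNING, QUEUED, SUCCESS }
    record Execution(String id, State state) {}

    private final Map<String, Execution> store = new ConcurrentHashMap<>(); // stand-in for the database

    void handle(Execution incoming) {
        Execution stored = store.get(incoming.id());
        boolean changed = stored == null || stored.state() != incoming.state();
        store.put(incoming.id(), incoming);     // persist the new state in all cases
        if (changed && incoming.state() == State.SUCCESS) {
            purge(incoming);                    // terminal action only runs when a change was detected
        }
    }

    void purge(Execution execution) {
        store.remove(execution.id());
    }
}
```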
Loïc Mathieu
1d692e56b0 fix(executions): the Exit task did not correctly end parent tasks
Fixes https://github.com/kestra-io/kestra-ee/issues/5168
2025-09-18 11:39:16 +02:00
Miloš Paunović
0352d617ac chore(core): improve coloring scheme for dependencies graph (#11306) 2025-09-18 09:22:27 +02:00
Miloš Paunović
b41aa4e0b9 fix(core): adjust positioning of default tour elements (#11286)
The problem occurred when `No Code` was selected as the `Default Editor Type` in `Settings`. This `PR` resolves the issue.

Closes https://github.com/kestra-io/kestra/issues/9556.
2025-09-18 09:21:40 +02:00
Miloš Paunović
d811dc030b chore(core): ensure editor suggestion widget renders above other elements (#11258)
Closes https://github.com/kestra-io/kestra/issues/10702.
Closes https://github.com/kestra-io/kestra/issues/11033.
2025-09-18 09:21:18 +02:00
Miloš Paunović
105e62eee1 fix(namespaces): open details page at top (#11221)
Closes https://github.com/kestra-io/kestra/issues/10536.
2025-09-18 09:20:55 +02:00
Loïc Mathieu
28796862a4 fix(executions): possible NPE on dynamic taskrun
Fixes https://github.com/kestra-io/kestra-ee/issues/5166
2025-09-17 15:56:28 +02:00
brian.mulier
637cd794a4 fix(core): filters weren't applying anymore 2025-09-17 12:57:47 +02:00
Miloš Paunović
fdd5c6e63d chore(core): remove unused decompress library (#11346) 2025-09-17 11:15:43 +02:00
brian.mulier
eda2483ec9 fix(core): avoid filters overlapping on other pages when changing query params 2025-09-17 10:37:58 +02:00
brian.mulier
7b3c296489 fix(core): avoid clearing filters when reclicking on current left menu item
closes #9476
2025-09-17 10:37:56 +02:00
brian.mulier
fe6f8b4ed9 fix(core): avoid undefined error on refresh chart 2025-09-17 10:37:04 +02:00
Roman Acevedo
17ff539690 ci: fix some non-release workflows that were not using develop 2025-09-16 14:43:24 +02:00
Roman Acevedo
bbd0dda47e ci: re-add workflow-publish-docker.yml needed for release 2025-09-16 12:16:15 +02:00
github-actions[bot]
27a8e8b5a7 chore(version): update to version '1.0.1' 2025-09-16 10:00:39 +00:00
Roman Acevedo
d6620a34cd ci: try to use develop CI workflows 2025-09-16 11:38:34 +02:00
Loïc Mathieu
6f8b3c5cfd fix(flows): properly compute flow dependencies with preconditions
When both upstream flows and where are set, it should be an AND between the two, as dependencies must match the upstream flows.

Fixes #11164
2025-09-16 10:44:26 +02:00
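
A short sketch of the combination rule described above, with hypothetical predicate names: when both an upstream-flow filter and a `where` condition are configured, a dependency must satisfy both, not either one.

```java
// Sketch only — names are illustrative, not Kestra's precondition model.
import java.util.List;
import java.util.function.Predicate;

class PreconditionSketch {
    record Dependency(String upstreamFlowId, boolean whereMatches) {}

    static List<Dependency> resolve(List<Dependency> candidates, List<String> upstreamFlows) {
        Predicate<Dependency> matchesUpstream = d -> upstreamFlows.contains(d.upstreamFlowId());
        Predicate<Dependency> matchesWhere = Dependency::whereMatches;
        // AND between the two filters: dependencies must match the upstream flows *and* the where clause
        return candidates.stream().filter(matchesUpstream.and(matchesWhere)).toList();
    }
}
```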
Florian Hussonnois
6da6cbab60 fix(executions): add missing CrudEvent on purge execution
Related-to: kestra-io/kestra-ee#5061
2025-09-16 10:30:53 +02:00
Loïc Mathieu
a899e16178 fix(system): allow flattening a map with duplicated keys 2025-09-16 10:25:25 +02:00
Florian Hussonnois
568cd0b0c7 fix(core): fix CrudEvent model for DELETE operation
Refactor XxxRepository class to use new factory methods
from the CrudEvent class

Related-to: kestra-io/kestra-ee#4727
2025-09-15 18:51:36 +02:00
Loïc Mathieu
92e1dcb6eb fix(executions): truncate the execution_running table as in 0.24 there was an issue in the purge
This table contains currently running executions for flows that have a concurrency limit.
It was added in 0.24, but that release had a bug that could prevent some records from being correctly removed from this table.
To fix that, we truncate it once.
2025-09-15 17:30:08 +02:00
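
For illustration only, a one-shot cleanup over plain JDBC; the table name comes from the commit message, and Kestra performs this through its own migration tooling rather than ad-hoc code like this.

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import javax.sql.DataSource;

class TruncateExecutionRunning {
    static void truncateOnce(DataSource dataSource) throws SQLException {
        try (Connection connection = dataSource.getConnection();
             Statement statement = connection.createStatement()) {
            // remove the stale rows left behind by the 0.24 purge bug
            statement.executeUpdate("TRUNCATE TABLE execution_running");
        }
    }
}
```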
brian-mulier-p
499e040cd0 fix(test): add tenant-in-path storage test (#11292)
part of kestra-io/storage-s3#166
2025-09-15 16:53:56 +02:00
brian-mulier-p
5916831d62 fix(security): enhance basic auth security (#11285)
closes kestra-io/kestra-ee#5111
2025-09-15 16:28:16 +02:00
Bart Ledoux
0b1b55957e fix: remove last uses of vuex as a store 2025-09-12 16:23:25 +02:00
Bart Ledoux
7ee40d376a flows: clear tasks list when last task is deleted 2025-09-12 16:15:36 +02:00
Florian Hussonnois
e2c9b3e256 fix(core): make CRC32 for plugin JARs lazy
Make CRC32 calculation for plugin JAR files lazy
to avoid excessive startup time and performance impact.

Avoid byte buffer reallocation while computing CRC32.
2025-09-12 14:02:23 +02:00
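
A sketch of the two ideas named in this commit, not the actual Kestra implementation: compute a plugin JAR's CRC32 only when first requested, and reuse a single fixed buffer instead of reallocating while streaming the file.

```java
// Illustrative lazy CRC32 holder — assumes nothing about Kestra's plugin registry classes.
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.CRC32;

class LazyJarCrc32 {
    private final Path jar;
    private volatile Long crc32; // computed at most once, on first access

    LazyJarCrc32(Path jar) {
        this.jar = jar;
    }

    long crc32() {
        Long value = crc32;
        if (value == null) {
            synchronized (this) {
                if (crc32 == null) {
                    crc32 = compute();
                }
                value = crc32;
            }
        }
        return value;
    }

    private long compute() {
        CRC32 checksum = new CRC32();
        byte[] buffer = new byte[8192]; // single buffer, reused for the whole file
        try (InputStream in = Files.newInputStream(jar)) {
            int read;
            while ((read = in.read(buffer)) != -1) {
                checksum.update(buffer, 0, read);
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return checksum.getValue();
    }
}
```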
brian-mulier-p
556730777b fix(core): add ability to remap sort keys (#11233)
part of kestra-io/kestra-ee#5075
2025-09-12 09:44:32 +02:00
brian.mulier
c1a75a431f fix(ai): increase maxOutputToken default 2025-09-11 18:24:21 +02:00
brian-mulier-p
4a5b91667a fix(flows): avoid failing flow dependencies with dynamic defaults (#11166)
closes #11117
2025-09-10 16:15:04 +02:00
Roman Acevedo
f7b2af16a1 fix(flows): topology would not load when having many flows and cyclic relations
- this will probably fix https://github.com/kestra-io/kestra-ee/issues/4980

The issue was that recursiveFlowTopology returned a lot of duplicates, which was aggravated when having many Flows and multiple Flow triggers.
2025-09-10 16:14:41 +02:00
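
A hypothetical sketch of the deduplication idea: collapse the duplicate relations a recursive topology walk can produce before building the graph. Names are illustrative only.

```java
import java.util.List;

class TopologySketch {
    record Relation(String sourceFlowUid, String destinationFlowUid) {}

    static List<Relation> deduplicate(List<Relation> recursiveRelations) {
        // a record gives value-based equals/hashCode, so distinct() drops the duplicates
        return recursiveRelations.stream().distinct().toList();
    }
}
```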
Loïc Mathieu
9351cb22e0 fix(system): always load Netty from the app classloader
Netty is used in core and in many plugins, and we already load Project Reactor, which depends on Netty, from the app classloader.

Fixes https://github.com/kestra-io/kestra-ee/issues/5038
2025-09-10 10:51:31 +02:00
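
For illustration, a child-first plugin classloader that still forces `io.netty.*` classes to come from the application (parent) classloader, so core, plugins, and Project Reactor all share one Netty. This is a sketch of the delegation idea, not Kestra's actual plugin classloader.

```java
import java.net.URL;
import java.net.URLClassLoader;

class PluginClassLoaderSketch extends URLClassLoader {
    PluginClassLoaderSketch(URL[] urls, ClassLoader appClassLoader) {
        super(urls, appClassLoader);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
        synchronized (getClassLoadingLock(name)) {
            if (name.startsWith("io.netty.")) {
                return getParent().loadClass(name); // parent-first for Netty, always
            }
            Class<?> loaded = findLoadedClass(name);
            if (loaded == null) {
                try {
                    loaded = findClass(name);        // child-first for plugin classes
                } catch (ClassNotFoundException e) {
                    loaded = getParent().loadClass(name);
                }
            }
            if (resolve) {
                resolveClass(loaded);
            }
            return loaded;
        }
    }
}
```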
brian-mulier-p
b1ecb82fdc fix(namespaces): avoid adding 'company.team' as default ns (#11174)
closes #11168
2025-09-09 17:14:27 +02:00
Miloš Paunović
c6d56151eb chore(flows): display correct flow dependency count (#11169)
Closes https://github.com/kestra-io/kestra/issues/11127.
2025-09-09 13:57:00 +02:00
François Delbrayelle
ed4398467a fix(outputs): open external file was not working (#11154) 2025-09-09 09:46:02 +02:00
brian-mulier-p
c51947419a chore(ci): add LTS tagging (#11131) 2025-09-08 14:10:53 +02:00
github-actions[bot]
ccb6a1f4a7 chore(version): update to version 'v1.0.0'. 2025-09-08 08:00:59 +00:00
568 changed files with 9810 additions and 9743 deletions

View File

@@ -4,7 +4,6 @@ on:
pull_request:
branches:
- develop
- releases/*
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-pr

View File

@@ -33,10 +33,10 @@
<p align="center">
<a href="https://go.kestra.io/video/product-overview" target="_blank">
<img src="https://kestra.io/startvideo.png" alt="Get started in 3 minutes with Kestra" width="640px" />
<img src="https://kestra.io/startvideo.png" alt="Get started in 4 minutes with Kestra" width="640px" />
</a>
</p>
<p align="center" style="color:grey;"><i>Click on the image to learn how to get started with Kestra in 3 minutes.</i></p>
<p align="center" style="color:grey;"><i>Click on the image to learn how to get started with Kestra in 4 minutes.</i></p>
## 🌟 What is Kestra?

View File

@@ -32,12 +32,12 @@ plugins {
// release
id 'net.researchgate.release' version '3.1.0'
id "com.gorylenko.gradle-git-properties" version "2.5.3"
id "com.gorylenko.gradle-git-properties" version "2.5.2"
id 'signing'
id "com.vanniktech.maven.publish" version "0.34.0"
// OWASP dependency check
id "org.owasp.dependencycheck" version "12.1.5" apply false
id "org.owasp.dependencycheck" version "12.1.3" apply false
}
idea {
@@ -168,9 +168,8 @@ allprojects {
/**********************************************************************************************************************\
* Test
**********************************************************************************************************************/
subprojects {subProj ->
if (subProj.name != 'platform' && subProj.name != 'jmh-benchmarks') {
subprojects {
if (it.name != 'platform' && it.name != 'jmh-benchmarks') {
apply plugin: "com.adarshr.test-logger"
java {
@@ -208,13 +207,6 @@ subprojects {subProj ->
test {
useJUnitPlatform()
reports {
junitXml.required = true
junitXml.outputPerTestCase = true
junitXml.mergeReruns = true
junitXml.includeSystemErrLog = true;
junitXml.outputLocation = layout.buildDirectory.dir("test-results/test")
}
// set Xmx for test workers
maxHeapSize = '4g'
@@ -230,15 +222,6 @@ subprojects {subProj ->
environment 'SECRET_PASSWORD', "cGFzc3dvcmQ="
environment 'ENV_TEST1', "true"
environment 'ENV_TEST2', "Pass by env"
if (subProj.name == 'core' || subProj.name == 'jdbc-h2' || subProj.name == 'jdbc-mysql' || subProj.name == 'jdbc-postgres') {
// JUnit 5 parallel settings
systemProperty 'junit.jupiter.execution.parallel.enabled', 'true'
systemProperty 'junit.jupiter.execution.parallel.mode.default', 'concurrent'
systemProperty 'junit.jupiter.execution.parallel.mode.classes.default', 'same_thread'
systemProperty 'junit.jupiter.execution.parallel.config.strategy', 'dynamic'
}
}
testlogger {

View File

@@ -40,6 +40,5 @@ dependencies {
implementation project(":worker")
//test
testImplementation project(':tests')
testImplementation "org.wiremock:wiremock-jetty12"
}

View File

@@ -49,7 +49,7 @@ import java.util.concurrent.Callable;
@Introspected
public class App implements Callable<Integer> {
public static void main(String[] args) {
execute(App.class, new String [] { Environment.CLI }, args);
execute(App.class, args);
}
@Override
@@ -57,13 +57,13 @@ public class App implements Callable<Integer> {
return PicocliRunner.call(App.class, "--help");
}
protected static void execute(Class<?> cls, String[] environments, String... args) {
protected static void execute(Class<?> cls, String... args) {
// Log Bridge
SLF4JBridgeHandler.removeHandlersForRootLogger();
SLF4JBridgeHandler.install();
// Init ApplicationContext
ApplicationContext applicationContext = App.applicationContext(cls, environments, args);
ApplicationContext applicationContext = App.applicationContext(cls, args);
// Call Picocli command
int exitCode = 0;
@@ -80,7 +80,6 @@ public class App implements Callable<Integer> {
System.exit(Objects.requireNonNullElse(exitCode, 0));
}
/**
* Create an {@link ApplicationContext} with additional properties based on configuration files (--config) and
* forced Properties from current command.
@@ -89,13 +88,12 @@ public class App implements Callable<Integer> {
* @return the application context created
*/
protected static ApplicationContext applicationContext(Class<?> mainClass,
String[] environments,
String[] args) {
ApplicationContextBuilder builder = ApplicationContext
.builder()
.mainClass(mainClass)
.environments(environments);
.environments(Environment.CLI);
CommandLine cmd = new CommandLine(mainClass, CommandLine.defaultFactory());
continueOnParsingErrors(cmd);

View File

@@ -2,27 +2,19 @@ package io.kestra.cli.commands.servers;
import io.kestra.cli.AbstractCommand;
import io.kestra.core.contexts.KestraContext;
import lombok.extern.slf4j.Slf4j;
import jakarta.annotation.PostConstruct;
import picocli.CommandLine;
@Slf4j
public abstract class AbstractServerCommand extends AbstractCommand implements ServerCommandInterface {
abstract public class AbstractServerCommand extends AbstractCommand implements ServerCommandInterface {
@CommandLine.Option(names = {"--port"}, description = "The port to bind")
Integer serverPort;
@Override
public Integer call() throws Exception {
log.info("Machine information: {} available cpu(s), {}MB max memory, Java version {}", Runtime.getRuntime().availableProcessors(), maxMemoryInMB(), Runtime.version());
this.shutdownHook(true, () -> KestraContext.getContext().shutdown());
return super.call();
}
private long maxMemoryInMB() {
return Runtime.getRuntime().maxMemory() / 1024 / 1024;
}
protected static int defaultWorkerThread() {
return Runtime.getRuntime().availableProcessors() * 8;
}

View File

@@ -262,8 +262,6 @@ public class FileChangedEventListener {
}
private String getTenantIdFromPath(Path path) {
// FIXME there is probably a bug here when a tenant has '_' in its name,
// a valid tenant name is defined with following regex: "^[a-z0-9][a-z0-9_-]*"
return path.getFileName().toString().split("_")[0];
}
}
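
A worked example of the FIXME above: splitting on `_` truncates tenant ids that themselves contain an underscore, even though the tenant regex "^[a-z0-9][a-z0-9_-]*" allows it. The file name below is hypothetical.

```java
class TenantFromPathExample {
    public static void main(String[] args) {
        String fileName = "my_tenant_some-flow.yaml"; // hypothetical watched file name
        String tenantId = fileName.split("_")[0];
        System.out.println(tenantId); // prints "my" instead of the intended "my_tenant"
    }
}
```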

View File

@@ -169,6 +169,7 @@ kestra:
- "/api/v1/executions/webhook/"
- "/api/v1/main/executions/webhook/"
- "/api/v1/*/executions/webhook/"
- "/api/v1/basicAuthValidationErrors"
preview:
initial-rows: 100

View File

@@ -37,7 +37,7 @@ class AppTest {
final String[] args = new String[]{"server", serverType, "--help"};
try (ApplicationContext ctx = App.applicationContext(App.class, new String [] { Environment.CLI }, args)) {
try (ApplicationContext ctx = App.applicationContext(App.class, args)) {
new CommandLine(App.class, new MicronautFactory(ctx)).execute(args);
assertTrue(ctx.getProperty("kestra.server-type", ServerType.class).isEmpty());
@@ -52,7 +52,7 @@ class AppTest {
final String[] argsWithMissingParams = new String[]{"flow", "namespace", "update"};
try (ApplicationContext ctx = App.applicationContext(App.class, new String [] { Environment.CLI }, argsWithMissingParams)) {
try (ApplicationContext ctx = App.applicationContext(App.class, argsWithMissingParams)) {
new CommandLine(App.class, new MicronautFactory(ctx)).execute(argsWithMissingParams);
assertThat(out.toString()).startsWith("Missing required parameters: ");

View File

@@ -4,11 +4,11 @@ import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.GenericFlow;
import io.kestra.core.repositories.FlowRepositoryInterface;
import io.kestra.core.utils.Await;
import io.kestra.core.utils.TestsUtils;
import io.micronaut.test.extensions.junit5.annotation.MicronautTest;
import jakarta.inject.Inject;
import org.apache.commons.io.FileUtils;
import org.junit.jupiter.api.*;
import org.junitpioneer.jupiter.RetryingTest;
import java.io.IOException;
import java.nio.file.Files;
@@ -18,8 +18,8 @@ import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeoutException;
import java.util.concurrent.atomic.AtomicBoolean;
import org.junitpioneer.jupiter.RetryingTest;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static io.kestra.core.utils.Rethrow.throwRunnable;
import static org.assertj.core.api.Assertions.assertThat;
@@ -57,11 +57,10 @@ class FileChangedEventListenerTest {
}
}
@Test
@RetryingTest(5) // Flaky on CI but always pass locally
void test() throws IOException, TimeoutException {
var tenant = TestsUtils.randomTenant(FileChangedEventListenerTest.class.getSimpleName(), "test");
// remove the flow if it already exists
flowRepository.findByIdWithSource(tenant, "io.kestra.tests.watch", "myflow").ifPresent(flow -> flowRepository.delete(flow));
flowRepository.findByIdWithSource(MAIN_TENANT, "io.kestra.tests.watch", "myflow").ifPresent(flow -> flowRepository.delete(flow));
// create a basic flow
String flow = """
@@ -74,14 +73,14 @@ class FileChangedEventListenerTest {
message: Hello World! 🚀
""";
GenericFlow genericFlow = GenericFlow.fromYaml(tenant, flow);
GenericFlow genericFlow = GenericFlow.fromYaml(MAIN_TENANT, flow);
Files.write(Path.of(FILE_WATCH + "/" + genericFlow.uidWithoutRevision() + ".yaml"), flow.getBytes());
Await.until(
() -> flowRepository.findById(tenant, "io.kestra.tests.watch", "myflow").isPresent(),
() -> flowRepository.findById(MAIN_TENANT, "io.kestra.tests.watch", "myflow").isPresent(),
Duration.ofMillis(100),
Duration.ofSeconds(10)
);
Flow myflow = flowRepository.findById(tenant, "io.kestra.tests.watch", "myflow").orElseThrow();
Flow myflow = flowRepository.findById(MAIN_TENANT, "io.kestra.tests.watch", "myflow").orElseThrow();
assertThat(myflow.getTasks()).hasSize(1);
assertThat(myflow.getTasks().getFirst().getId()).isEqualTo("hello");
assertThat(myflow.getTasks().getFirst().getType()).isEqualTo("io.kestra.plugin.core.log.Log");
@@ -89,17 +88,16 @@ class FileChangedEventListenerTest {
// delete the flow
Files.delete(Path.of(FILE_WATCH + "/" + genericFlow.uidWithoutRevision() + ".yaml"));
Await.until(
() -> flowRepository.findById(tenant, "io.kestra.tests.watch", "myflow").isEmpty(),
() -> flowRepository.findById(MAIN_TENANT, "io.kestra.tests.watch", "myflow").isEmpty(),
Duration.ofMillis(100),
Duration.ofSeconds(10)
);
}
@RetryingTest(2)
@RetryingTest(5) // Flaky on CI but always pass locally
void testWithPluginDefault() throws IOException, TimeoutException {
var tenant = TestsUtils.randomTenant(FileChangedEventListenerTest.class.getName(), "testWithPluginDefault");
// remove the flow if it already exists
flowRepository.findByIdWithSource(tenant, "io.kestra.tests.watch", "pluginDefault").ifPresent(flow -> flowRepository.delete(flow));
flowRepository.findByIdWithSource(MAIN_TENANT, "io.kestra.tests.watch", "pluginDefault").ifPresent(flow -> flowRepository.delete(flow));
// create a flow with plugin default
String pluginDefault = """
@@ -115,14 +113,14 @@ class FileChangedEventListenerTest {
values:
message: Hello World!
""";
GenericFlow genericFlow = GenericFlow.fromYaml(tenant, pluginDefault);
GenericFlow genericFlow = GenericFlow.fromYaml(MAIN_TENANT, pluginDefault);
Files.write(Path.of(FILE_WATCH + "/" + genericFlow.uidWithoutRevision() + ".yaml"), pluginDefault.getBytes());
Await.until(
() -> flowRepository.findById(tenant, "io.kestra.tests.watch", "pluginDefault").isPresent(),
() -> flowRepository.findById(MAIN_TENANT, "io.kestra.tests.watch", "pluginDefault").isPresent(),
Duration.ofMillis(100),
Duration.ofSeconds(10)
);
Flow pluginDefaultFlow = flowRepository.findById(tenant, "io.kestra.tests.watch", "pluginDefault").orElseThrow();
Flow pluginDefaultFlow = flowRepository.findById(MAIN_TENANT, "io.kestra.tests.watch", "pluginDefault").orElseThrow();
assertThat(pluginDefaultFlow.getTasks()).hasSize(1);
assertThat(pluginDefaultFlow.getTasks().getFirst().getId()).isEqualTo("helloWithDefault");
assertThat(pluginDefaultFlow.getTasks().getFirst().getType()).isEqualTo("io.kestra.plugin.core.log.Log");
@@ -130,7 +128,7 @@ class FileChangedEventListenerTest {
// delete both files
Files.delete(Path.of(FILE_WATCH + "/" + genericFlow.uidWithoutRevision() + ".yaml"));
Await.until(
() -> flowRepository.findById(tenant, "io.kestra.tests.watch", "pluginDefault").isEmpty(),
() -> flowRepository.findById(MAIN_TENANT, "io.kestra.tests.watch", "pluginDefault").isEmpty(),
Duration.ofMillis(100),
Duration.ofSeconds(10)
);

View File

@@ -84,7 +84,7 @@ dependencies {
testImplementation "org.testcontainers:testcontainers:1.21.3"
testImplementation "org.testcontainers:junit-jupiter:1.21.3"
testImplementation "org.bouncycastle:bcpkix-jdk18on"
testImplementation "org.bouncycastle:bcpkix-jdk18on:1.81"
testImplementation "org.wiremock:wiremock-jetty12"
}

View File

@@ -2,13 +2,12 @@ package io.kestra.core.models;
import io.kestra.core.utils.MapUtils;
import jakarta.annotation.Nullable;
import jakarta.validation.constraints.NotEmpty;
import jakarta.validation.constraints.NotNull;
import java.util.*;
import java.util.function.Predicate;
import java.util.stream.Collectors;
public record Label(@NotEmpty String key, @NotEmpty String value) {
public record Label(@NotNull String key, @NotNull String value) {
public static final String SYSTEM_PREFIX = "system.";
// system labels
@@ -42,7 +41,7 @@ public record Label(@NotEmpty String key, @NotEmpty String value) {
public static Map<String, String> toMap(@Nullable List<Label> labels) {
if (labels == null || labels.isEmpty()) return Collections.emptyMap();
return labels.stream()
.filter(label -> label.value() != null && !label.value().isEmpty() && label.key() != null && !label.key().isEmpty())
.filter(label -> label.value() != null && label.key() != null)
// using an accumulator in case labels with the same key exists: the second is kept
.collect(Collectors.toMap(Label::key, Label::value, (first, second) -> second, LinkedHashMap::new));
}
@@ -57,7 +56,6 @@ public record Label(@NotEmpty String key, @NotEmpty String value) {
public static List<Label> deduplicate(@Nullable List<Label> labels) {
if (labels == null || labels.isEmpty()) return Collections.emptyList();
return toMap(labels).entrySet().stream()
.filter(getEntryNotEmptyPredicate())
.map(entry -> new Label(entry.getKey(), entry.getValue()))
.collect(Collectors.toCollection(ArrayList::new));
}
@@ -72,7 +70,6 @@ public record Label(@NotEmpty String key, @NotEmpty String value) {
if (map == null || map.isEmpty()) return List.of();
return map.entrySet()
.stream()
.filter(getEntryNotEmptyPredicate())
.map(entry -> new Label(entry.getKey(), entry.getValue()))
.toList();
}
@@ -91,14 +88,4 @@ public record Label(@NotEmpty String key, @NotEmpty String value) {
}
return map;
}
/**
* Provides predicate for not empty entries.
*
* @return The non-empty filter
*/
public static Predicate<Map.Entry<String, String>> getEntryNotEmptyPredicate() {
return entry -> entry.getKey() != null && !entry.getKey().isEmpty() &&
entry.getValue() != null && !entry.getValue().isEmpty();
}
}
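
A standalone illustration of the accumulator comment in the hunk above: with Collectors.toMap and the merge function `(first, second) -> second`, a duplicated key keeps the last value, while LinkedHashMap preserves insertion order. The Label record here is a local stand-in.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

class LabelToMapExample {
    record Label(String key, String value) {}

    public static void main(String[] args) {
        List<Label> labels = List.of(new Label("env", "dev"), new Label("team", "data"), new Label("env", "prod"));
        Map<String, String> map = labels.stream()
            .collect(Collectors.toMap(Label::key, Label::value, (first, second) -> second, LinkedHashMap::new));
        System.out.println(map); // {env=prod, team=data}
    }
}
```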

View File

@@ -1,33 +1,16 @@
package io.kestra.core.models;
import io.swagger.v3.oas.annotations.media.Schema;
import jakarta.validation.Valid;
import jakarta.validation.constraints.Pattern;
import java.util.List;
import java.util.Map;
/**
* Interface that can be implemented by classes supporting plugin versioning.
*
* @see Plugin
*/
public interface PluginVersioning {
String TITLE = "Plugin Version";
String DESCRIPTION = """
Defines the version of the plugin to use.
The version must follow the Semantic Versioning (SemVer) specification:
- A single-digit MAJOR version (e.g., `1`).
- A MAJOR.MINOR version (e.g., `1.1`).
- A MAJOR.MINOR.PATCH version, optionally with any qualifier
(e.g., `1.1.2`, `1.1.0-SNAPSHOT`).
""";
@Schema(
title = TITLE,
description = DESCRIPTION
)
@Pattern(regexp="\\d+\\.\\d+\\.\\d+(-[a-zA-Z0-9-]+)?|([a-zA-Z0-9]+)")
@Schema(title = "The version of the plugin to use.")
String getVersion();
}

View File

@@ -254,7 +254,19 @@ public record QueryFilter(
*
* @return List of {@code ResourceField} with resource names, fields, and operations.
*/
public static List<ResourceField> asResourceList() {
return Arrays.stream(values())
.map(Resource::toResourceField)
.toList();
}
private static ResourceField toResourceField(Resource resource) {
List<FieldOp> fieldOps = resource.supportedField().stream()
.map(Resource::toFieldInfo)
.toList();
return new ResourceField(resource.name().toLowerCase(), fieldOps);
}
private static FieldOp toFieldInfo(Field field) {
List<Operation> operations = field.supportedOp().stream()
.map(Resource::toOperation)
@@ -267,6 +279,9 @@ public record QueryFilter(
}
}
public record ResourceField(String name, List<FieldOp> fields) {
}
public record FieldOp(String name, String value, List<Operation> operations) {
}

View File

@@ -17,12 +17,31 @@ import java.util.List;
@Introspected
public class ExecutionUsage {
private final List<DailyExecutionStatistics> dailyExecutionsCount;
private final List<DailyExecutionStatistics> dailyTaskRunsCount;
public static ExecutionUsage of(final String tenantId,
final ExecutionRepositoryInterface executionRepository,
final ZonedDateTime from,
final ZonedDateTime to) {
List<DailyExecutionStatistics> dailyTaskRunsCount = null;
try {
dailyTaskRunsCount = executionRepository.dailyStatistics(
null,
tenantId,
null,
null,
null,
from,
to,
DateUtils.GroupType.DAY,
null,
true);
} catch (UnsupportedOperationException ignored) {
}
return ExecutionUsage.builder()
.dailyExecutionsCount(executionRepository.dailyStatistics(
null,
@@ -33,13 +52,28 @@ public class ExecutionUsage {
from,
to,
DateUtils.GroupType.DAY,
null))
null,
false))
.dailyTaskRunsCount(dailyTaskRunsCount)
.build();
}
public static ExecutionUsage of(final ExecutionRepositoryInterface repository,
final ZonedDateTime from,
final ZonedDateTime to) {
List<DailyExecutionStatistics> dailyTaskRunsCount = null;
try {
dailyTaskRunsCount = repository.dailyStatisticsForAllTenants(
null,
null,
null,
from,
to,
DateUtils.GroupType.DAY,
true
);
} catch (UnsupportedOperationException ignored) {}
return ExecutionUsage.builder()
.dailyExecutionsCount(repository.dailyStatisticsForAllTenants(
null,
@@ -47,8 +81,10 @@ public class ExecutionUsage {
null,
from,
to,
DateUtils.GroupType.DAY
DateUtils.GroupType.DAY,
false
))
.dailyTaskRunsCount(dailyTaskRunsCount)
.build();
}
}

View File

@@ -865,18 +865,20 @@ public class Execution implements DeletedInterface, TenantInterface {
* @param e the exception raise
* @return new taskRun with updated attempt with logs
*/
private FailedTaskRunWithLog lastAttemptsTaskRunForFailedExecution(TaskRun taskRun, TaskRunAttempt lastAttempt, Exception e) {
TaskRun failed = taskRun
.withAttempts(
Stream
.concat(
taskRun.getAttempts().stream().limit(taskRun.getAttempts().size() - 1),
Stream.of(lastAttempt.getState().isFailed() ? lastAttempt : lastAttempt.withState(State.Type.FAILED))
)
.toList()
);
private FailedTaskRunWithLog lastAttemptsTaskRunForFailedExecution(TaskRun taskRun,
TaskRunAttempt lastAttempt, Exception e) {
return new FailedTaskRunWithLog(
failed.getState().isFailed() ? failed : failed.withState(State.Type.FAILED),
taskRun
.withAttempts(
Stream
.concat(
taskRun.getAttempts().stream().limit(taskRun.getAttempts().size() - 1),
Stream.of(lastAttempt
.withState(State.Type.FAILED))
)
.toList()
)
.withState(State.Type.FAILED),
RunContextLogger.logEntries(loggingEventFromException(e), LogEntry.of(taskRun, kind))
);
}

View File

@@ -62,7 +62,6 @@ public abstract class AbstractFlow implements FlowInterface {
@JsonSerialize(using = ListOrMapOfLabelSerializer.class)
@JsonDeserialize(using = ListOrMapOfLabelDeserializer.class)
@Schema(implementation = Object.class, oneOf = {List.class, Map.class})
@Valid
List<Label> labels;
@Schema(additionalProperties = Schema.AdditionalPropertiesValue.TRUE)

View File

@@ -56,7 +56,7 @@ public class DefaultPluginRegistry implements PluginRegistry {
*
* @return the {@link DefaultPluginRegistry}.
*/
public synchronized static DefaultPluginRegistry getOrCreate() {
public static DefaultPluginRegistry getOrCreate() {
DefaultPluginRegistry instance = LazyHolder.INSTANCE;
if (!instance.isInitialized()) {
instance.init();
@@ -74,7 +74,7 @@ public class DefaultPluginRegistry implements PluginRegistry {
/**
* Initializes the registry by loading all core plugins.
*/
protected synchronized void init() {
protected void init() {
if (initialized.compareAndSet(false, true)) {
register(scanner.scan());
}
@@ -200,7 +200,7 @@ public class DefaultPluginRegistry implements PluginRegistry {
if (existing != null && existing.crc32() == plugin.crc32()) {
return; // same plugin already registered
}
lock.lock();
try {
if (existing != null) {
@@ -212,7 +212,7 @@ public class DefaultPluginRegistry implements PluginRegistry {
lock.unlock();
}
}
protected void registerAll(Map<PluginIdentifier, PluginClassAndMetadata<? extends Plugin>> plugins) {
pluginClassByIdentifier.putAll(plugins);
}

View File

@@ -144,7 +144,7 @@ public final class PluginDeserializer<T extends Plugin> extends JsonDeserializer
static String extractPluginRawIdentifier(final JsonNode node, final boolean isVersioningSupported) {
String type = Optional.ofNullable(node.get(TYPE)).map(JsonNode::textValue).orElse(null);
String version = Optional.ofNullable(node.get(VERSION)).map(JsonNode::asText).orElse(null);
String version = Optional.ofNullable(node.get(VERSION)).map(JsonNode::textValue).orElse(null);
if (type == null || type.isEmpty()) {
return null;
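
The hunk above swaps between JsonNode.asText() and JsonNode.textValue() when reading the version field; a small Jackson check of the difference that swap hinges on: asText() stringifies non-textual nodes, while textValue() returns null unless the node is actually textual.

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

class VersionNodeExample {
    public static void main(String[] args) throws Exception {
        JsonNode node = new ObjectMapper()
            .readTree("{\"type\":\"io.kestra.plugin.core.log.Log\",\"version\":1.1}");
        System.out.println(node.get("version").asText());    // "1.1"
        System.out.println(node.get("version").textValue()); // null, because the node is numeric
    }
}
```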

View File

@@ -25,6 +25,8 @@ import java.util.Optional;
import java.util.function.Function;
public interface ExecutionRepositoryInterface extends SaveRepositoryInterface<Execution>, QueryBuilderInterface<Executions.Fields> {
Boolean isTaskRunEnabled();
default Optional<Execution> findById(String tenantId, String id) {
return findById(tenantId, id, false);
}
@@ -94,6 +96,12 @@ public interface ExecutionRepositoryInterface extends SaveRepositoryInterface<Ex
Flux<Execution> findAllAsync(@Nullable String tenantId);
ArrayListTotal<TaskRun> findTaskRun(
Pageable pageable,
@Nullable String tenantId,
List<QueryFilter> filters
);
Execution delete(Execution execution);
Integer purge(Execution execution);
@@ -104,7 +112,8 @@ public interface ExecutionRepositoryInterface extends SaveRepositoryInterface<Ex
@Nullable String flowId,
@Nullable ZonedDateTime startDate,
@Nullable ZonedDateTime endDate,
@Nullable DateUtils.GroupType groupBy
@Nullable DateUtils.GroupType groupBy,
boolean isTaskRun
);
List<DailyExecutionStatistics> dailyStatistics(
@@ -116,7 +125,8 @@ public interface ExecutionRepositoryInterface extends SaveRepositoryInterface<Ex
@Nullable ZonedDateTime startDate,
@Nullable ZonedDateTime endDate,
@Nullable DateUtils.GroupType groupBy,
List<State.Type> state
List<State.Type> state,
boolean isTaskRun
);
@Getter

View File

@@ -83,9 +83,7 @@ public class LocalFlowRepositoryLoader {
}
public void load(String tenantId, File basePath) throws IOException {
Map<String, FlowInterface> flowByUidInRepository = flowRepository.findAllForAllTenants()
.stream()
.filter(flow -> tenantId.equals(flow.getTenantId()))
Map<String, FlowInterface> flowByUidInRepository = flowRepository.findAllForAllTenants().stream()
.collect(Collectors.toMap(FlowId::uidWithoutRevision, Function.identity()));
try (Stream<Path> pathStream = Files.walk(basePath.toPath())) {

View File

@@ -5,7 +5,10 @@ import io.kestra.core.exceptions.IllegalVariableEvaluationException;
import io.kestra.core.exceptions.InternalException;
import io.kestra.core.models.Label;
import io.kestra.core.models.executions.*;
import io.kestra.core.models.flows.*;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.FlowInterface;
import io.kestra.core.models.flows.FlowWithException;
import io.kestra.core.models.flows.State;
import io.kestra.core.models.property.Property;
import io.kestra.core.models.tasks.ExecutableTask;
import io.kestra.core.models.tasks.Task;
@@ -26,7 +29,6 @@ import org.apache.commons.lang3.stream.Streams;
import java.time.Instant;
import java.time.ZonedDateTime;
import java.util.*;
import java.util.stream.Collectors;
import static io.kestra.core.trace.Tracer.throwCallable;
import static io.kestra.core.utils.Rethrow.throwConsumer;
@@ -151,24 +153,17 @@ public final class ExecutableUtils {
currentFlow.getNamespace(),
currentFlow.getId()
)
.orElseThrow(() -> {
String msg = "Unable to find flow '" + subflowNamespace + "'.'" + subflowId + "' with revision '" + subflowRevision.orElse(0) + "'";
runContext.logger().error(msg);
return new IllegalStateException(msg);
});
.orElseThrow(() -> new IllegalStateException("Unable to find flow '" + subflowNamespace + "'.'" + subflowId + "' with revision '" + subflowRevision.orElse(0) + "'"));
if (flow.isDisabled()) {
String msg = "Cannot execute a flow which is disabled";
runContext.logger().error(msg);
throw new IllegalStateException(msg);
throw new IllegalStateException("Cannot execute a flow which is disabled");
}
if (flow instanceof FlowWithException fwe) {
String msg = "Cannot execute an invalid flow: " + fwe.getException();
runContext.logger().error(msg);
throw new IllegalStateException(msg);
throw new IllegalStateException("Cannot execute an invalid flow: " + fwe.getException());
}
List<Label> newLabels = inheritLabels ? new ArrayList<>(filterLabels(currentExecution.getLabels(), flow)) : new ArrayList<>(systemLabels(currentExecution));
List<Label> newLabels = inheritLabels ? new ArrayList<>(filterLabels(currentExecution.getLabels(), flow)) : new ArrayList<>(systemLabels(currentExecution));
if (labels != null) {
labels.forEach(throwConsumer(label -> newLabels.add(new Label(runContext.render(label.key()), runContext.render(label.value())))));
}
@@ -206,20 +201,7 @@ public final class ExecutableUtils {
.build()
)
.withScheduleDate(scheduleOnDate);
if(execution.getInputs().size()<inputs.size()) {
Map<String,Object>resolvedInputs=execution.getInputs();
for (var inputKey : inputs.keySet()) {
if (!resolvedInputs.containsKey(inputKey)) {
runContext.logger().warn(
"Input {} was provided by parent execution {} for subflow {}.{} but isn't declared at the subflow inputs",
inputKey,
currentExecution.getId(),
currentTask.subflowId().namespace(),
currentTask.subflowId().flowId()
);
}
}
}
// inject the traceparent into the new execution
propagator.ifPresent(pg -> pg.inject(Context.current(), execution, ExecutionTextMapSetter.INSTANCE));

View File

@@ -49,7 +49,15 @@ import java.time.Duration;
import java.time.Instant;
import java.time.LocalDate;
import java.time.LocalTime;
import java.util.*;
import java.util.AbstractMap;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.Optional;
import java.util.function.Function;
import java.util.function.Supplier;
import java.util.regex.Matcher;
@@ -223,19 +231,6 @@ public class FlowInputOutput {
return new AbstractMap.SimpleEntry<>(it.input().getId(), it.value());
})
.collect(HashMap::new, (m,v)-> m.put(v.getKey(), v.getValue()), HashMap::putAll);
if (resolved.size() < data.size()) {
RunContext runContext = runContextFactory.of(flow, execution);
for (var inputKey : data.keySet()) {
if (!resolved.containsKey(inputKey)) {
runContext.logger().warn(
"Input {} was provided for workflow {}.{} but isn't declared in the workflow inputs",
inputKey,
flow.getNamespace(),
flow.getId()
);
}
}
}
return MapUtils.flattenToNestedMap(resolved);
}
@@ -318,15 +313,15 @@ public class FlowInputOutput {
});
resolvable.setInput(input);
Object value = resolvable.get().value();
// resolve default if needed
if (value == null && input.getDefaults() != null) {
value = resolveDefaultValue(input, runContext);
resolvable.isDefault(true);
}
// validate and parse input value
if (value == null) {
if (input.getRequired()) {
@@ -355,7 +350,7 @@ public class FlowInputOutput {
return resolvable.get();
}
public static Object resolveDefaultValue(Input<?> input, PropertyContext renderer) throws IllegalVariableEvaluationException {
return switch (input.getType()) {
case STRING, ENUM, SELECT, SECRET, EMAIL -> resolveDefaultPropertyAs(input, renderer, String.class);
@@ -372,7 +367,7 @@ public class FlowInputOutput {
case MULTISELECT -> resolveDefaultPropertyAsList(input, renderer, String.class);
};
}
@SuppressWarnings("unchecked")
private static <T> Object resolveDefaultPropertyAs(Input<?> input, PropertyContext renderer, Class<T> clazz) throws IllegalVariableEvaluationException {
return Property.as((Property<T>) input.getDefaults(), renderer, clazz);
@@ -381,7 +376,7 @@ public class FlowInputOutput {
private static <T> Object resolveDefaultPropertyAsList(Input<?> input, PropertyContext renderer, Class<T> clazz) throws IllegalVariableEvaluationException {
return Property.asList((Property<List<T>>) input.getDefaults(), renderer, clazz);
}
private RunContext buildRunContextForExecutionAndInputs(final FlowInterface flow, final Execution execution, Map<String, InputAndValue> dependencies) {
Map<String, Object> flattenInputs = MapUtils.flattenToNestedMap(dependencies.entrySet()
.stream()
@@ -458,7 +453,7 @@ public class FlowInputOutput {
if (data.getType() == null) {
return Optional.of(new AbstractMap.SimpleEntry<>(data.getId(), current));
}
final Type elementType = data instanceof ItemTypeInterface itemTypeInterface ? itemTypeInterface.getItemType() : null;
return Optional.of(new AbstractMap.SimpleEntry<>(
@@ -535,17 +530,17 @@ public class FlowInputOutput {
throw new Exception("Expected `" + type + "` but received `" + current + "` with errors:\n```\n" + e.getMessage() + "\n```");
}
}
public static Map<String, Object> renderFlowOutputs(List<Output> outputs, RunContext runContext) throws IllegalVariableEvaluationException {
if (outputs == null) return Map.of();
// render required outputs
Map<String, Object> outputsById = outputs
.stream()
.filter(output -> output.getRequired() == null || output.getRequired())
.collect(HashMap::new, (map, entry) -> map.put(entry.getId(), entry.getValue()), Map::putAll);
outputsById = runContext.render(outputsById);
// render optional outputs one by one to catch, log, and skip any error.
for (io.kestra.core.models.flows.Output output : outputs) {
if (Boolean.FALSE.equals(output.getRequired())) {
@@ -588,9 +583,9 @@ public class FlowInputOutput {
}
public void isDefault(boolean isDefault) {
this.input = new InputAndValue(this.input.input(), this.input.value(), this.input.enabled(), isDefault, this.input.exception());
this.input = new InputAndValue(this.input.input(), this.input.value(), this.input.enabled(), isDefault, this.input.exception());
}
public void setInput(final Input<?> input) {
this.input = new InputAndValue(input, this.input.value(), this.input.enabled(), this.input.isDefault(), this.input.exception());
}

View File

@@ -500,7 +500,7 @@ public class FlowableUtils {
ArrayList<ResolvedTask> result = new ArrayList<>();
int iteration = 0;
int index = 0;
for (Object current : distinctValue) {
try {
String resolvedValue = current instanceof String stringValue ? stringValue : MAPPER.writeValueAsString(current);
@@ -508,7 +508,7 @@ public class FlowableUtils {
result.add(ResolvedTask.builder()
.task(task)
.value(resolvedValue)
.iteration(iteration)
.iteration(index++)
.parentId(parentTaskRun.getId())
.build()
);
@@ -516,7 +516,6 @@ public class FlowableUtils {
} catch (JsonProcessingException e) {
throw new IllegalVariableEvaluationException(e);
}
iteration++;
}
return result;

View File

@@ -168,7 +168,6 @@ public class Extension extends AbstractExtension {
functions.put("randomPort", new RandomPortFunction());
functions.put("fileExists", fileExistsFunction);
functions.put("isFileEmpty", isFileEmptyFunction);
functions.put("nanoId", new NanoIDFunction());
functions.put("tasksWithState", new TasksWithStateFunction());
functions.put(HttpFunction.NAME, httpFunction);
return functions;

View File

@@ -30,6 +30,6 @@ public class TimestampMicroFilter extends AbstractDate implements Filter {
ZoneId zoneId = zoneId(timeZone);
ZonedDateTime date = convert(input, zoneId, existingFormat);
return String.valueOf(TimeUnit.SECONDS.toMicros(date.toEpochSecond()) + TimeUnit.NANOSECONDS.toMicros(date.getNano()));
return String.valueOf(TimeUnit.SECONDS.toNanos(date.toEpochSecond()) + TimeUnit.NANOSECONDS.toMicros(date.getNano()));
}
}
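
A worked check of the microsecond arithmetic in this hunk, assuming the filter should return epoch microseconds: the seconds part contributes seconds * 1,000,000 and the nano-of-second part contributes nanos / 1,000.

```java
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.util.concurrent.TimeUnit;

class TimestampMicroExample {
    public static void main(String[] args) {
        ZonedDateTime date = ZonedDateTime.of(2025, 9, 30, 0, 0, 0, 123_456_789, ZoneOffset.UTC);
        long micros = TimeUnit.SECONDS.toMicros(date.toEpochSecond())
            + TimeUnit.NANOSECONDS.toMicros(date.getNano());
        System.out.println(micros); // ends in 123456 — microsecond precision, not nanosecond
    }
}
```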

View File

@@ -5,8 +5,6 @@ import io.kestra.core.http.HttpRequest;
import io.kestra.core.http.HttpResponse;
import io.kestra.core.http.client.HttpClient;
import io.kestra.core.http.client.HttpClientException;
import io.kestra.core.http.client.HttpClientRequestException;
import io.kestra.core.http.client.HttpClientResponseException;
import io.kestra.core.http.client.configurations.HttpConfiguration;
import io.kestra.core.runners.RunContext;
import io.kestra.core.runners.RunContextFactory;
@@ -103,15 +101,8 @@ public class HttpFunction<T> implements Function {
try (HttpClient httpClient = new HttpClient(runContext, httpConfiguration)) {
HttpResponse<Object> response = httpClient.request(httpRequest, Object.class);
return response.getBody();
} catch (HttpClientResponseException e) {
if (e.getResponse() != null) {
String msg = "Failed to execute HTTP Request, server respond with status " + e.getResponse().getStatus().getCode() + " : " + e.getResponse().getStatus().getReason();
throw new PebbleException(e, msg , lineNumber, self.getName());
} else {
throw new PebbleException( e, "Failed to execute HTTP request ", lineNumber, self.getName());
}
} catch(HttpClientException | IllegalVariableEvaluationException | IOException e ) {
throw new PebbleException( e, "Failed to execute HTTP request ", lineNumber, self.getName());
} catch (HttpClientException | IllegalVariableEvaluationException | IOException e) {
throw new PebbleException(e, "Unable to execute HTTP request", lineNumber, self.getName());
}
}

View File

@@ -1,66 +0,0 @@
package io.kestra.core.runners.pebble.functions;
import io.pebbletemplates.pebble.error.PebbleException;
import io.pebbletemplates.pebble.extension.Function;
import io.pebbletemplates.pebble.template.EvaluationContext;
import io.pebbletemplates.pebble.template.PebbleTemplate;
import java.security.SecureRandom;
import java.util.List;
import java.util.Map;
public class NanoIDFunction implements Function {
private static final int DEFAULT_LENGTH = 21;
private static final char[] DEFAULT_ALPHABET = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz-_".toCharArray();
private static final SecureRandom secureRandom = new SecureRandom();
private static final String LENGTH = "length";
private static final String ALPHABET = "alphabet";
private static final int MAX_LENGTH = 1000;
@Override
public Object execute(
Map<String, Object> args, PebbleTemplate self, EvaluationContext context, int lineNumber) {
int length = DEFAULT_LENGTH;
if (args.containsKey(LENGTH) && (args.get(LENGTH) instanceof Long)) {
length = parseLength(args, self, lineNumber);
}
char[] alphabet = DEFAULT_ALPHABET;
if (args.containsKey(ALPHABET) && (args.get(ALPHABET) instanceof String)) {
alphabet = ((String) args.get(ALPHABET)).toCharArray();
}
return createNanoID(length, alphabet);
}
private static int parseLength(Map<String, Object> args, PebbleTemplate self, int lineNumber) {
var value = (Long) args.get(LENGTH);
if(value > MAX_LENGTH) {
throw new PebbleException(
null,
"The 'nanoId()' function field 'length' must be lower than: " + MAX_LENGTH,
lineNumber,
self.getName());
}
return Math.toIntExact(value);
}
@Override
public List<String> getArgumentNames() {
return List.of(LENGTH,ALPHABET);
}
String createNanoID(int length, char[] alphabet){
final char[] data = new char[length];
final byte[] bytes = new byte[length];
final int mask = alphabet.length-1;
secureRandom.nextBytes(bytes);
for (int i = 0; i < length; ++i) {
data[i] = alphabet[bytes[i] & mask];
}
return String.valueOf(data);
}
}

View File

@@ -180,13 +180,23 @@ public final class FileSerde {
}
private static <T> MappingIterator<T> createMappingIterator(ObjectMapper objectMapper, Reader reader, TypeReference<T> type) throws IOException {
// See https://github.com/FasterXML/jackson-dataformats-binary/issues/493
// There is a limitation with the MappingIterator that cannot differentiate between an array of things (of whatever shape)
// and a sequence/stream of things (of Array shape).
// To work around that, we need to create a JsonParser and advance to the first token.
try (var parser = objectMapper.createParser(reader)) {
parser.nextToken();
return objectMapper.readerFor(type).readValues(parser);
}
}
private static <T> MappingIterator<T> createMappingIterator(ObjectMapper objectMapper, Reader reader, Class<T> type) throws IOException {
// See https://github.com/FasterXML/jackson-dataformats-binary/issues/493
// There is a limitation with the MappingIterator that cannot differentiate between an array of things (of whatever shape)
// and a sequence/stream of things (of Array shape).
// To work around that, we need to create a JsonParser and advance to the first token.
try (var parser = objectMapper.createParser(reader)) {
parser.nextToken();
return objectMapper.readerFor(type).readValues(parser);
}
}
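
A generic Jackson illustration of the workaround described in the comments above: create a JsonParser yourself and advance it to the first token before handing it to readValues(), so the iterator starts from a well-defined position for a root-level sequence of values. This uses plain JSON rather than Kestra's Ion-based FileSerde.

```java
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.MappingIterator;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.StringReader;

class ParserAdvanceExample {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        String stream = "{\"id\":1}\n{\"id\":2}\n{\"id\":3}"; // a root-level sequence of values
        try (JsonParser parser = mapper.createParser(new StringReader(stream))) {
            parser.nextToken(); // same workaround as above: position the parser on the first token
            MappingIterator<JsonNode> it = mapper.readValues(parser, JsonNode.class);
            while (it.hasNext()) {
                System.out.println(it.next()); // {"id":1}, {"id":2}, {"id":3}
            }
        }
    }
}
```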

View File

@@ -32,84 +32,48 @@ public class Version implements Comparable<Version> {
* @param version the version.
* @return a new {@link Version} instance.
*/
public static Version of(final Object version) {
public static Version of(String version) {
if (Objects.isNull(version)) {
throw new IllegalArgumentException("Invalid version, cannot parse null version");
}
String strVersion = version.toString();
if (strVersion.startsWith("v")) {
strVersion = strVersion.substring(1);
if (version.startsWith("v")) {
version = version.substring(1);
}
int qualifier = strVersion.indexOf("-");
int qualifier = version.indexOf("-");
final String[] versions = qualifier > 0 ?
strVersion.substring(0, qualifier).split("\\.") :
strVersion.split("\\.");
version.substring(0, qualifier).split("\\.") :
version.split("\\.");
try {
final int majorVersion = Integer.parseInt(versions[0]);
final Integer minorVersion = versions.length > 1 ? Integer.parseInt(versions[1]) : null;
final Integer incrementalVersion = versions.length > 2 ? Integer.parseInt(versions[2]) : null;
final int minorVersion = versions.length > 1 ? Integer.parseInt(versions[1]) : 0;
final int incrementalVersion = versions.length > 2 ? Integer.parseInt(versions[2]) : 0;
return new Version(
majorVersion,
minorVersion,
incrementalVersion,
qualifier > 0 ? strVersion.substring(qualifier + 1) : null,
strVersion
qualifier > 0 ? version.substring(qualifier + 1) : null,
version
);
} catch (NumberFormatException e) {
throw new IllegalArgumentException("Invalid version, cannot parse '" + version + "'");
}
}
/**
* Resolves the most appropriate stable version from a collection, based on a given input version.
* <p>
* The matching rules are:
* <ul>
* <li>If {@code from} specifies only a major version (e.g. {@code 1}), return the latest stable version
* with the same major (e.g. {@code 1.2.3}).</li>
* <li>If {@code from} specifies a major and minor version only (e.g. {@code 1.2}), return the latest
* stable version with the same major and minor (e.g. {@code 1.2.3}).</li>
* <li>If {@code from} specifies a full version with major, minor, and patch (e.g. {@code 1.2.2}),
* then only return it if it is exactly present (and stable) in {@code versions}.
* No "upgrade" is performed in this case.</li>
* <li>If no suitable version is found, returns {@code null}.</li>
* </ul>
* Static helper method for returning the most recent stable version for a current {@link Version}.
*
* @param from the reference version (may specify only major, or major+minor, or major+minor+patch).
* @param versions the collection of candidate versions to resolve against.
* @return the best matching stable version, or {@code null} if none match.
* @param from the current version.
* @param versions the list of version.
*
* @return the last stable version.
*/
public static Version getStable(final Version from, final Collection<Version> versions) {
// Case 1: "from" is only a major (e.g. 1)
if (from.hasOnlyMajor()) {
List<Version> sameMajor = versions.stream()
.filter(v -> v.majorVersion() == from.majorVersion())
.toList();
return sameMajor.isEmpty() ? null : Version.getLatest(sameMajor);
}
// Case 2: "from" is major+minor only (e.g. 1.2)
if (from.hasMajorAndMinorOnly()) {
List<Version> sameMinor = versions.stream()
.filter(v -> v.majorVersion() == from.majorVersion()
&& v.minorVersion() == from.minorVersion())
.toList();
return sameMinor.isEmpty() ? null : Version.getLatest(sameMinor);
}
// Case 3: "from" is full version (major+minor+patch)
if (versions.contains(from)) {
return from;
}
// No match
return null;
List<Version> compatibleVersions = versions.stream()
.filter(v -> v.majorVersion() == from.majorVersion() && v.minorVersion() == from.minorVersion())
.toList();
if (compatibleVersions.isEmpty()) return null;
return Version.getLatest(compatibleVersions);
}
/**
@@ -159,8 +123,8 @@ public class Version implements Comparable<Version> {
}
private final int majorVersion;
private final Integer minorVersion;
private final Integer patchVersion;
private final int minorVersion;
private final int incrementalVersion;
private final Qualifier qualifier;
private final String originalVersion;
@@ -170,14 +134,14 @@ public class Version implements Comparable<Version> {
*
* @param majorVersion the major version (must be superior or equal to 0).
* @param minorVersion the minor version (must be superior or equal to 0).
* @param patchVersion the incremental version (must be superior or equal to 0).
* @param incrementalVersion the incremental version (must be superior or equal to 0).
* @param qualifier the qualifier.
*/
public Version(final int majorVersion,
final int minorVersion,
final int patchVersion,
final int incrementalVersion,
final String qualifier) {
this(majorVersion, minorVersion, patchVersion, qualifier, null);
this(majorVersion, minorVersion, incrementalVersion, qualifier, null);
}
/**
@@ -185,25 +149,25 @@ public class Version implements Comparable<Version> {
*
* @param majorVersion the major version (must be superior or equal to 0).
* @param minorVersion the minor version (must be superior or equal to 0).
* @param patchVersion the incremental version (must be superior or equal to 0).
* @param incrementalVersion the incremental version (must be superior or equal to 0).
* @param qualifier the qualifier.
* @param originalVersion the original string version.
*/
private Version(final Integer majorVersion,
final Integer minorVersion,
final Integer patchVersion,
private Version(final int majorVersion,
final int minorVersion,
final int incrementalVersion,
final String qualifier,
final String originalVersion) {
this.majorVersion = requirePositive(majorVersion, "major");
this.minorVersion = requirePositive(minorVersion, "minor");
this.patchVersion = requirePositive(patchVersion, "incremental");
this.incrementalVersion = requirePositive(incrementalVersion, "incremental");
this.qualifier = qualifier != null ? new Qualifier(qualifier) : null;
this.originalVersion = originalVersion;
}
private static Integer requirePositive(Integer version, final String message) {
if (version != null && version < 0) {
private static int requirePositive(int version, final String message) {
if (version < 0) {
throw new IllegalArgumentException(String.format("The '%s' version must super or equal to 0", message));
}
return version;
@@ -214,11 +178,11 @@ public class Version implements Comparable<Version> {
}
public int minorVersion() {
return minorVersion != null ? minorVersion : 0;
return minorVersion;
}
public int patchVersion() {
return patchVersion != null ? patchVersion : 0;
public int incrementalVersion() {
return incrementalVersion;
}
public Qualifier qualifier() {
@@ -233,9 +197,9 @@ public class Version implements Comparable<Version> {
if (this == o) return true;
if (!(o instanceof Version)) return false;
Version version = (Version) o;
return Objects.equals(majorVersion,version.majorVersion) &&
Objects.equals(minorVersion, version.minorVersion) &&
Objects.equals(patchVersion,version.patchVersion) &&
return majorVersion == version.majorVersion &&
minorVersion == version.minorVersion &&
incrementalVersion == version.incrementalVersion &&
Objects.equals(qualifier, version.qualifier);
}
@@ -244,7 +208,7 @@ public class Version implements Comparable<Version> {
*/
@Override
public int hashCode() {
return Objects.hash(majorVersion, minorVersion, patchVersion, qualifier);
return Objects.hash(majorVersion, minorVersion, incrementalVersion, qualifier);
}
/**
@@ -254,7 +218,7 @@ public class Version implements Comparable<Version> {
public String toString() {
if (originalVersion != null) return originalVersion;
String version = majorVersion + "." + minorVersion + "." + patchVersion;
String version = majorVersion + "." + minorVersion + "." + incrementalVersion;
return (qualifier != null) ? version +"-" + qualifier : version;
}
@@ -274,7 +238,7 @@ public class Version implements Comparable<Version> {
return compareMinor;
}
int compareIncremental = Integer.compare(that.patchVersion, this.patchVersion);
int compareIncremental = Integer.compare(that.incrementalVersion, this.incrementalVersion);
if (compareIncremental != 0) {
return compareIncremental;
}
@@ -289,21 +253,6 @@ public class Version implements Comparable<Version> {
return this.qualifier.compareTo(that.qualifier);
}
/**
* @return true if only major is specified (e.g. "1")
*/
private boolean hasOnlyMajor() {
return minorVersion == null && patchVersion == null;
}
/**
* @return true if major+minor are specified, but no patch (e.g. "1.2")
*/
private boolean hasMajorAndMinorOnly() {
return minorVersion != null && patchVersion == null;
}
/**
* Checks whether this version is before the given one.
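
A usage sketch of the matching rules spelled out in the getStable javadoc above. It applies to the variant that documents the major-only and major-plus-minor rules, and it assumes the Version class from this file is importable (its package is not shown in the hunk).

```java
import java.util.List;

class VersionResolutionExample {
    public static void main(String[] args) {
        List<Version> available = List.of(Version.of("1.0.3"), Version.of("1.1.2"), Version.of("1.2.3"));

        // Per the documented rules:
        Version.getStable(Version.of("1"), available);     // major only   -> latest 1.x.y, i.e. 1.2.3
        Version.getStable(Version.of("1.1"), available);   // major+minor  -> latest 1.1.y, i.e. 1.1.2
        Version.getStable(Version.of("1.2.2"), available); // full version -> null, 1.2.2 is not present
    }
}
```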

View File

@@ -46,19 +46,16 @@ public class VersionProvider {
this.date = loadTime(gitProperties);
this.version = loadVersion(buildProperties, gitProperties);
// check the version in the settings and update if needed, we didn't use it would allow us to detect incompatible update later if needed
settingRepository.ifPresent(
settingRepositoryInterface -> persistVersion(settingRepositoryInterface, version));
}
private static synchronized void persistVersion(SettingRepositoryInterface settingRepositoryInterface, String version) {
Optional<Setting> versionSetting = settingRepositoryInterface.findByKey(Setting.INSTANCE_VERSION);
if (versionSetting.isEmpty() || !versionSetting.get().getValue().equals(version)) {
settingRepositoryInterface.save(Setting.builder()
.key(Setting.INSTANCE_VERSION)
.value(version)
.build()
);
// check the version in the settings and update if needed, we did't use it would allow us to detect incompatible update later if needed
if (settingRepository.isPresent()) {
Optional<Setting> versionSetting = settingRepository.get().findByKey(Setting.INSTANCE_VERSION);
if (versionSetting.isEmpty() || !versionSetting.get().getValue().equals(this.version)) {
settingRepository.get().save(Setting.builder()
.key(Setting.INSTANCE_VERSION)
.value(this.version)
.build()
);
}
}
}
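Both sides of the VersionProvider hunk implement the same idempotent write: look up the stored INSTANCE_VERSION setting and save the running version only when the setting is missing or differs (one side extracts this into a persistVersion helper, the other keeps it inline). A generic sketch of that guard, against a hypothetical key/value store rather than Kestra's SettingRepositoryInterface:

import java.util.Map;
import java.util.Objects;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical key/value settings store used only for this illustration.
interface SettingsStore {
    Optional<String> find(String key);
    void save(String key, String value);
}

final class VersionPersistenceSketch {
    // Hypothetical key name; the real one lives in Kestra's Setting.INSTANCE_VERSION.
    static final String INSTANCE_VERSION_KEY = "kestra.instance.version";

    // Save the running version only when it is absent or has changed, mirroring the guard above.
    static void persistVersionIfChanged(SettingsStore store, String runningVersion) {
        Optional<String> stored = store.find(INSTANCE_VERSION_KEY);
        if (stored.isEmpty() || !Objects.equals(stored.get(), runningVersion)) {
            store.save(INSTANCE_VERSION_KEY, runningVersion);
        }
    }

    public static void main(String[] args) {
        Map<String, String> backing = new ConcurrentHashMap<>();
        SettingsStore store = new SettingsStore() {
            @Override public Optional<String> find(String key) { return Optional.ofNullable(backing.get(key)); }
            @Override public void save(String key, String value) { backing.put(key, value); }
        };

        persistVersionIfChanged(store, "1.0.3"); // first run: value is written
        persistVersionIfChanged(store, "1.0.3"); // unchanged: nothing to rewrite
        System.out.println(backing);             // {kestra.instance.version=1.0.3}
    }
}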

View File

@@ -127,24 +127,9 @@ public class Labels extends Task implements ExecutionUpdatableTask {
}
// check for system labels: none can be passed at runtime
Optional<Map.Entry<String, String>> systemLabel = labelsAsMap.entrySet().stream()
.filter(entry -> entry.getKey().startsWith(SYSTEM_PREFIX))
.findFirst();
if (systemLabel.isPresent()) {
throw new IllegalArgumentException(
"System labels can only be set by Kestra itself, offending label: " +
systemLabel.get().getKey() + "=" + systemLabel.get().getValue()
);
}
// check for empty label values
Optional<Map.Entry<String, String>> emptyValue = labelsAsMap.entrySet().stream()
.filter(entry -> entry.getValue().isEmpty())
.findFirst();
if (emptyValue.isPresent()) {
throw new IllegalArgumentException(
"Label values cannot be empty, offending label: " + emptyValue.get().getKey()
);
Optional<Map.Entry<String, String>> first = labelsAsMap.entrySet().stream().filter(entry -> entry.getKey().startsWith(SYSTEM_PREFIX)).findFirst();
if (first.isPresent()) {
throw new IllegalArgumentException("System labels can only be set by Kestra itself, offending label: " + first.get().getKey() + "=" + first.get().getValue());
}
Map<String, String> newLabels = ListUtils.emptyOnNull(execution.getLabels()).stream()
@@ -155,7 +140,6 @@ public class Labels extends Task implements ExecutionUpdatableTask {
newLabels.putAll(labelsAsMap);
return execution.withLabels(newLabels.entrySet().stream()
.filter(Label.getEntryNotEmptyPredicate())
.map(entry -> new Label(
entry.getKey(),
entry.getValue()
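In the Labels hunk, both variants reject any runtime-supplied label whose key carries the reserved system prefix; one variant additionally rejects empty values up front, while the other relies on the later Label.getEntryNotEmptyPredicate() filter. A compact standalone sketch of those checks over a plain map (the prefix value and class names here are hypothetical, not Kestra's):

import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Optional;

final class RuntimeLabelValidationSketch {
    // Hypothetical prefix value; the real constant lives in Kestra's Label class.
    private static final String SYSTEM_PREFIX = "system.";

    // Reject system-prefixed keys and empty values, the two checks visible in the hunk above.
    static void validateRuntimeLabels(Map<String, String> labels) {
        Optional<Map.Entry<String, String>> systemLabel = labels.entrySet().stream()
            .filter(entry -> entry.getKey().startsWith(SYSTEM_PREFIX))
            .findFirst();
        if (systemLabel.isPresent()) {
            throw new IllegalArgumentException(
                "System labels can only be set by Kestra itself, offending label: "
                    + systemLabel.get().getKey() + "=" + systemLabel.get().getValue());
        }

        Optional<Map.Entry<String, String>> emptyValue = labels.entrySet().stream()
            .filter(entry -> entry.getValue().isEmpty())
            .findFirst();
        if (emptyValue.isPresent()) {
            throw new IllegalArgumentException(
                "Label values cannot be empty, offending label: " + emptyValue.get().getKey());
        }
    }

    public static void main(String[] args) {
        Map<String, String> ok = new LinkedHashMap<>();
        ok.put("team", "data");
        validateRuntimeLabels(ok); // passes silently

        Map<String, String> bad = new LinkedHashMap<>();
        bad.put(SYSTEM_PREFIX + "correlationId", "abc");
        try {
            validateRuntimeLabels(bad);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}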

View File

@@ -21,13 +21,10 @@ import java.nio.file.Paths;
import java.util.Arrays;
import java.util.List;
import java.util.Objects;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
@Execution(ExecutionMode.SAME_THREAD)
class DocumentationGeneratorTest {
@Inject
JsonSchemaGenerator jsonSchemaGenerator;

View File

@@ -37,7 +37,6 @@ import lombok.Value;
import org.apache.commons.io.IOUtils;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.parallel.ExecutionMode;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.Arguments;
import org.junit.jupiter.params.provider.MethodSource;
@@ -68,7 +67,6 @@ import static org.junit.jupiter.api.Assertions.assertThrows;
@KestraTest
@Testcontainers
@org.junit.jupiter.api.parallel.Execution(ExecutionMode.SAME_THREAD)
class HttpClientTest {
@Inject
private ApplicationContext applicationContext;

View File

@@ -1,32 +1,19 @@
package io.kestra.core.models;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.validations.ModelValidator;
import jakarta.inject.Inject;
import jakarta.validation.ConstraintViolationException;
import org.junit.jupiter.api.Test;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
class LabelTest {
@Inject
private ModelValidator modelValidator;
@Test
void shouldGetNestedMapGivenDistinctLabels() {
Map<String, Object> result = Label.toNestedMap(List.of(
new Label(Label.USERNAME, "test"),
new Label(Label.CORRELATION_ID, "id"),
new Label("", "bar"),
new Label(null, "bar"),
new Label("foo", ""),
new Label("baz", null)
)
new Label(Label.CORRELATION_ID, "id"))
);
assertThat(result).isEqualTo(
@@ -47,18 +34,6 @@ class LabelTest {
);
}
@Test
void toNestedMapShouldIgnoreEmptyOrNull() {
Map<String, Object> result = Label.toNestedMap(List.of(
new Label("", "bar"),
new Label(null, "bar"),
new Label("foo", ""),
new Label("baz", null))
);
assertThat(result).isEmpty();
}
@Test
void shouldGetMapGivenDistinctLabels() {
Map<String, String> result = Label.toMap(List.of(
@@ -84,18 +59,6 @@ class LabelTest {
);
}
@Test
void toMapShouldIgnoreEmptyOrNull() {
Map<String, String> result = Label.toMap(List.of(
new Label("", "bar"),
new Label(null, "bar"),
new Label("foo", ""),
new Label("baz", null))
);
assertThat(result).isEmpty();
}
@Test
void shouldDuplicateLabelsWithKeyOrderKept() {
List<Label> result = Label.deduplicate(List.of(
@@ -110,28 +73,4 @@ class LabelTest {
new Label(Label.CORRELATION_ID, "id")
);
}
@Test
void deduplicateShouldIgnoreEmptyAndNull() {
List<Label> result = Label.deduplicate(List.of(
new Label("", "bar"),
new Label(null, "bar"),
new Label("foo", ""),
new Label("baz", null))
);
assertThat(result).isEmpty();
}
@Test
void shouldValidateEmpty() {
Optional<ConstraintViolationException> validLabelResult = modelValidator.isValid(new Label("foo", "bar"));
assertThat(validLabelResult.isPresent()).isFalse();
Optional<ConstraintViolationException> emptyValueLabelResult = modelValidator.isValid(new Label("foo", ""));
assertThat(emptyValueLabelResult.isPresent()).isTrue();
Optional<ConstraintViolationException> emptyKeyLabelResult = modelValidator.isValid(new Label("", "bar"));
assertThat(emptyKeyLabelResult.isPresent()).isTrue();
}
}

View File

@@ -13,19 +13,19 @@ import java.util.Map;
import static org.assertj.core.api.Assertions.assertThat;
class ExecutionTest {
private static final TaskRun.TaskRunBuilder TASK_RUN = TaskRun.builder()
.id("test");
@Test
void hasTaskRunJoinableTrue() {
Execution execution = Execution.builder()
.taskRunList(Collections.singletonList(TaskRun.builder()
.id("test")
.taskRunList(Collections.singletonList(TASK_RUN
.state(new State(State.Type.RUNNING, new State()))
.build())
)
.build();
assertThat(execution.hasTaskRunJoinable(TaskRun.builder()
.id("test")
assertThat(execution.hasTaskRunJoinable(TASK_RUN
.state(new State(State.Type.FAILED, new State()
.withState(State.Type.RUNNING)
))
@@ -36,15 +36,13 @@ class ExecutionTest {
@Test
void hasTaskRunJoinableSameState() {
Execution execution = Execution.builder()
.taskRunList(Collections.singletonList(TaskRun.builder()
.id("test")
.taskRunList(Collections.singletonList(TASK_RUN
.state(new State())
.build())
)
.build();
assertThat(execution.hasTaskRunJoinable(TaskRun.builder()
.id("test")
assertThat(execution.hasTaskRunJoinable(TASK_RUN
.state(new State())
.build()
)).isFalse();
@@ -53,8 +51,7 @@ class ExecutionTest {
@Test
void hasTaskRunJoinableFailedExecutionFromExecutor() {
Execution execution = Execution.builder()
.taskRunList(Collections.singletonList(TaskRun.builder()
.id("test")
.taskRunList(Collections.singletonList(TASK_RUN
.state(new State(State.Type.FAILED, new State()
.withState(State.Type.RUNNING)
))
@@ -62,8 +59,7 @@ class ExecutionTest {
)
.build();
assertThat(execution.hasTaskRunJoinable(TaskRun.builder()
.id("test")
assertThat(execution.hasTaskRunJoinable(TASK_RUN
.state(new State(State.Type.RUNNING, new State()))
.build()
)).isFalse();
@@ -72,8 +68,7 @@ class ExecutionTest {
@Test
void hasTaskRunJoinableRestartFailed() {
Execution execution = Execution.builder()
.taskRunList(Collections.singletonList(TaskRun.builder()
.id("test")
.taskRunList(Collections.singletonList(TASK_RUN
.state(new State(State.Type.CREATED, new State()
.withState(State.Type.RUNNING)
.withState(State.Type.FAILED)
@@ -82,8 +77,7 @@ class ExecutionTest {
)
.build();
assertThat(execution.hasTaskRunJoinable(TaskRun.builder()
.id("test")
assertThat(execution.hasTaskRunJoinable(TASK_RUN
.state(new State(State.Type.FAILED, new State()
.withState(State.Type.RUNNING)
))
@@ -94,8 +88,7 @@ class ExecutionTest {
@Test
void hasTaskRunJoinableRestartSuccess() {
Execution execution = Execution.builder()
.taskRunList(Collections.singletonList(TaskRun.builder()
.id("test")
.taskRunList(Collections.singletonList(TASK_RUN
.state(new State(State.Type.CREATED, new State()
.withState(State.Type.RUNNING)
.withState(State.Type.SUCCESS)
@@ -104,8 +97,7 @@ class ExecutionTest {
)
.build();
assertThat(execution.hasTaskRunJoinable(TaskRun.builder()
.id("test")
assertThat(execution.hasTaskRunJoinable(TASK_RUN
.state(new State(State.Type.SUCCESS, new State()
.withState(State.Type.RUNNING)
.withState(State.Type.SUCCESS)
@@ -117,8 +109,7 @@ class ExecutionTest {
@Test
void hasTaskRunJoinableAfterRestart() {
Execution execution = Execution.builder()
.taskRunList(Collections.singletonList(TaskRun.builder()
.id("test")
.taskRunList(Collections.singletonList(TASK_RUN
.state(new State(State.Type.CREATED, new State()
.withState(State.Type.RUNNING)
.withState(State.Type.FAILED)
@@ -127,8 +118,7 @@ class ExecutionTest {
)
.build();
assertThat(execution.hasTaskRunJoinable(TaskRun.builder()
.id("test")
assertThat(execution.hasTaskRunJoinable(TASK_RUN
.state(new State(State.Type.SUCCESS, new State()
.withState(State.Type.RUNNING)
.withState(State.Type.FAILED)
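One side of the ExecutionTest hunks builds TaskRun.builder().id("test") inline in every test, the other hoists it into a shared, pre-seeded TASK_RUN builder that each test completes with a state before building. A tiny generic illustration of that fixture pattern with a hand-rolled builder (all names hypothetical; sharing a mutable builder like this only stays safe because every test overwrites the remaining fields before calling build()):

final class TaskRunFixtureSketch {
    // Hand-rolled stand-ins for the Lombok-generated builder used in the test above.
    record FakeTaskRun(String id, String state) {}

    static final class FakeTaskRunBuilder {
        private String id;
        private String state;

        FakeTaskRunBuilder id(String id) { this.id = id; return this; }
        FakeTaskRunBuilder state(String state) { this.state = state; return this; }
        FakeTaskRun build() { return new FakeTaskRun(id, state); }
    }

    // Shared, pre-seeded builder: each "test" below only fills in the varying field.
    private static final FakeTaskRunBuilder TASK_RUN = new FakeTaskRunBuilder().id("test");

    public static void main(String[] args) {
        FakeTaskRun running = TASK_RUN.state("RUNNING").build();
        FakeTaskRun failed = TASK_RUN.state("FAILED").build();
        System.out.println(running + " / " + failed);
    }
}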

View File

@@ -7,11 +7,12 @@ import io.kestra.core.junit.annotations.ExecuteFlow;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.junit.annotations.LoadFlows;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.FlowWithSource;
import io.kestra.core.models.triggers.Trigger;
import io.kestra.core.queues.QueueException;
import io.kestra.core.repositories.TriggerRepositoryInterface;
import io.kestra.core.runners.TestRunnerUtils;
import io.kestra.core.runners.RunnerUtils;
import io.kestra.core.serializers.YamlParser;
import io.kestra.core.services.GraphService;
import io.kestra.core.utils.GraphUtils;
@@ -44,7 +45,7 @@ class FlowGraphTest {
private TriggerRepositoryInterface triggerRepositoryInterface;
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
@Test
void simple() throws IllegalVariableEvaluationException, IOException {
@@ -260,10 +261,10 @@ class FlowGraphTest {
}
@Test
@LoadFlows(value = {"flows/valids/task-flow.yaml",
"flows/valids/switch.yaml"}, tenantId = "tenant1")
@LoadFlows({"flows/valids/task-flow.yaml",
"flows/valids/switch.yaml"})
void subflow() throws IllegalVariableEvaluationException, IOException, FlowProcessingException {
FlowWithSource flow = this.parse("flows/valids/task-flow.yaml", "tenant1");
FlowWithSource flow = this.parse("flows/valids/task-flow.yaml");
FlowGraph flowGraph = GraphUtils.flowGraph(flow, null);
assertThat(flowGraph.getNodes().size()).isEqualTo(6);
@@ -292,15 +293,15 @@ class FlowGraphTest {
}
@Test
@LoadFlows(value = {"flows/valids/task-flow-dynamic.yaml",
"flows/valids/switch.yaml"}, tenantId = "tenant2")
@LoadFlows({"flows/valids/task-flow-dynamic.yaml",
"flows/valids/switch.yaml"})
void dynamicIdSubflow() throws IllegalVariableEvaluationException, TimeoutException, QueueException, IOException, FlowProcessingException {
FlowWithSource flow = this.parse("flows/valids/task-flow-dynamic.yaml", "tenant2").toBuilder().revision(1).build();
FlowWithSource flow = this.parse("flows/valids/task-flow-dynamic.yaml").toBuilder().revision(1).build();
IllegalArgumentException illegalArgumentException = Assertions.assertThrows(IllegalArgumentException.class, () -> graphService.flowGraph(flow, Collections.singletonList("root.launch")));
assertThat(illegalArgumentException.getMessage()).isEqualTo("Can't expand subflow task 'launch' because namespace and/or flowId contains dynamic values. This can only be viewed on an execution.");
Execution execution = runnerUtils.runOne("tenant2", "io.kestra.tests", "task-flow-dynamic", 1, (f, e) -> Map.of(
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "task-flow-dynamic", 1, (f, e) -> Map.of(
"namespace", f.getNamespace(),
"flowId", "switch"
));
@@ -372,17 +373,13 @@ class FlowGraphTest {
}
private FlowWithSource parse(String path) throws IOException {
return parse(path, MAIN_TENANT);
}
private FlowWithSource parse(String path, String tenantId) throws IOException {
URL resource = TestsUtils.class.getClassLoader().getResource(path);
assert resource != null;
File file = new File(resource.getFile());
return YamlParser.parse(file, FlowWithSource.class).toBuilder()
.tenantId(tenantId)
.tenantId(MAIN_TENANT)
.source(Files.readString(file.toPath()))
.build();
}

View File

@@ -4,7 +4,6 @@ import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.runners.*;
import io.kestra.core.storages.StorageContext;
import io.kestra.core.storages.StorageInterface;
import io.kestra.core.utils.IdUtils;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;
import org.mockito.Mockito;
@@ -85,9 +84,8 @@ class URIFetcherTest {
@Test
void shouldFetchFromNsfile() throws IOException {
String namespace = IdUtils.create();
URI uri = createNsFile(namespace, false);
RunContext runContext = runContextFactory.of(Map.of("flow", Map.of("namespace", namespace)));
URI uri = createNsFile(false);
RunContext runContext = runContextFactory.of(Map.of("flow", Map.of("namespace", "namespace")));
try (var fetch = URIFetcher.of(uri).fetch(runContext)) {
String fetchedContent = new String(fetch.readAllBytes());
@@ -97,8 +95,7 @@ class URIFetcherTest {
@Test
void shouldFetchFromNsfileFromOtherNs() throws IOException {
String namespace = IdUtils.create();
URI uri = createNsFile(namespace, true);
URI uri = createNsFile(true);
RunContext runContext = runContextFactory.of(Map.of("flow", Map.of("namespace", "other")));
try (var fetch = URIFetcher.of(uri).fetch(runContext)) {
@@ -142,7 +139,8 @@ class URIFetcherTest {
);
}
private URI createNsFile(String namespace, boolean nsInAuthority) throws IOException {
private URI createNsFile(boolean nsInAuthority) throws IOException {
String namespace = "namespace";
String filePath = "file.txt";
storage.createDirectory(MAIN_TENANT, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace)));
storage.put(MAIN_TENANT, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace) + "/" + filePath), new ByteArrayInputStream("Hello World".getBytes()));

View File

@@ -10,7 +10,6 @@ import io.kestra.core.models.tasks.Task;
import io.kestra.core.runners.RunContext;
import io.kestra.core.runners.RunContextFactory;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.utils.IdUtils;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;
@@ -34,15 +33,17 @@ class ScriptServiceTest {
@Test
void replaceInternalStorage() throws IOException {
String tenant = IdUtils.create();
var runContext = runContextFactory.of("id", "namespace", tenant);
var runContext = runContextFactory.of();
var command = ScriptService.replaceInternalStorage(runContext, null, false);
assertThat(command).isEqualTo("");
command = ScriptService.replaceInternalStorage(runContext, "my command", false);
assertThat(command).isEqualTo("my command");
Path path = createFile(tenant, "file");
Path path = Path.of("/tmp/unittest/main/file.txt");
if (!path.toFile().exists()) {
Files.createFile(path);
}
String internalStorageUri = "kestra://some/file.txt";
File localFile = null;
@@ -69,10 +70,12 @@ class ScriptServiceTest {
@Test
void replaceInternalStorageUnicode() throws IOException {
String tenant = IdUtils.create();
var runContext = runContextFactory.of("id", "namespace", tenant);
var runContext = runContextFactory.of();
Path path = createFile(tenant, "file-龍");
Path path = Path.of("/tmp/unittest/main/file-龍.txt");
if (!path.toFile().exists()) {
Files.createFile(path);
}
String internalStorageUri = "kestra://some/file-龍.txt";
File localFile = null;
@@ -92,10 +95,12 @@ class ScriptServiceTest {
@Test
void uploadInputFiles() throws IOException {
String tenant = IdUtils.create();
var runContext = runContextFactory.of("id", "namespace", tenant);
var runContext = runContextFactory.of();
Path path = createFile(tenant, "file");
Path path = Path.of("/tmp/unittest/main/file.txt");
if (!path.toFile().exists()) {
Files.createFile(path);
}
List<File> filesToDelete = new ArrayList<>();
String internalStorageUri = "kestra://some/file.txt";
@@ -138,11 +143,13 @@ class ScriptServiceTest {
@Test
void uploadOutputFiles() throws IOException {
String tenant = IdUtils.create();
var runContext = runContextFactory.of("id", "namespace", tenant);
Path path = createFile(tenant, "file");
var runContext = runContextFactory.of();
Path path = Path.of("/tmp/unittest/main/file.txt");
if (!path.toFile().exists()) {
Files.createFile(path);
}
var outputFiles = ScriptService.uploadOutputFiles(runContext, Path.of("/tmp/unittest/%s".formatted(tenant)));
var outputFiles = ScriptService.uploadOutputFiles(runContext, Path.of("/tmp/unittest/main"));
assertThat(outputFiles, not(anEmptyMap()));
assertThat(outputFiles.get("file.txt")).isEqualTo(URI.create("kestra:///file.txt"));
@@ -225,13 +232,4 @@ class ScriptServiceTest {
.build();
return runContextFactory.of(flow, task, execution, taskRun);
}
private static Path createFile(String tenant, String fileName) throws IOException {
Path path = Path.of("/tmp/unittest/%s/%s.txt".formatted(tenant, fileName));
if (!path.toFile().exists()) {
Files.createDirectory(Path.of("/tmp/unittest/%s".formatted(tenant)));
Files.createFile(path);
}
return path;
}
}

View File

@@ -3,7 +3,6 @@ package io.kestra.core.models.triggers.multipleflows;
import com.google.common.collect.ImmutableMap;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.property.Property;
import io.kestra.core.utils.TestsUtils;
import org.apache.commons.lang3.tuple.Pair;
import org.junit.jupiter.api.Test;
import io.kestra.plugin.core.condition.ExecutionFlow;
@@ -34,9 +33,8 @@ public abstract class AbstractMultipleConditionStorageTest {
@Test
void allDefault() {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().build());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
@@ -52,9 +50,8 @@ public abstract class AbstractMultipleConditionStorageTest {
@Test
void daily() {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().window(Duration.ofDays(1)).windowAdvance(Duration.ofSeconds(0)).build());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().window(Duration.ofDays(1)).windowAdvance(Duration.ofSeconds(0)).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
@@ -70,9 +67,8 @@ public abstract class AbstractMultipleConditionStorageTest {
@Test
void dailyAdvance() {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().window(Duration.ofDays(1)).windowAdvance(Duration.ofHours(4).negated()).build());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().window(Duration.ofDays(1)).windowAdvance(Duration.ofHours(4).negated()).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
@@ -88,9 +84,8 @@ public abstract class AbstractMultipleConditionStorageTest {
@Test
void hourly() {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().window(Duration.ofHours(1)).windowAdvance(Duration.ofHours(4).negated()).build());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().window(Duration.ofHours(1)).windowAdvance(Duration.ofHours(4).negated()).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
@@ -107,9 +102,8 @@ public abstract class AbstractMultipleConditionStorageTest {
@Test
void minutely() {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().window(Duration.ofMinutes(15)).windowAdvance(Duration.ofMinutes(5).negated()).build());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().window(Duration.ofMinutes(15)).windowAdvance(Duration.ofMinutes(5).negated()).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
@@ -121,9 +115,8 @@ public abstract class AbstractMultipleConditionStorageTest {
@Test
void expiration() throws Exception {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().window(Duration.ofSeconds(2)).windowAdvance(Duration.ofMinutes(0).negated()).build());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().window(Duration.ofSeconds(2)).windowAdvance(Duration.ofMinutes(0).negated()).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
this.save(multipleConditionStorage, pair.getLeft(), Collections.singletonList(window.with(ImmutableMap.of("a", true))));
@@ -143,9 +136,8 @@ public abstract class AbstractMultipleConditionStorageTest {
@Test
void expired() throws Exception {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().window(Duration.ofSeconds(2)).windowAdvance(Duration.ofMinutes(0).negated()).build());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().window(Duration.ofSeconds(2)).windowAdvance(Duration.ofMinutes(0).negated()).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
this.save(multipleConditionStorage, pair.getLeft(), Collections.singletonList(window.with(ImmutableMap.of("a", true))));
@@ -154,21 +146,20 @@ public abstract class AbstractMultipleConditionStorageTest {
assertThat(window.getResults().get("a")).isTrue();
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(tenant);
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(null);
assertThat(expired.size()).isZero();
Thread.sleep(2005);
expired = multipleConditionStorage.expired(tenant);
expired = multipleConditionStorage.expired(null);
assertThat(expired.size()).isEqualTo(1);
}
@Test
void dailyTimeDeadline() throws Exception {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().type(Type.DAILY_TIME_DEADLINE).deadline(LocalTime.now().plusSeconds(2)).build());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().type(Type.DAILY_TIME_DEADLINE).deadline(LocalTime.now().plusSeconds(2)).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
this.save(multipleConditionStorage, pair.getLeft(), Collections.singletonList(window.with(ImmutableMap.of("a", true))));
@@ -177,21 +168,20 @@ public abstract class AbstractMultipleConditionStorageTest {
assertThat(window.getResults().get("a")).isTrue();
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(tenant);
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(null);
assertThat(expired.size()).isZero();
Thread.sleep(2005);
expired = multipleConditionStorage.expired(tenant);
expired = multipleConditionStorage.expired(null);
assertThat(expired.size()).isEqualTo(1);
}
@Test
void dailyTimeDeadline_Expired() throws Exception {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().type(Type.DAILY_TIME_DEADLINE).deadline(LocalTime.now().minusSeconds(1)).build());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().type(Type.DAILY_TIME_DEADLINE).deadline(LocalTime.now().minusSeconds(1)).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
this.save(multipleConditionStorage, pair.getLeft(), Collections.singletonList(window.with(ImmutableMap.of("a", true))));
@@ -200,17 +190,16 @@ public abstract class AbstractMultipleConditionStorageTest {
assertThat(window.getResults()).isEmpty();
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(tenant);
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(null);
assertThat(expired.size()).isEqualTo(1);
}
@Test
void dailyTimeWindow() {
void dailyTimeWindow() throws Exception {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
LocalTime startTime = LocalTime.now().truncatedTo(ChronoUnit.MINUTES);
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().type(Type.DAILY_TIME_WINDOW).startTime(startTime).endTime(startTime.plusMinutes(5)).build());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().type(Type.DAILY_TIME_WINDOW).startTime(startTime).endTime(startTime.plusMinutes(5)).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
this.save(multipleConditionStorage, pair.getLeft(), Collections.singletonList(window.with(ImmutableMap.of("a", true))));
@@ -219,16 +208,15 @@ public abstract class AbstractMultipleConditionStorageTest {
assertThat(window.getResults().get("a")).isTrue();
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(tenant);
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(null);
assertThat(expired.size()).isZero();
}
@Test
void slidingWindow() throws Exception {
MultipleConditionStorageInterface multipleConditionStorage = multipleConditionStorage();
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Pair<Flow, MultipleCondition> pair = mockFlow(tenant, TimeWindow.builder().type(Type.SLIDING_WINDOW).window(Duration.ofHours(1)).build());
Pair<Flow, MultipleCondition> pair = mockFlow(TimeWindow.builder().type(Type.SLIDING_WINDOW).window(Duration.ofHours(1)).build());
MultipleConditionWindow window = multipleConditionStorage.getOrCreate(pair.getKey(), pair.getRight(), Collections.emptyMap());
this.save(multipleConditionStorage, pair.getLeft(), Collections.singletonList(window.with(ImmutableMap.of("a", true))));
@@ -237,13 +225,13 @@ public abstract class AbstractMultipleConditionStorageTest {
assertThat(window.getResults().get("a")).isTrue();
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(tenant);
List<MultipleConditionWindow> expired = multipleConditionStorage.expired(null);
assertThat(expired.size()).isZero();
}
private static Pair<Flow, MultipleCondition> mockFlow(String tenantId, TimeWindow sla) {
private static Pair<Flow, MultipleCondition> mockFlow(TimeWindow sla) {
var multipleCondition = MultipleCondition.builder()
.id("condition-multiple-%s".formatted(tenantId))
.id("condition-multiple")
.conditions(ImmutableMap.of(
"flow-a", ExecutionFlow.builder()
.flowId(Property.ofValue("flow-a"))
@@ -260,7 +248,6 @@ public abstract class AbstractMultipleConditionStorageTest {
Flow flow = Flow.builder()
.namespace(NAMESPACE)
.id("multiple-flow")
.tenantId(tenantId)
.revision(1)
.triggers(Collections.singletonList(io.kestra.plugin.core.trigger.Flow.builder()
.id("trigger-flow")

View File

@@ -13,20 +13,21 @@ import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
public abstract class AbstractFeatureUsageReportTest {
@Inject
FeatureUsageReport featureUsageReport;
@Test
public void shouldGetReport() {
// When
Instant now = Instant.now();
FeatureUsageReport.UsageEvent event = featureUsageReport.report(
now,
now,
Reportable.TimeInterval.of(now.minus(Duration.ofDays(1)).atZone(ZoneId.systemDefault()), now.atZone(ZoneId.systemDefault()))
);
// Then
assertThat(event.getExecutions().getDailyExecutionsCount().size()).isGreaterThan(0);
assertThat(event.getExecutions().getDailyTaskRunsCount()).isNull();
}
}

View File

@@ -10,6 +10,7 @@ import io.kestra.core.server.ServiceType;
import io.kestra.core.utils.IdUtils;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
import java.time.Duration;
@@ -30,6 +31,7 @@ public abstract class AbstractServiceUsageReportTest {
ServiceInstanceRepositoryInterface serviceInstanceRepository;
@Test
@Disabled
public void shouldGetReport() {
// Given
final LocalDate start = LocalDate.now().withDayOfMonth(1);

View File

@@ -18,10 +18,10 @@ import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
class SystemInformationReportTest {
@Inject
private SystemInformationReport systemInformationReport;
@Test
void shouldGetReport() {
SystemInformationReport.SystemInformationEvent event = systemInformationReport.report(Instant.now());
@@ -32,34 +32,34 @@ class SystemInformationReportTest {
assertThat(event.host().getHardware().getLogicalProcessorCount()).isNotNull();
assertThat(event.host().getJvm().getName()).isNotNull();
assertThat(event.host().getOs().getFamily()).isNotNull();
assertThat(event.configurations().getRepositoryType()).isEqualTo("h2");
assertThat(event.configurations().getQueueType()).isEqualTo("h2");
assertThat(event.configurations().getRepositoryType()).isEqualTo("memory");
assertThat(event.configurations().getQueueType()).isEqualTo("memory");
}
@MockBean(SettingRepositoryInterface.class)
@Singleton
static class TestSettingRepository implements SettingRepositoryInterface {
public static Object UUID = null;
@Override
public Optional<Setting> findByKey(String key) {
return Optional.empty();
}
@Override
public List<Setting> findAll() {
return new ArrayList<>();
}
@Override
public Setting save(Setting setting) throws ConstraintViolationException {
if (setting.getKey().equals(Setting.INSTANCE_UUID)) {
UUID = setting.getValue();
}
return setting;
}
@Override
public Setting delete(Setting setting) {
return setting;

View File

@@ -25,7 +25,6 @@ import io.kestra.core.models.tasks.ResolvedTask;
import io.kestra.core.repositories.ExecutionRepositoryInterface.ChildFilter;
import io.kestra.core.utils.IdUtils;
import io.kestra.core.utils.NamespaceUtils;
import io.kestra.core.utils.TestsUtils;
import io.kestra.plugin.core.dashboard.data.Executions;
import io.kestra.plugin.core.debug.Return;
import io.micronaut.data.model.Pageable;
@@ -49,6 +48,7 @@ import java.util.stream.Collectors;
import java.util.stream.Stream;
import static io.kestra.core.models.flows.FlowScope.USER;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertThrows;
import static org.mockito.Mockito.doReturn;
@@ -62,17 +62,17 @@ public abstract class AbstractExecutionRepositoryTest {
@Inject
protected ExecutionRepositoryInterface executionRepository;
public static Execution.ExecutionBuilder builder(String tenantId, State.Type state, String flowId) {
return builder(tenantId, state, flowId, NAMESPACE);
public static Execution.ExecutionBuilder builder(State.Type state, String flowId) {
return builder(state, flowId, NAMESPACE);
}
public static Execution.ExecutionBuilder builder(String tenantId, State.Type state, String flowId, String namespace) {
public static Execution.ExecutionBuilder builder(State.Type state, String flowId, String namespace) {
State finalState = randomDuration(state);
Execution.ExecutionBuilder execution = Execution.builder()
.id(FriendlyId.createFriendlyId())
.namespace(namespace)
.tenantId(tenantId)
.tenantId(MAIN_TENANT)
.flowId(flowId == null ? FLOW : flowId)
.flowRevision(1)
.state(finalState);
@@ -126,11 +126,11 @@ public abstract class AbstractExecutionRepositoryTest {
return finalState;
}
protected void inject(String tenantId) {
inject(tenantId, null);
protected void inject() {
inject(null);
}
protected void inject(String tenantId, String executionTriggerId) {
protected void inject(String executionTriggerId) {
ExecutionTrigger executionTrigger = null;
if (executionTriggerId != null) {
@@ -139,7 +139,7 @@ public abstract class AbstractExecutionRepositoryTest {
.build();
}
executionRepository.save(builder(tenantId, State.Type.RUNNING, null)
executionRepository.save(builder(State.Type.RUNNING, null)
.labels(List.of(
new Label("key", "value"),
new Label("key2", "value2")
@@ -149,7 +149,6 @@ public abstract class AbstractExecutionRepositoryTest {
);
for (int i = 1; i < 28; i++) {
executionRepository.save(builder(
tenantId,
i < 5 ? State.Type.RUNNING : (i < 8 ? State.Type.FAILED : State.Type.SUCCESS),
i < 15 ? null : "second"
).trigger(executionTrigger).build());
@@ -157,7 +156,6 @@ public abstract class AbstractExecutionRepositoryTest {
// add a test execution; it should be ignored in search & statistics
executionRepository.save(builder(
tenantId,
State.Type.SUCCESS,
null
)
@@ -169,10 +167,9 @@ public abstract class AbstractExecutionRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_all(QueryFilter filter, int expectedSize){
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant, "executionTriggerId");
inject("executionTriggerId");
ArrayListTotal<Execution> entries = executionRepository.find(Pageable.UNPAGED, tenant, List.of(filter));
ArrayListTotal<Execution> entries = executionRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter));
assertThat(entries).hasSize(expectedSize);
}
@@ -195,8 +192,7 @@ public abstract class AbstractExecutionRepositoryTest {
@ParameterizedTest
@MethodSource("errorFilterCombinations")
void should_fail_to_find_all(QueryFilter filter){
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
assertThrows(InvalidQueryFiltersException.class, () -> executionRepository.find(Pageable.UNPAGED, tenant, List.of(filter)));
assertThrows(InvalidQueryFiltersException.class, () -> executionRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter)));
}
static Stream<QueryFilter> errorFilterCombinations() {
@@ -212,10 +208,9 @@ public abstract class AbstractExecutionRepositoryTest {
@Test
protected void find() {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant);
inject();
ArrayListTotal<Execution> executions = executionRepository.find(Pageable.from(1, 10), tenant, null);
ArrayListTotal<Execution> executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, null);
assertThat(executions.getTotal()).isEqualTo(28L);
assertThat(executions.size()).isEqualTo(10);
@@ -224,7 +219,7 @@ public abstract class AbstractExecutionRepositoryTest {
.operation(QueryFilter.Op.EQUALS)
.value( List.of(State.Type.RUNNING, State.Type.FAILED))
.build());
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(executions.getTotal()).isEqualTo(8L);
filters = List.of(QueryFilter.builder()
@@ -232,7 +227,7 @@ public abstract class AbstractExecutionRepositoryTest {
.operation(QueryFilter.Op.EQUALS)
.value(Map.of("key", "value"))
.build());
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(executions.getTotal()).isEqualTo(1L);
filters = List.of(QueryFilter.builder()
@@ -240,7 +235,7 @@ public abstract class AbstractExecutionRepositoryTest {
.operation(QueryFilter.Op.EQUALS)
.value(Map.of("key", "value2"))
.build());
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(executions.getTotal()).isEqualTo(0L);
filters = List.of(QueryFilter.builder()
@@ -249,7 +244,7 @@ public abstract class AbstractExecutionRepositoryTest {
.value(Map.of("key", "value", "keyTest", "valueTest"))
.build()
);
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(executions.getTotal()).isEqualTo(0L);
filters = List.of(QueryFilter.builder()
@@ -257,7 +252,7 @@ public abstract class AbstractExecutionRepositoryTest {
.operation(QueryFilter.Op.EQUALS)
.value("second")
.build());
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(executions.getTotal()).isEqualTo(13L);
filters = List.of(QueryFilter.builder()
@@ -271,7 +266,7 @@ public abstract class AbstractExecutionRepositoryTest {
.value(NAMESPACE)
.build()
);
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(executions.getTotal()).isEqualTo(13L);
filters = List.of(QueryFilter.builder()
@@ -279,7 +274,7 @@ public abstract class AbstractExecutionRepositoryTest {
.operation(QueryFilter.Op.STARTS_WITH)
.value("io.kestra")
.build());
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(executions.getTotal()).isEqualTo(28L);
}
@@ -287,16 +282,15 @@ public abstract class AbstractExecutionRepositoryTest {
protected void findTriggerExecutionId() {
String executionTriggerId = IdUtils.create();
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant, executionTriggerId);
inject(tenant);
inject(executionTriggerId);
inject();
var filters = List.of(QueryFilter.builder()
.field(QueryFilter.Field.TRIGGER_EXECUTION_ID)
.operation(QueryFilter.Op.EQUALS)
.value(executionTriggerId)
.build());
ArrayListTotal<Execution> executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
ArrayListTotal<Execution> executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(executions.getTotal()).isEqualTo(28L);
assertThat(executions.size()).isEqualTo(10);
assertThat(executions.getFirst().getTrigger().getVariables().get("executionId")).isEqualTo(executionTriggerId);
@@ -306,7 +300,7 @@ public abstract class AbstractExecutionRepositoryTest {
.value(ExecutionRepositoryInterface.ChildFilter.CHILD)
.build());
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(executions.getTotal()).isEqualTo(28L);
assertThat(executions.size()).isEqualTo(10);
assertThat(executions.getFirst().getTrigger().getVariables().get("executionId")).isEqualTo(executionTriggerId);
@@ -317,21 +311,20 @@ public abstract class AbstractExecutionRepositoryTest {
.value(ExecutionRepositoryInterface.ChildFilter.MAIN)
.build());
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters );
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters );
assertThat(executions.getTotal()).isEqualTo(28L);
assertThat(executions.size()).isEqualTo(10);
assertThat(executions.getFirst().getTrigger()).isNull();
executions = executionRepository.find(Pageable.from(1, 10), tenant, null);
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, null);
assertThat(executions.getTotal()).isEqualTo(56L);
}
@Test
protected void findWithSort() {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant);
inject();
ArrayListTotal<Execution> executions = executionRepository.find(Pageable.from(1, 10, Sort.of(Sort.Order.desc("id"))), tenant, null);
ArrayListTotal<Execution> executions = executionRepository.find(Pageable.from(1, 10, Sort.of(Sort.Order.desc("id"))), MAIN_TENANT, null);
assertThat(executions.getTotal()).isEqualTo(28L);
assertThat(executions.size()).isEqualTo(10);
@@ -340,92 +333,100 @@ public abstract class AbstractExecutionRepositoryTest {
.operation(QueryFilter.Op.EQUALS)
.value(List.of(State.Type.RUNNING, State.Type.FAILED))
.build());
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(executions.getTotal()).isEqualTo(8L);
}
@Test
protected void findById() {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
var execution1 = ExecutionFixture.EXECUTION_1(tenant);
executionRepository.save(execution1);
protected void findTaskRun() {
inject();
Optional<Execution> full = executionRepository.findById(tenant, execution1.getId());
ArrayListTotal<TaskRun> taskRuns = executionRepository.findTaskRun(Pageable.from(1, 10), MAIN_TENANT, null);
assertThat(taskRuns.getTotal()).isEqualTo(74L);
assertThat(taskRuns.size()).isEqualTo(10);
var filters = List.of(QueryFilter.builder()
.field(QueryFilter.Field.LABELS)
.operation(QueryFilter.Op.EQUALS)
.value(Map.of("key", "value"))
.build());
taskRuns = executionRepository.findTaskRun(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(taskRuns.getTotal()).isEqualTo(1L);
assertThat(taskRuns.size()).isEqualTo(1);
}
@Test
protected void findById() {
executionRepository.save(ExecutionFixture.EXECUTION_1);
Optional<Execution> full = executionRepository.findById(MAIN_TENANT, ExecutionFixture.EXECUTION_1.getId());
assertThat(full.isPresent()).isTrue();
full.ifPresent(current -> {
assertThat(full.get().getId()).isEqualTo(execution1.getId());
assertThat(full.get().getId()).isEqualTo(ExecutionFixture.EXECUTION_1.getId());
});
}
@Test
protected void shouldFindByIdTestExecution() {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
var executionTest = ExecutionFixture.EXECUTION_TEST(tenant);
executionRepository.save(executionTest);
executionRepository.save(ExecutionFixture.EXECUTION_TEST);
Optional<Execution> full = executionRepository.findById(tenant, executionTest.getId());
Optional<Execution> full = executionRepository.findById(null, ExecutionFixture.EXECUTION_TEST.getId());
assertThat(full.isPresent()).isTrue();
full.ifPresent(current -> {
assertThat(full.get().getId()).isEqualTo(executionTest.getId());
assertThat(full.get().getId()).isEqualTo(ExecutionFixture.EXECUTION_TEST.getId());
});
}
@Test
protected void purge() {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
var execution1 = ExecutionFixture.EXECUTION_1(tenant);
executionRepository.save(execution1);
executionRepository.save(ExecutionFixture.EXECUTION_1);
Optional<Execution> full = executionRepository.findById(tenant, execution1.getId());
Optional<Execution> full = executionRepository.findById(MAIN_TENANT, ExecutionFixture.EXECUTION_1.getId());
assertThat(full.isPresent()).isTrue();
executionRepository.purge(execution1);
executionRepository.purge(ExecutionFixture.EXECUTION_1);
full = executionRepository.findById(tenant, execution1.getId());
full = executionRepository.findById(null, ExecutionFixture.EXECUTION_1.getId());
assertThat(full.isPresent()).isFalse();
}
@Test
protected void delete() {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
var execution1 = ExecutionFixture.EXECUTION_1(tenant);
executionRepository.save(execution1);
executionRepository.save(ExecutionFixture.EXECUTION_1);
Optional<Execution> full = executionRepository.findById(tenant, execution1.getId());
Optional<Execution> full = executionRepository.findById(MAIN_TENANT, ExecutionFixture.EXECUTION_1.getId());
assertThat(full.isPresent()).isTrue();
executionRepository.delete(execution1);
executionRepository.delete(ExecutionFixture.EXECUTION_1);
full = executionRepository.findById(tenant, execution1.getId());
full = executionRepository.findById(MAIN_TENANT, ExecutionFixture.EXECUTION_1.getId());
assertThat(full.isPresent()).isFalse();
}
@Test
protected void mappingConflict() {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
executionRepository.save(ExecutionFixture.EXECUTION_2(tenant));
executionRepository.save(ExecutionFixture.EXECUTION_1(tenant));
executionRepository.save(ExecutionFixture.EXECUTION_2);
executionRepository.save(ExecutionFixture.EXECUTION_1);
ArrayListTotal<Execution> page1 = executionRepository.findByFlowId(tenant, NAMESPACE, FLOW, Pageable.from(1, 10));
ArrayListTotal<Execution> page1 = executionRepository.findByFlowId(MAIN_TENANT, NAMESPACE, FLOW, Pageable.from(1, 10));
assertThat(page1.size()).isEqualTo(2);
}
@Test
protected void dailyStatistics() throws InterruptedException {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
for (int i = 0; i < 28; i++) {
executionRepository.save(builder(
tenant,
i < 5 ? State.Type.RUNNING : (i < 8 ? State.Type.FAILED : State.Type.SUCCESS),
i < 15 ? null : "second"
).build());
}
executionRepository.save(builder(
tenant,
State.Type.SUCCESS,
"second"
).namespace(NamespaceUtils.SYSTEM_FLOWS_DEFAULT_NAMESPACE).build());
@@ -435,14 +436,15 @@ public abstract class AbstractExecutionRepositoryTest {
List<DailyExecutionStatistics> result = executionRepository.dailyStatistics(
null,
tenant,
MAIN_TENANT,
null,
null,
null,
ZonedDateTime.now().minusDays(10),
ZonedDateTime.now(),
null,
null);
null,
false);
assertThat(result.size()).isEqualTo(11);
assertThat(result.get(10).getExecutionCounts().size()).isEqualTo(11);
@@ -454,52 +456,131 @@ public abstract class AbstractExecutionRepositoryTest {
result = executionRepository.dailyStatistics(
null,
tenant,
MAIN_TENANT,
List.of(FlowScope.USER, FlowScope.SYSTEM),
null,
null,
ZonedDateTime.now().minusDays(10),
ZonedDateTime.now(),
null,
null);
null,
false);
assertThat(result.size()).isEqualTo(11);
assertThat(result.get(10).getExecutionCounts().get(State.Type.SUCCESS)).isEqualTo(21L);
result = executionRepository.dailyStatistics(
null,
tenant,
MAIN_TENANT,
List.of(FlowScope.USER),
null,
null,
ZonedDateTime.now().minusDays(10),
ZonedDateTime.now(),
null,
null);
null,
false);
assertThat(result.size()).isEqualTo(11);
assertThat(result.get(10).getExecutionCounts().get(State.Type.SUCCESS)).isEqualTo(20L);
result = executionRepository.dailyStatistics(
null,
tenant,
MAIN_TENANT,
List.of(FlowScope.SYSTEM),
null,
null,
ZonedDateTime.now().minusDays(10),
ZonedDateTime.now(),
null,
null);
null,
false);
assertThat(result.size()).isEqualTo(11);
assertThat(result.get(10).getExecutionCounts().get(State.Type.SUCCESS)).isEqualTo(1L);
}
@Test
protected void taskRunsDailyStatistics() {
for (int i = 0; i < 28; i++) {
executionRepository.save(builder(
i < 5 ? State.Type.RUNNING : (i < 8 ? State.Type.FAILED : State.Type.SUCCESS),
i < 15 ? null : "second"
).build());
}
executionRepository.save(builder(
State.Type.SUCCESS,
"second"
).namespace(NamespaceUtils.SYSTEM_FLOWS_DEFAULT_NAMESPACE).build());
List<DailyExecutionStatistics> result = executionRepository.dailyStatistics(
null,
MAIN_TENANT,
null,
null,
null,
ZonedDateTime.now().minusDays(10),
ZonedDateTime.now(),
null,
null,
true);
assertThat(result.size()).isEqualTo(11);
assertThat(result.get(10).getExecutionCounts().size()).isEqualTo(11);
assertThat(result.get(10).getDuration().getAvg().toMillis()).isGreaterThan(0L);
assertThat(result.get(10).getExecutionCounts().get(State.Type.FAILED)).isEqualTo(3L * 2);
assertThat(result.get(10).getExecutionCounts().get(State.Type.RUNNING)).isEqualTo(5L * 2);
assertThat(result.get(10).getExecutionCounts().get(State.Type.SUCCESS)).isEqualTo(57L);
result = executionRepository.dailyStatistics(
null,
MAIN_TENANT,
List.of(FlowScope.USER, FlowScope.SYSTEM),
null,
null,
ZonedDateTime.now().minusDays(10),
ZonedDateTime.now(),
null,
null,
true);
assertThat(result.size()).isEqualTo(11);
assertThat(result.get(10).getExecutionCounts().get(State.Type.SUCCESS)).isEqualTo(57L);
result = executionRepository.dailyStatistics(
null,
MAIN_TENANT,
List.of(FlowScope.USER),
null,
null,
ZonedDateTime.now().minusDays(10),
ZonedDateTime.now(),
null,
null,
true);
assertThat(result.size()).isEqualTo(11);
assertThat(result.get(10).getExecutionCounts().get(State.Type.SUCCESS)).isEqualTo(55L);
result = executionRepository.dailyStatistics(
null,
MAIN_TENANT,
List.of(FlowScope.SYSTEM),
null,
null,
ZonedDateTime.now().minusDays(10),
ZonedDateTime.now(),
null,
null,
true);
assertThat(result.size()).isEqualTo(11);
assertThat(result.get(10).getExecutionCounts().get(State.Type.SUCCESS)).isEqualTo(2L);
}
@SuppressWarnings("OptionalGetWithoutIsPresent")
@Test
protected void executionsCount() throws InterruptedException {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
for (int i = 0; i < 14; i++) {
executionRepository.save(builder(
tenant,
State.Type.SUCCESS,
i < 2 ? "first" : (i < 5 ? "second" : "third")
).build());
@@ -509,7 +590,7 @@ public abstract class AbstractExecutionRepositoryTest {
Thread.sleep(500);
List<ExecutionCount> result = executionRepository.executionCounts(
tenant,
MAIN_TENANT,
List.of(
new Flow(NAMESPACE, "first"),
new Flow(NAMESPACE, "second"),
@@ -528,7 +609,7 @@ public abstract class AbstractExecutionRepositoryTest {
assertThat(result.stream().filter(executionCount -> executionCount.getFlowId().equals("missing")).findFirst().get().getCount()).isEqualTo(0L);
result = executionRepository.executionCounts(
tenant,
MAIN_TENANT,
List.of(
new Flow(NAMESPACE, "first"),
new Flow(NAMESPACE, "second"),
@@ -545,7 +626,7 @@ public abstract class AbstractExecutionRepositoryTest {
assertThat(result.stream().filter(executionCount -> executionCount.getFlowId().equals("third")).findFirst().get().getCount()).isEqualTo(9L);
result = executionRepository.executionCounts(
tenant,
MAIN_TENANT,
null,
null,
null,
@@ -558,15 +639,14 @@ public abstract class AbstractExecutionRepositoryTest {
@Test
protected void update() {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Execution execution = ExecutionFixture.EXECUTION_1(tenant);
executionRepository.save(execution);
Execution execution = ExecutionFixture.EXECUTION_1;
executionRepository.save(ExecutionFixture.EXECUTION_1);
Label label = new Label("key", "value");
Execution updated = execution.toBuilder().labels(List.of(label)).build();
executionRepository.update(updated);
Optional<Execution> validation = executionRepository.findById(tenant, updated.getId());
Optional<Execution> validation = executionRepository.findById(MAIN_TENANT, updated.getId());
assertThat(validation.isPresent()).isTrue();
assertThat(validation.get().getLabels().size()).isEqualTo(1);
assertThat(validation.get().getLabels().getFirst()).isEqualTo(label);
@@ -574,14 +654,13 @@ public abstract class AbstractExecutionRepositoryTest {
@Test
void shouldFindLatestExecutionGivenState() {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Execution earliest = buildWithCreatedDate(tenant, Instant.now().minus(Duration.ofMinutes(10)));
Execution latest = buildWithCreatedDate(tenant, Instant.now().minus(Duration.ofMinutes(5)));
Execution earliest = buildWithCreatedDate(Instant.now().minus(Duration.ofMinutes(10)));
Execution latest = buildWithCreatedDate(Instant.now().minus(Duration.ofMinutes(5)));
executionRepository.save(earliest);
executionRepository.save(latest);
Optional<Execution> result = executionRepository.findLatestForStates(tenant, "io.kestra.unittest", "full", List.of(State.Type.CREATED));
Optional<Execution> result = executionRepository.findLatestForStates(MAIN_TENANT, "io.kestra.unittest", "full", List.of(State.Type.CREATED));
assertThat(result.isPresent()).isTrue();
assertThat(result.get().getId()).isEqualTo(latest.getId());
}
@@ -621,11 +700,11 @@ public abstract class AbstractExecutionRepositoryTest {
assertThat(data.get(0).get("date")).isEqualTo(DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSXXX").format(ZonedDateTime.ofInstant(startDate, ZoneId.systemDefault()).withSecond(0).withNano(0)));
}
private static Execution buildWithCreatedDate(String tenant, Instant instant) {
private static Execution buildWithCreatedDate(Instant instant) {
return Execution.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.tenantId(tenant)
.tenantId(MAIN_TENANT)
.flowId("full")
.flowRevision(1)
.state(new State(State.Type.CREATED, List.of(new State.History(State.Type.CREATED, instant))))
@@ -636,24 +715,22 @@ public abstract class AbstractExecutionRepositoryTest {
@Test
protected void findAllAsync() {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant);
inject();
List<Execution> executions = executionRepository.findAllAsync(tenant).collectList().block();
List<Execution> executions = executionRepository.findAllAsync(MAIN_TENANT).collectList().block();
assertThat(executions).hasSize(29); // used by the backup so it contains TEST executions
}
@Test
protected void shouldFindByLabel() {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant);
inject();
List<QueryFilter> filters = List.of(QueryFilter.builder()
.field(QueryFilter.Field.LABELS)
.operation(QueryFilter.Op.EQUALS)
.value(Map.of("key", "value"))
.build());
List<Execution> executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
List<Execution> executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(executions.size()).isEqualTo(1L);
// Filtering by two pairs of labels; since it's now an AND behavior, it should not return anything
@@ -662,16 +739,15 @@ inject(tenant);
.operation(QueryFilter.Op.EQUALS)
.value(Map.of("key", "value", "keyother", "valueother"))
.build());
executions = executionRepository.find(Pageable.from(1, 10), tenant, filters);
executions = executionRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(executions.size()).isEqualTo(0L);
}
@Test
protected void shouldReturnLastExecutionsWhenInputsAreNull() {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
inject(tenant);
inject();
List<Execution> lastExecutions = executionRepository.lastExecutions(tenant, null);
List<Execution> lastExecutions = executionRepository.lastExecutions(MAIN_TENANT, null);
assertThat(lastExecutions).isNotEmpty();
Set<String> flowIds = lastExecutions.stream().map(Execution::getFlowId).collect(Collectors.toSet());

View File

@@ -1,6 +1,7 @@
package io.kestra.core.repositories;
import com.google.common.collect.ImmutableMap;
import io.kestra.core.Helpers;
import io.kestra.core.events.CrudEvent;
import io.kestra.core.events.CrudEventType;
import io.kestra.core.exceptions.InvalidQueryFiltersException;
@@ -9,6 +10,7 @@ import io.kestra.core.models.Label;
import io.kestra.core.models.QueryFilter;
import io.kestra.core.models.QueryFilter.Field;
import io.kestra.core.models.QueryFilter.Op;
import io.kestra.core.models.SearchResult;
import io.kestra.core.models.conditions.ConditionContext;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.executions.ExecutionTrigger;
@@ -18,6 +20,7 @@ import io.kestra.core.models.property.Property;
import io.kestra.core.models.triggers.AbstractTrigger;
import io.kestra.core.models.triggers.PollingTriggerInterface;
import io.kestra.core.models.triggers.TriggerContext;
import io.kestra.core.queues.QueueException;
import io.kestra.core.repositories.ExecutionRepositoryInterface.ChildFilter;
import io.kestra.core.services.FlowService;
import io.kestra.core.utils.Await;
@@ -26,19 +29,22 @@ import io.kestra.core.utils.TestsUtils;
import io.kestra.plugin.core.debug.Return;
import io.micronaut.context.event.ApplicationEventListener;
import io.micronaut.data.model.Pageable;
import io.micronaut.data.model.Sort;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;
import jakarta.validation.ConstraintViolationException;
import java.util.concurrent.CopyOnWriteArrayList;
import lombok.*;
import lombok.experimental.SuperBuilder;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestInstance;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.MethodSource;
import org.slf4j.event.Level;
import java.io.IOException;
import java.net.URISyntaxException;
import java.time.Duration;
import java.time.ZonedDateTime;
import java.util.*;
@@ -46,12 +52,16 @@ import java.util.concurrent.TimeoutException;
import java.util.stream.Stream;
import static io.kestra.core.models.flows.FlowScope.SYSTEM;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static io.kestra.core.utils.NamespaceUtils.SYSTEM_FLOWS_DEFAULT_NAMESPACE;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertThrows;
// If some counts are wrong in this test it means that one of the tests is not properly deleting what it created
@KestraTest
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
public abstract class AbstractFlowRepositoryTest {
public static final String TEST_TENANT_ID = "tenant";
public static final String TEST_NAMESPACE = "io.kestra.unittest";
public static final String TEST_FLOW_ID = "test";
@Inject
@@ -60,18 +70,21 @@ public abstract class AbstractFlowRepositoryTest {
@Inject
protected ExecutionRepositoryInterface executionRepository;
@BeforeAll
protected static void init() {
@Inject
private LocalFlowRepositoryLoader repositoryLoader;
@BeforeEach
protected void init() throws IOException, URISyntaxException {
TestsUtils.loads(MAIN_TENANT, repositoryLoader);
FlowListener.reset();
}
private static FlowWithSource.FlowWithSourceBuilder<?, ?> builder(String tenantId) {
return builder(tenantId, IdUtils.create(), TEST_FLOW_ID);
private static FlowWithSource.FlowWithSourceBuilder<?, ?> builder() {
return builder(IdUtils.create(), TEST_FLOW_ID);
}
private static FlowWithSource.FlowWithSourceBuilder<?, ?> builder(String tenantId, String flowId, String taskId) {
private static FlowWithSource.FlowWithSourceBuilder<?, ?> builder(String flowId, String taskId) {
return FlowWithSource.builder()
.tenantId(tenantId)
.id(flowId)
.namespace(TEST_NAMESPACE)
.tasks(Collections.singletonList(Return.builder().id(taskId).type(Return.class.getName()).format(Property.ofValue(TEST_FLOW_ID)).build()));
@@ -80,16 +93,16 @@ public abstract class AbstractFlowRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_all(QueryFilter filter){
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource flow = FlowWithSource.builder()
.id("filterFlowId")
.namespace(SYSTEM_FLOWS_DEFAULT_NAMESPACE)
.tenantId(tenant)
.tenantId(MAIN_TENANT)
.labels(Label.from(Map.of("key", "value")))
.build();
flow = flowRepository.create(GenericFlow.of(flow));
try {
ArrayListTotal<Flow> entries = flowRepository.find(Pageable.UNPAGED, tenant, List.of(filter));
ArrayListTotal<Flow> entries = flowRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter));
assertThat(entries).hasSize(1);
} finally {
@@ -100,16 +113,16 @@ public abstract class AbstractFlowRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_all_with_source(QueryFilter filter){
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource flow = FlowWithSource.builder()
.id("filterFlowId")
.namespace(SYSTEM_FLOWS_DEFAULT_NAMESPACE)
.tenantId(tenant)
.tenantId(MAIN_TENANT)
.labels(Label.from(Map.of("key", "value")))
.build();
flow = flowRepository.create(GenericFlow.of(flow));
try {
ArrayListTotal<FlowWithSource> entries = flowRepository.findWithSource(Pageable.UNPAGED, tenant, List.of(filter));
ArrayListTotal<FlowWithSource> entries = flowRepository.findWithSource(Pageable.UNPAGED, MAIN_TENANT, List.of(filter));
assertThat(entries).hasSize(1);
} finally {
@@ -131,7 +144,7 @@ public abstract class AbstractFlowRepositoryTest {
void should_fail_to_find_all(QueryFilter filter){
assertThrows(
InvalidQueryFiltersException.class,
() -> flowRepository.find(Pageable.UNPAGED, TestsUtils.randomTenant(this.getClass().getSimpleName()), List.of(filter)));
() -> flowRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter)));
}
@@ -140,7 +153,7 @@ public abstract class AbstractFlowRepositoryTest {
void should_fail_to_find_all_with_source(QueryFilter filter){
assertThrows(
InvalidQueryFiltersException.class,
() -> flowRepository.findWithSource(Pageable.UNPAGED, TestsUtils.randomTenant(this.getClass().getSimpleName()), List.of(filter)));
() -> flowRepository.findWithSource(Pageable.UNPAGED, MAIN_TENANT, List.of(filter)));
}
@@ -163,17 +176,17 @@ public abstract class AbstractFlowRepositoryTest {
@Test
void findById() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource flow = builder(tenant)
FlowWithSource flow = builder()
.tenantId(MAIN_TENANT)
.revision(3)
.build();
flow = flowRepository.create(GenericFlow.of(flow));
try {
Optional<Flow> full = flowRepository.findById(tenant, flow.getNamespace(), flow.getId());
Optional<Flow> full = flowRepository.findById(MAIN_TENANT, flow.getNamespace(), flow.getId());
assertThat(full.isPresent()).isTrue();
assertThat(full.get().getRevision()).isEqualTo(1);
full = flowRepository.findById(tenant, flow.getNamespace(), flow.getId(), Optional.empty());
full = flowRepository.findById(MAIN_TENANT, flow.getNamespace(), flow.getId(), Optional.empty());
assertThat(full.isPresent()).isTrue();
} finally {
deleteFlow(flow);
@@ -182,18 +195,17 @@ public abstract class AbstractFlowRepositoryTest {
@Test
void findByIdWithoutAcl() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource flow = builder(tenant)
.tenantId(tenant)
FlowWithSource flow = builder()
.tenantId(MAIN_TENANT)
.revision(3)
.build();
flow = flowRepository.create(GenericFlow.of(flow));
try {
Optional<Flow> full = flowRepository.findByIdWithoutAcl(tenant, flow.getNamespace(), flow.getId(), Optional.empty());
Optional<Flow> full = flowRepository.findByIdWithoutAcl(MAIN_TENANT, flow.getNamespace(), flow.getId(), Optional.empty());
assertThat(full.isPresent()).isTrue();
assertThat(full.get().getRevision()).isEqualTo(1);
full = flowRepository.findByIdWithoutAcl(tenant, flow.getNamespace(), flow.getId(), Optional.empty());
full = flowRepository.findByIdWithoutAcl(MAIN_TENANT, flow.getNamespace(), flow.getId(), Optional.empty());
assertThat(full.isPresent()).isTrue();
} finally {
deleteFlow(flow);
@@ -202,16 +214,15 @@ public abstract class AbstractFlowRepositoryTest {
@Test
void findByIdWithSource() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource flow = builder(tenant)
.tenantId(tenant)
FlowWithSource flow = builder()
.tenantId(MAIN_TENANT)
.revision(3)
.build();
String source = "# comment\n" + flow.sourceOrGenerateIfNull();
flow = flowRepository.create(GenericFlow.fromYaml(tenant, source));
flow = flowRepository.create(GenericFlow.fromYaml(MAIN_TENANT, source));
try {
Optional<FlowWithSource> full = flowRepository.findByIdWithSource(tenant, flow.getNamespace(), flow.getId());
Optional<FlowWithSource> full = flowRepository.findByIdWithSource(MAIN_TENANT, flow.getNamespace(), flow.getId());
assertThat(full.isPresent()).isTrue();
full.ifPresent(current -> {
@@ -226,8 +237,7 @@ public abstract class AbstractFlowRepositoryTest {
@Test
void save() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource flow = builder(tenant).revision(12).build();
FlowWithSource flow = builder().revision(12).build();
FlowWithSource save = flowRepository.create(GenericFlow.of(flow));
try {
@@ -239,8 +249,7 @@ public abstract class AbstractFlowRepositoryTest {
@Test
void saveNoRevision() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource flow = builder(tenant).build();
FlowWithSource flow = builder().build();
FlowWithSource save = flowRepository.create(GenericFlow.of(flow));
try {
@@ -251,17 +260,68 @@ public abstract class AbstractFlowRepositoryTest {
}
@Test
void findAll() {
List<Flow> save = flowRepository.findAll(MAIN_TENANT);
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void findAllWithSource() {
List<FlowWithSource> save = flowRepository.findAllWithSource(MAIN_TENANT);
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void findAllForAllTenants() {
List<Flow> save = flowRepository.findAllForAllTenants();
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void findAllWithSourceForAllTenants() {
List<FlowWithSource> save = flowRepository.findAllWithSourceForAllTenants();
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void findByNamespace() {
List<Flow> save = flowRepository.findByNamespace(MAIN_TENANT, "io.kestra.tests");
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT - 24);
save = flowRepository.findByNamespace(MAIN_TENANT, "io.kestra.tests2");
assertThat((long) save.size()).isEqualTo(1L);
save = flowRepository.findByNamespace(MAIN_TENANT, "io.kestra.tests.minimal.bis");
assertThat((long) save.size()).isEqualTo(1L);
}
@Test
void findByNamespacePrefix() {
List<Flow> save = flowRepository.findByNamespacePrefix(MAIN_TENANT, "io.kestra.tests");
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT - 1);
save = flowRepository.findByNamespace(MAIN_TENANT, "io.kestra.tests2");
assertThat((long) save.size()).isEqualTo(1L);
save = flowRepository.findByNamespace(MAIN_TENANT, "io.kestra.tests.minimal.bis");
assertThat((long) save.size()).isEqualTo(1L);
}
@Test
void findByNamespaceWithSource() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Flow flow = builder(tenant)
Flow flow = builder()
.revision(3)
.build();
String flowSource = "# comment\n" + flow.sourceOrGenerateIfNull();
flow = flowRepository.create(GenericFlow.fromYaml(tenant, flowSource));
flow = flowRepository.create(GenericFlow.fromYaml(MAIN_TENANT, flowSource));
try {
List<FlowWithSource> save = flowRepository.findByNamespaceWithSource(tenant, flow.getNamespace());
List<FlowWithSource> save = flowRepository.findByNamespaceWithSource(MAIN_TENANT, flow.getNamespace());
assertThat((long) save.size()).isEqualTo(1L);
assertThat(save.getFirst().getSource()).isEqualTo(FlowService.cleanupSource(flowSource));
@@ -270,15 +330,175 @@ public abstract class AbstractFlowRepositoryTest {
}
}
@Test
void findByNamespacePrefixWithSource() {
List<FlowWithSource> save = flowRepository.findByNamespacePrefixWithSource(MAIN_TENANT, "io.kestra.tests");
assertThat((long) save.size()).isEqualTo(Helpers.FLOWS_COUNT - 1);
}
@Test
void find_paginationPartial() {
assertThat(flowRepository.find(Pageable.from(1, (int) Helpers.FLOWS_COUNT - 1, Sort.UNSORTED), MAIN_TENANT, null)
.size())
.describedAs("When paginating at MAX-1, it should return MAX-1")
.isEqualTo(Helpers.FLOWS_COUNT - 1);
assertThat(flowRepository.findWithSource(Pageable.from(1, (int) Helpers.FLOWS_COUNT - 1, Sort.UNSORTED), MAIN_TENANT, null)
.size())
.describedAs("When paginating at MAX-1, it should return MAX-1")
.isEqualTo(Helpers.FLOWS_COUNT - 1);
}
@Test
void find_paginationGreaterThanExisting() {
assertThat(flowRepository.find(Pageable.from(1, (int) Helpers.FLOWS_COUNT + 1, Sort.UNSORTED), MAIN_TENANT, null)
.size())
.describedAs("When paginating requesting a larger amount than existing, it should return existing MAX")
.isEqualTo(Helpers.FLOWS_COUNT);
assertThat(flowRepository.findWithSource(Pageable.from(1, (int) Helpers.FLOWS_COUNT + 1, Sort.UNSORTED), MAIN_TENANT, null)
.size())
.describedAs("When paginating requesting a larger amount than existing, it should return existing MAX")
.isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void find_prefixMatchingAllNamespaces() {
assertThat(flowRepository.find(
Pageable.UNPAGED,
MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.STARTS_WITH).value("io.kestra.tests").build()
)
).size())
.describedAs("When filtering on NAMESPACE START_WITH a pattern that match all, it should return all")
.isEqualTo(Helpers.FLOWS_COUNT);
assertThat(flowRepository.findWithSource(
Pageable.UNPAGED,
MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.STARTS_WITH).value("io.kestra.tests").build()
)
).size())
.describedAs("When filtering on NAMESPACE START_WITH a pattern that match all, it should return all")
.isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void find_aSpecifiedNamespace() {
assertThat(flowRepository.find(
Pageable.UNPAGED,
MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests2").build()
)
).size()).isEqualTo(1L);
assertThat(flowRepository.findWithSource(
Pageable.UNPAGED,
MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests2").build()
)
).size()).isEqualTo(1L);
}
@Test
void find_aSpecificSubNamespace() {
assertThat(flowRepository.find(
Pageable.UNPAGED,
MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests.minimal.bis").build()
)
).size())
.isEqualTo(1L);
assertThat(flowRepository.findWithSource(
Pageable.UNPAGED,
MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests.minimal.bis").build()
)
).size())
.isEqualTo(1L);
}
@Test
void find_aSpecificLabel() {
assertThat(
flowRepository.find(Pageable.UNPAGED, MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("country", "FR")).build()
)
).size())
.isEqualTo(1);
assertThat(
flowRepository.findWithSource(Pageable.UNPAGED, MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("country", "FR")).build()
)
).size())
.isEqualTo(1);
}
@Test
void find_aSpecificFlowByNamespaceAndLabel() {
assertThat(
flowRepository.find(Pageable.UNPAGED, MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests").build(),
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("key2", "value2")).build()
)
).size())
.isEqualTo(1);
assertThat(
flowRepository.findWithSource(Pageable.UNPAGED, MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests").build(),
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("key2", "value2")).build()
)
).size())
.isEqualTo(1);
}
@Test
void find_noResult_forAnUnknownNamespace() {
assertThat(
flowRepository.find(Pageable.UNPAGED, MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests").build(),
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("key1", "value2")).build()
)
).size())
.isEqualTo(0);
assertThat(
flowRepository.findWithSource(Pageable.UNPAGED, MAIN_TENANT,
List.of(
QueryFilter.builder().field(QueryFilter.Field.NAMESPACE).operation(QueryFilter.Op.EQUALS).value("io.kestra.tests").build(),
QueryFilter.builder().field(QueryFilter.Field.LABELS).operation(QueryFilter.Op.EQUALS).value(Map.of("key1", "value2")).build()
)
).size())
.isEqualTo(0);
}
@Test
protected void findSpecialChars() {
ArrayListTotal<SearchResult<Flow>> save = flowRepository.findSourceCode(Pageable.unpaged(), "https://api.chucknorris.io", MAIN_TENANT, null);
assertThat((long) save.size()).isEqualTo(2L);
}
@Test
void delete() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Flow flow = builder(tenant).tenantId(tenant).build();
Flow flow = builder().tenantId(MAIN_TENANT).build();
FlowWithSource save = flowRepository.create(GenericFlow.of(flow));
try {
assertThat(flowRepository.findById(tenant, save.getNamespace(), save.getId()).isPresent()).isTrue();
assertThat(flowRepository.findById(MAIN_TENANT, save.getNamespace(), save.getId()).isPresent()).isTrue();
} catch (Throwable e) {
deleteFlow(save);
throw e;
@@ -286,22 +506,21 @@ public abstract class AbstractFlowRepositoryTest {
Flow delete = flowRepository.delete(save);
assertThat(flowRepository.findById(tenant, flow.getNamespace(), flow.getId()).isPresent()).isFalse();
assertThat(flowRepository.findById(tenant, flow.getNamespace(), flow.getId(), Optional.of(save.getRevision())).isPresent()).isTrue();
assertThat(flowRepository.findById(MAIN_TENANT, flow.getNamespace(), flow.getId()).isPresent()).isFalse();
assertThat(flowRepository.findById(MAIN_TENANT, flow.getNamespace(), flow.getId(), Optional.of(save.getRevision())).isPresent()).isTrue();
List<FlowWithSource> revisions = flowRepository.findRevisions(tenant, flow.getNamespace(), flow.getId());
List<FlowWithSource> revisions = flowRepository.findRevisions(MAIN_TENANT, flow.getNamespace(), flow.getId());
assertThat(revisions.getLast().getRevision()).isEqualTo(delete.getRevision());
}
@Test
void updateConflict() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
String flowId = IdUtils.create();
Flow flow = Flow.builder()
.id(flowId)
.namespace(TEST_NAMESPACE)
.tenantId(tenant)
.tenantId(MAIN_TENANT)
.inputs(List.of(StringInput.builder().type(Type.STRING).id("a").build()))
.tasks(Collections.singletonList(Return.builder().id(TEST_FLOW_ID).type(Return.class.getName()).format(Property.ofValue(TEST_FLOW_ID)).build()))
.build();
@@ -309,12 +528,12 @@ public abstract class AbstractFlowRepositoryTest {
Flow save = flowRepository.create(GenericFlow.of(flow));
try {
assertThat(flowRepository.findById(tenant, flow.getNamespace(), flow.getId()).isPresent()).isTrue();
assertThat(flowRepository.findById(MAIN_TENANT, flow.getNamespace(), flow.getId()).isPresent()).isTrue();
Flow update = Flow.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest2")
.tenantId(tenant)
.tenantId(MAIN_TENANT)
.inputs(List.of(StringInput.builder().type(Type.STRING).id("b").build()))
.tasks(Collections.singletonList(Return.builder().id(TEST_FLOW_ID).type(Return.class.getName()).format(Property.ofValue(TEST_FLOW_ID)).build()))
.build();
@@ -332,14 +551,13 @@ public abstract class AbstractFlowRepositoryTest {
}
@Test
public void removeTrigger() throws TimeoutException {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
void removeTrigger() throws TimeoutException, QueueException {
String flowId = IdUtils.create();
Flow flow = Flow.builder()
.id(flowId)
.namespace(TEST_NAMESPACE)
.tenantId(tenant)
.tenantId(MAIN_TENANT)
.triggers(Collections.singletonList(UnitTest.builder()
.id("sleep")
.type(UnitTest.class.getName())
@@ -349,12 +567,12 @@ public abstract class AbstractFlowRepositoryTest {
flow = flowRepository.create(GenericFlow.of(flow));
try {
assertThat(flowRepository.findById(tenant, flow.getNamespace(), flow.getId()).isPresent()).isTrue();
assertThat(flowRepository.findById(MAIN_TENANT, flow.getNamespace(), flow.getId()).isPresent()).isTrue();
Flow update = Flow.builder()
.id(flowId)
.namespace(TEST_NAMESPACE)
.tenantId(tenant)
.tenantId(MAIN_TENANT)
.tasks(Collections.singletonList(Return.builder().id(TEST_FLOW_ID).type(Return.class.getName()).format(Property.ofValue(TEST_FLOW_ID)).build()))
.build();
;
@@ -365,25 +583,21 @@ public abstract class AbstractFlowRepositoryTest {
deleteFlow(flow);
}
Await.until(() -> FlowListener.filterByTenant(tenant)
.size() == 3, Duration.ofMillis(100), Duration.ofSeconds(5));
assertThat(FlowListener.filterByTenant(tenant).stream()
.filter(r -> r.getType() == CrudEventType.CREATE).count()).isEqualTo(1L);
assertThat(FlowListener.filterByTenant(tenant).stream()
.filter(r -> r.getType() == CrudEventType.UPDATE).count()).isEqualTo(1L);
assertThat(FlowListener.filterByTenant(tenant).stream()
.filter(r -> r.getType() == CrudEventType.DELETE).count()).isEqualTo(1L);
Await.until(() -> FlowListener.getEmits().size() == 3, Duration.ofMillis(100), Duration.ofSeconds(5));
assertThat(FlowListener.getEmits().stream().filter(r -> r.getType() == CrudEventType.CREATE).count()).isEqualTo(1L);
assertThat(FlowListener.getEmits().stream().filter(r -> r.getType() == CrudEventType.UPDATE).count()).isEqualTo(1L);
assertThat(FlowListener.getEmits().stream().filter(r -> r.getType() == CrudEventType.DELETE).count()).isEqualTo(1L);
}
@Test
void removeTriggerDelete() throws TimeoutException {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
String flowId = IdUtils.create();
Flow flow = Flow.builder()
.id(flowId)
.namespace(TEST_NAMESPACE)
.tenantId(tenant)
.tenantId(MAIN_TENANT)
.triggers(Collections.singletonList(UnitTest.builder()
.id("sleep")
.type(UnitTest.class.getName())
@@ -393,39 +607,40 @@ public abstract class AbstractFlowRepositoryTest {
Flow save = flowRepository.create(GenericFlow.of(flow));
try {
assertThat(flowRepository.findById(tenant, flow.getNamespace(), flow.getId()).isPresent()).isTrue();
assertThat(flowRepository.findById(MAIN_TENANT, flow.getNamespace(), flow.getId()).isPresent()).isTrue();
} finally {
deleteFlow(save);
}
Await.until(() -> FlowListener.filterByTenant(tenant)
.size() == 2, Duration.ofMillis(100), Duration.ofSeconds(5));
assertThat(FlowListener.filterByTenant(tenant).stream()
.filter(r -> r.getType() == CrudEventType.CREATE).count()).isEqualTo(1L);
assertThat(FlowListener.filterByTenant(tenant).stream()
.filter(r -> r.getType() == CrudEventType.DELETE).count()).isEqualTo(1L);
Await.until(() -> FlowListener.getEmits().size() == 2, Duration.ofMillis(100), Duration.ofSeconds(5));
assertThat(FlowListener.getEmits().stream().filter(r -> r.getType() == CrudEventType.CREATE).count()).isEqualTo(1L);
assertThat(FlowListener.getEmits().stream().filter(r -> r.getType() == CrudEventType.DELETE).count()).isEqualTo(1L);
}
@Test
void findDistinctNamespace() {
List<String> distinctNamespace = flowRepository.findDistinctNamespace(MAIN_TENANT);
assertThat((long) distinctNamespace.size()).isEqualTo(9L);
}
@Test
protected void shouldReturnNullRevisionForNonExistingFlow() {
assertThat(flowRepository.lastRevision(TestsUtils.randomTenant(this.getClass().getSimpleName()), TEST_NAMESPACE, IdUtils.create())).isNull();
assertThat(flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, IdUtils.create())).isNull();
}
@Test
protected void shouldReturnLastRevisionOnCreate() {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
final List<Flow> toDelete = new ArrayList<>();
final String flowId = IdUtils.create();
try {
// When
toDelete.add(flowRepository.create(createTestingLogFlow(tenant, flowId, "???")));
Integer result = flowRepository.lastRevision(tenant, TEST_NAMESPACE, flowId);
toDelete.add(flowRepository.create(createTestingLogFlow(flowId, "???")));
Integer result = flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, flowId);
// Then
assertThat(result).isEqualTo(1);
assertThat(flowRepository.lastRevision(tenant, TEST_NAMESPACE, flowId)).isEqualTo(1);
assertThat(flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, flowId)).isEqualTo(1);
} finally {
toDelete.forEach(this::deleteFlow);
}
@@ -434,36 +649,34 @@ public abstract class AbstractFlowRepositoryTest {
@Test
protected void shouldIncrementRevisionOnDelete() {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
final String flowId = IdUtils.create();
FlowWithSource created = flowRepository.create(createTestingLogFlow(tenant, flowId, "first"));
assertThat(flowRepository.findRevisions(tenant, TEST_NAMESPACE, flowId).size()).isEqualTo(1);
FlowWithSource created = flowRepository.create(createTestingLogFlow(flowId, "first"));
assertThat(flowRepository.findRevisions(TEST_TENANT_ID, TEST_NAMESPACE, flowId).size()).isEqualTo(1);
// When
flowRepository.delete(created);
// Then
assertThat(flowRepository.findRevisions(tenant, TEST_NAMESPACE, flowId).size()).isEqualTo(2);
assertThat(flowRepository.findRevisions(TEST_TENANT_ID, TEST_NAMESPACE, flowId).size()).isEqualTo(2);
}
@Test
protected void shouldIncrementRevisionOnCreateAfterDelete() {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
final List<Flow> toDelete = new ArrayList<>();
final String flowId = IdUtils.create();
try {
// Given
flowRepository.delete(
flowRepository.create(createTestingLogFlow(tenant, flowId, "first"))
flowRepository.create(createTestingLogFlow(flowId, "first"))
);
// When
toDelete.add(flowRepository.create(createTestingLogFlow(tenant, flowId, "second")));
toDelete.add(flowRepository.create(createTestingLogFlow(flowId, "second")));
// Then
assertThat(flowRepository.findRevisions(tenant, TEST_NAMESPACE, flowId).size()).isEqualTo(3);
assertThat(flowRepository.lastRevision(tenant, TEST_NAMESPACE, flowId)).isEqualTo(3);
assertThat(flowRepository.findRevisions(TEST_TENANT_ID, TEST_NAMESPACE, flowId).size()).isEqualTo(3);
assertThat(flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, flowId)).isEqualTo(3);
} finally {
toDelete.forEach(this::deleteFlow);
}
@@ -472,23 +685,22 @@ public abstract class AbstractFlowRepositoryTest {
@Test
protected void shouldReturnNullForLastRevisionAfterDelete() {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
final List<Flow> toDelete = new ArrayList<>();
final String flowId = IdUtils.create();
try {
// Given
FlowWithSource created = flowRepository.create(createTestingLogFlow(tenant, flowId, "first"));
FlowWithSource created = flowRepository.create(createTestingLogFlow(flowId, "first"));
toDelete.add(created);
FlowWithSource updated = flowRepository.update(createTestingLogFlow(tenant, flowId, "second"), created);
FlowWithSource updated = flowRepository.update(createTestingLogFlow(flowId, "second"), created);
toDelete.add(updated);
// When
flowRepository.delete(updated);
// Then
assertThat(flowRepository.findById(tenant, TEST_NAMESPACE, flowId, Optional.empty())).isEqualTo(Optional.empty());
assertThat(flowRepository.lastRevision(tenant, TEST_NAMESPACE, flowId)).isNull();
assertThat(flowRepository.findById(TEST_TENANT_ID, TEST_NAMESPACE, flowId, Optional.empty())).isEqualTo(Optional.empty());
assertThat(flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, flowId)).isNull();
} finally {
toDelete.forEach(this::deleteFlow);
}
@@ -497,23 +709,22 @@ public abstract class AbstractFlowRepositoryTest {
@Test
protected void shouldFindAllRevisionsAfterDelete() {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
final List<Flow> toDelete = new ArrayList<>();
final String flowId = IdUtils.create();
try {
// Given
FlowWithSource created = flowRepository.create(createTestingLogFlow(tenant, flowId, "first"));
FlowWithSource created = flowRepository.create(createTestingLogFlow(flowId, "first"));
toDelete.add(created);
FlowWithSource updated = flowRepository.update(createTestingLogFlow(tenant, flowId, "second"), created);
FlowWithSource updated = flowRepository.update(createTestingLogFlow(flowId, "second"), created);
toDelete.add(updated);
// When
flowRepository.delete(updated);
// Then
assertThat(flowRepository.findById(tenant, TEST_NAMESPACE, flowId, Optional.empty())).isEqualTo(Optional.empty());
assertThat(flowRepository.findRevisions(tenant, TEST_NAMESPACE, flowId).size()).isEqualTo(3);
assertThat(flowRepository.findById(TEST_TENANT_ID, TEST_NAMESPACE, flowId, Optional.empty())).isEqualTo(Optional.empty());
assertThat(flowRepository.findRevisions(TEST_TENANT_ID, TEST_NAMESPACE, flowId).size()).isEqualTo(3);
} finally {
toDelete.forEach(this::deleteFlow);
}
@@ -521,22 +732,21 @@ public abstract class AbstractFlowRepositoryTest {
@Test
protected void shouldIncrementRevisionOnUpdateGivenNotEqualSource() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
final List<Flow> toDelete = new ArrayList<>();
final String flowId = IdUtils.create();
try {
// Given
FlowWithSource created = flowRepository.create(createTestingLogFlow(tenant, flowId, "first"));
FlowWithSource created = flowRepository.create(createTestingLogFlow(flowId, "first"));
toDelete.add(created);
// When
FlowWithSource updated = flowRepository.update(createTestingLogFlow(tenant, flowId, "second"), created);
FlowWithSource updated = flowRepository.update(createTestingLogFlow(flowId, "second"), created);
toDelete.add(updated);
// Then
assertThat(updated.getRevision()).isEqualTo(2);
assertThat(flowRepository.lastRevision(tenant, TEST_NAMESPACE, flowId)).isEqualTo(2);
assertThat(flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, flowId)).isEqualTo(2);
} finally {
toDelete.forEach(this::deleteFlow);
@@ -545,39 +755,48 @@ public abstract class AbstractFlowRepositoryTest {
@Test
protected void shouldNotIncrementRevisionOnUpdateGivenEqualSource() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
final List<Flow> toDelete = new ArrayList<>();
final String flowId = IdUtils.create();
try {
// Given
FlowWithSource created = flowRepository.create(createTestingLogFlow(tenant, flowId, "first"));
FlowWithSource created = flowRepository.create(createTestingLogFlow(flowId, "first"));
toDelete.add(created);
// When
FlowWithSource updated = flowRepository.update(createTestingLogFlow(tenant, flowId, "first"), created);
FlowWithSource updated = flowRepository.update(createTestingLogFlow(flowId, "first"), created);
toDelete.add(updated);
// Then
assertThat(updated.getRevision()).isEqualTo(1);
assertThat(flowRepository.lastRevision(tenant, TEST_NAMESPACE, flowId)).isEqualTo(1);
assertThat(flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, flowId)).isEqualTo(1);
} finally {
toDelete.forEach(this::deleteFlow);
}
}
@Test
void shouldReturnForGivenQueryWildCardFilters() {
List<QueryFilter> filters = List.of(
QueryFilter.builder().field(QueryFilter.Field.QUERY).operation(QueryFilter.Op.EQUALS).value("*").build()
);
ArrayListTotal<Flow> flows = flowRepository.find(Pageable.from(1, 10), MAIN_TENANT, filters);
assertThat(flows.size()).isEqualTo(10);
assertThat(flows.getTotal()).isEqualTo(Helpers.FLOWS_COUNT);
}
@Test
void findByExecution() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Flow flow = builder(tenant)
Flow flow = builder()
.tenantId(MAIN_TENANT)
.revision(1)
.build();
flowRepository.create(GenericFlow.of(flow));
Execution execution = Execution.builder()
.id(IdUtils.create())
.namespace(flow.getNamespace())
.tenantId(tenant)
.tenantId(MAIN_TENANT)
.flowId(flow.getId())
.flowRevision(flow.getRevision())
.state(new State())
@@ -602,13 +821,11 @@ public abstract class AbstractFlowRepositoryTest {
@Test
void findByExecutionNoRevision() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Flow flow = builder(tenant)
Flow flow = builder()
.revision(3)
.build();
flowRepository.create(GenericFlow.of(flow));
Execution execution = Execution.builder()
.tenantId(tenant)
.id(IdUtils.create())
.namespace(flow.getNamespace())
.flowId(flow.getId())
@@ -634,14 +851,13 @@ public abstract class AbstractFlowRepositoryTest {
@Test
void shouldCountForNullTenant() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
FlowWithSource toDelete = null;
try {
// Given
Flow flow = createTestFlowForNamespace(tenant, TEST_NAMESPACE);
Flow flow = createTestFlowForNamespace(TEST_NAMESPACE);
toDelete = flowRepository.create(GenericFlow.of(flow));
// When
int count = flowRepository.count(tenant);
int count = flowRepository.count(MAIN_TENANT);
// Then
Assertions.assertTrue(count > 0);
@@ -652,11 +868,11 @@ public abstract class AbstractFlowRepositoryTest {
}
}
private static Flow createTestFlowForNamespace(String tenantId, String namespace) {
private static Flow createTestFlowForNamespace(String namespace) {
return Flow.builder()
.id(IdUtils.create())
.namespace(namespace)
.tenantId(tenantId)
.tenantId(MAIN_TENANT)
.tasks(List.of(Return.builder()
.id(IdUtils.create())
.type(Return.class.getName())
@@ -675,31 +891,21 @@ public abstract class AbstractFlowRepositoryTest {
}
@Singleton
public static class FlowListener implements ApplicationEventListener<CrudEvent<AbstractFlow>> {
private static List<CrudEvent<AbstractFlow>> emits = new CopyOnWriteArrayList<>();
public static class FlowListener implements ApplicationEventListener<CrudEvent<Flow>> {
@Getter
private static List<CrudEvent<Flow>> emits = new ArrayList<>();
@Override
public void onApplicationEvent(CrudEvent<AbstractFlow> event) {
//This has to be done because Micronaut may send CrudEvent<Setting> for example, and we don't want them.
if ((event.getModel() != null && event.getModel() instanceof AbstractFlow)||
(event.getPreviousModel() != null && event.getPreviousModel() instanceof AbstractFlow)) {
emits.add(event);
}
public void onApplicationEvent(CrudEvent<Flow> event) {
emits.add(event);
}
public static void reset() {
emits = new CopyOnWriteArrayList<>();
}
public static List<CrudEvent<AbstractFlow>> filterByTenant(String tenantId){
return emits.stream()
.filter(e -> (e.getPreviousModel() != null && e.getPreviousModel().getTenantId().equals(tenantId)) ||
(e.getModel() != null && e.getModel().getTenantId().equals(tenantId)))
.toList();
emits = new ArrayList<>();
}
}
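One design note on the simplified listener above: the previous implementation collected events in a CopyOnWriteArrayList, while the new one uses a plain ArrayList. Since Await.until polls getEmits() from the test thread while CRUD events may be published from other threads, a thread-safe list is the safer choice if these assertions ever turn flaky. A minimal sketch of that variant, same class with only the backing list changed:

    @Singleton
    public static class FlowListener implements ApplicationEventListener<CrudEvent<Flow>> {
        // Thread-safe backing list: events can still arrive while Await.until polls getEmits().
        @Getter
        private static List<CrudEvent<Flow>> emits = new CopyOnWriteArrayList<>();

        @Override
        public void onApplicationEvent(CrudEvent<Flow> event) {
            emits.add(event);
        }

        public static void reset() {
            emits = new CopyOnWriteArrayList<>();
        }
    }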
private static GenericFlow createTestingLogFlow(String tenantId, String id, String logMessage) {
private static GenericFlow createTestingLogFlow(String id, String logMessage) {
String source = """
id: %s
namespace: %s
@@ -708,7 +914,7 @@ public abstract class AbstractFlowRepositoryTest {
type: io.kestra.plugin.core.log.Log
message: %s
""".formatted(id, TEST_NAMESPACE, logMessage);
return GenericFlow.fromYaml(tenantId, source);
return GenericFlow.fromYaml(TEST_TENANT_ID, source);
}
protected static int COUNTER = 0;
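Taken together, the revision tests above describe one lifecycle. The sketch below summarizes it, assuming the individual behaviours asserted above compose this way (names are the ones used in this file; the "same source keeps the revision" step is extrapolated from shouldNotIncrementRevisionOnUpdateGivenEqualSource):

    FlowWithSource v1 = flowRepository.create(createTestingLogFlow(flowId, "first"));       // revision 1
    FlowWithSource v2 = flowRepository.update(createTestingLogFlow(flowId, "second"), v1);  // source changed -> revision 2
    FlowWithSource v2b = flowRepository.update(createTestingLogFlow(flowId, "second"), v2); // same source -> revision stays at 2

    flowRepository.delete(v2b);                                                             // deletion writes one more revision
    assertThat(flowRepository.lastRevision(TEST_TENANT_ID, TEST_NAMESPACE, flowId)).isNull();    // no "last" revision once deleted
    assertThat(flowRepository.findRevisions(TEST_TENANT_ID, TEST_NAMESPACE, flowId)).hasSize(3); // but the full history stays queryable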

View File

@@ -4,7 +4,7 @@ import io.kestra.core.models.topologies.FlowNode;
import io.kestra.core.models.topologies.FlowRelation;
import io.kestra.core.models.topologies.FlowTopology;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.utils.TestsUtils;
import io.kestra.core.tenant.TenantService;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;
@@ -17,21 +17,21 @@ public abstract class AbstractFlowTopologyRepositoryTest {
@Inject
private FlowTopologyRepositoryInterface flowTopologyRepository;
protected FlowTopology createSimpleFlowTopology(String tenantId, String flowA, String flowB, String namespace) {
protected FlowTopology createSimpleFlowTopology(String flowA, String flowB, String namespace) {
return FlowTopology.builder()
.relation(FlowRelation.FLOW_TASK)
.source(FlowNode.builder()
.id(flowA)
.namespace(namespace)
.tenantId(tenantId)
.uid(tenantId + flowA)
.tenantId(TenantService.MAIN_TENANT)
.uid(flowA)
.build()
)
.destination(FlowNode.builder()
.id(flowB)
.namespace(namespace)
.tenantId(tenantId)
.uid(tenantId + flowB)
.tenantId(TenantService.MAIN_TENANT)
.uid(flowB)
.build()
)
.build();
@@ -39,45 +39,42 @@ public abstract class AbstractFlowTopologyRepositoryTest {
@Test
void findByFlow() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
flowTopologyRepository.save(
createSimpleFlowTopology(tenant, "flow-a", "flow-b", "io.kestra.tests")
createSimpleFlowTopology("flow-a", "flow-b", "io.kestra.tests")
);
List<FlowTopology> list = flowTopologyRepository.findByFlow(tenant, "io.kestra.tests", "flow-a", false);
List<FlowTopology> list = flowTopologyRepository.findByFlow(TenantService.MAIN_TENANT, "io.kestra.tests", "flow-a", false);
assertThat(list.size()).isEqualTo(1);
}
@Test
void findByNamespace() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
flowTopologyRepository.save(
createSimpleFlowTopology(tenant, "flow-a", "flow-b", "io.kestra.tests")
createSimpleFlowTopology("flow-a", "flow-b", "io.kestra.tests")
);
flowTopologyRepository.save(
createSimpleFlowTopology(tenant, "flow-c", "flow-d", "io.kestra.tests")
createSimpleFlowTopology("flow-c", "flow-d", "io.kestra.tests")
);
List<FlowTopology> list = flowTopologyRepository.findByNamespace(tenant, "io.kestra.tests");
List<FlowTopology> list = flowTopologyRepository.findByNamespace(TenantService.MAIN_TENANT, "io.kestra.tests");
assertThat(list.size()).isEqualTo(2);
}
@Test
void findAll() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
flowTopologyRepository.save(
createSimpleFlowTopology(tenant, "flow-a", "flow-b", "io.kestra.tests")
createSimpleFlowTopology("flow-a", "flow-b", "io.kestra.tests")
);
flowTopologyRepository.save(
createSimpleFlowTopology(tenant, "flow-c", "flow-d", "io.kestra.tests")
createSimpleFlowTopology("flow-c", "flow-d", "io.kestra.tests")
);
flowTopologyRepository.save(
createSimpleFlowTopology(tenant, "flow-e", "flow-f", "io.kestra.tests.2")
createSimpleFlowTopology("flow-e", "flow-f", "io.kestra.tests.2")
);
List<FlowTopology> list = flowTopologyRepository.findAll(tenant);
List<FlowTopology> list = flowTopologyRepository.findAll(TenantService.MAIN_TENANT);
assertThat(list.size()).isEqualTo(3);
}
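Since every topology now lives under the main tenant, the node uid collapses to the flow id alone instead of the former tenantId + flowId concatenation. A small sketch of the node construction this implies, using the same builder calls as createSimpleFlowTopology above:

    FlowNode node = FlowNode.builder()
        .id("flow-a")
        .namespace("io.kestra.tests")
        .tenantId(TenantService.MAIN_TENANT)
        .uid("flow-a") // previously tenantId + flowId
        .build();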

View File

@@ -13,7 +13,6 @@ import io.kestra.core.models.executions.LogEntry;
import io.kestra.core.models.flows.State;
import io.kestra.core.repositories.ExecutionRepositoryInterface.ChildFilter;
import io.kestra.core.utils.IdUtils;
import io.kestra.core.utils.TestsUtils;
import io.kestra.plugin.core.dashboard.data.Logs;
import io.micronaut.data.model.Pageable;
import jakarta.inject.Inject;
@@ -33,7 +32,9 @@ import java.util.stream.Stream;
import static io.kestra.core.models.flows.FlowScope.SYSTEM;
import static io.kestra.core.models.flows.FlowScope.USER;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.assertj.core.api.Assertions.assertThatReflectiveOperationException;
import static org.junit.jupiter.api.Assertions.assertThrows;
@KestraTest
@@ -41,11 +42,11 @@ public abstract class AbstractLogRepositoryTest {
@Inject
protected LogRepositoryInterface logRepository;
protected static LogEntry.LogEntryBuilder logEntry(String tenantId, Level level) {
return logEntry(tenantId, level, IdUtils.create());
protected static LogEntry.LogEntryBuilder logEntry(Level level) {
return logEntry(level, IdUtils.create());
}
protected static LogEntry.LogEntryBuilder logEntry(String tenantId, Level level, String executionId) {
protected static LogEntry.LogEntryBuilder logEntry(Level level, String executionId) {
return LogEntry.builder()
.flowId("flowId")
.namespace("io.kestra.unittest")
@@ -56,7 +57,7 @@ public abstract class AbstractLogRepositoryTest {
.timestamp(Instant.now())
.level(level)
.thread("")
.tenantId(tenantId)
.tenantId(MAIN_TENANT)
.triggerId("triggerId")
.message("john doe");
}
@@ -64,10 +65,9 @@ public abstract class AbstractLogRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_all(QueryFilter filter){
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
logRepository.save(logEntry(tenant, Level.INFO, "executionId").build());
logRepository.save(logEntry(Level.INFO, "executionId").build());
ArrayListTotal<LogEntry> entries = logRepository.find(Pageable.UNPAGED, tenant, List.of(filter));
ArrayListTotal<LogEntry> entries = logRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter));
assertThat(entries).hasSize(1);
}
@@ -75,10 +75,9 @@ public abstract class AbstractLogRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_async(QueryFilter filter){
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
logRepository.save(logEntry(tenant, Level.INFO, "executionId").build());
logRepository.save(logEntry(Level.INFO, "executionId").build());
Flux<LogEntry> find = logRepository.findAsync(tenant, List.of(filter));
Flux<LogEntry> find = logRepository.findAsync(MAIN_TENANT, List.of(filter));
List<LogEntry> logEntries = find.collectList().block();
assertThat(logEntries).hasSize(1);
@@ -87,12 +86,11 @@ public abstract class AbstractLogRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_delete_with_filter(QueryFilter filter){
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
logRepository.save(logEntry(tenant, Level.INFO, "executionId").build());
logRepository.save(logEntry(Level.INFO, "executionId").build());
logRepository.deleteByFilters(tenant, List.of(filter));
logRepository.deleteByFilters(MAIN_TENANT, List.of(filter));
assertThat(logRepository.findAllAsync(tenant).collectList().block()).isEmpty();
assertThat(logRepository.findAllAsync(MAIN_TENANT).collectList().block()).isEmpty();
}
@@ -152,10 +150,7 @@ public abstract class AbstractLogRepositoryTest {
void should_fail_to_find_all(QueryFilter filter){
assertThrows(
InvalidQueryFiltersException.class,
() -> logRepository.find(
Pageable.UNPAGED,
TestsUtils.randomTenant(this.getClass().getSimpleName()),
List.of(filter)));
() -> logRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter)));
}
@@ -173,17 +168,16 @@ public abstract class AbstractLogRepositoryTest {
@Test
void all() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
LogEntry.LogEntryBuilder builder = logEntry(tenant, Level.INFO);
LogEntry.LogEntryBuilder builder = logEntry(Level.INFO);
ArrayListTotal<LogEntry> find = logRepository.find(Pageable.UNPAGED, tenant, null);
ArrayListTotal<LogEntry> find = logRepository.find(Pageable.UNPAGED, MAIN_TENANT, null);
assertThat(find.size()).isZero();
LogEntry save = logRepository.save(builder.build());
logRepository.save(builder.executionKind(ExecutionKind.TEST).build()); // should only be loaded by execution id
find = logRepository.find(Pageable.UNPAGED, tenant, null);
find = logRepository.find(Pageable.UNPAGED, MAIN_TENANT, null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
var filters = List.of(QueryFilter.builder()
@@ -199,7 +193,7 @@ public abstract class AbstractLogRepositoryTest {
find = logRepository.find(Pageable.UNPAGED, "doe", filters);
assertThat(find.size()).isZero();
find = logRepository.find(Pageable.UNPAGED, tenant, null);
find = logRepository.find(Pageable.UNPAGED, MAIN_TENANT, null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
@@ -207,146 +201,141 @@ public abstract class AbstractLogRepositoryTest {
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
List<LogEntry> list = logRepository.findByExecutionId(tenant, save.getExecutionId(), null);
List<LogEntry> list = logRepository.findByExecutionId(MAIN_TENANT, save.getExecutionId(), null);
assertThat(list.size()).isEqualTo(2);
assertThat(list.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
list = logRepository.findByExecutionId(tenant, "io.kestra.unittest", "flowId", save.getExecutionId(), null);
list = logRepository.findByExecutionId(MAIN_TENANT, "io.kestra.unittest", "flowId", save.getExecutionId(), null);
assertThat(list.size()).isEqualTo(2);
assertThat(list.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
list = logRepository.findByExecutionIdAndTaskId(tenant, save.getExecutionId(), save.getTaskId(), null);
list = logRepository.findByExecutionIdAndTaskId(MAIN_TENANT, save.getExecutionId(), save.getTaskId(), null);
assertThat(list.size()).isEqualTo(2);
assertThat(list.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
list = logRepository.findByExecutionIdAndTaskId(tenant, "io.kestra.unittest", "flowId", save.getExecutionId(), save.getTaskId(), null);
list = logRepository.findByExecutionIdAndTaskId(MAIN_TENANT, "io.kestra.unittest", "flowId", save.getExecutionId(), save.getTaskId(), null);
assertThat(list.size()).isEqualTo(2);
assertThat(list.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
list = logRepository.findByExecutionIdAndTaskRunId(tenant, save.getExecutionId(), save.getTaskRunId(), null);
list = logRepository.findByExecutionIdAndTaskRunId(MAIN_TENANT, save.getExecutionId(), save.getTaskRunId(), null);
assertThat(list.size()).isEqualTo(2);
assertThat(list.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
list = logRepository.findByExecutionIdAndTaskRunIdAndAttempt(tenant, save.getExecutionId(), save.getTaskRunId(), null, 0);
list = logRepository.findByExecutionIdAndTaskRunIdAndAttempt(MAIN_TENANT, save.getExecutionId(), save.getTaskRunId(), null, 0);
assertThat(list.size()).isEqualTo(2);
assertThat(list.getFirst().getExecutionId()).isEqualTo(save.getExecutionId());
Integer countDeleted = logRepository.purge(Execution.builder().id(save.getExecutionId()).build());
assertThat(countDeleted).isEqualTo(2);
list = logRepository.findByExecutionIdAndTaskId(tenant, save.getExecutionId(), save.getTaskId(), null);
list = logRepository.findByExecutionIdAndTaskId(MAIN_TENANT, save.getExecutionId(), save.getTaskId(), null);
assertThat(list.size()).isZero();
}
@Test
void pageable() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
String executionId = "123";
LogEntry.LogEntryBuilder builder = logEntry(tenant, Level.INFO);
LogEntry.LogEntryBuilder builder = logEntry(Level.INFO);
builder.executionId(executionId);
for (int i = 0; i < 80; i++) {
logRepository.save(builder.build());
}
builder = logEntry(tenant, Level.INFO).executionId(executionId).taskId("taskId2").taskRunId("taskRunId2");
builder = logEntry(Level.INFO).executionId(executionId).taskId("taskId2").taskRunId("taskRunId2");
LogEntry logEntry2 = logRepository.save(builder.build());
for (int i = 0; i < 20; i++) {
logRepository.save(builder.build());
}
ArrayListTotal<LogEntry> find = logRepository.findByExecutionId(tenant, executionId, null, Pageable.from(1, 50));
ArrayListTotal<LogEntry> find = logRepository.findByExecutionId(MAIN_TENANT, executionId, null, Pageable.from(1, 50));
assertThat(find.size()).isEqualTo(50);
assertThat(find.getTotal()).isEqualTo(101L);
find = logRepository.findByExecutionId(tenant, executionId, null, Pageable.from(3, 50));
find = logRepository.findByExecutionId(MAIN_TENANT, executionId, null, Pageable.from(3, 50));
assertThat(find.size()).isEqualTo(1);
assertThat(find.getTotal()).isEqualTo(101L);
find = logRepository.findByExecutionIdAndTaskId(tenant, executionId, logEntry2.getTaskId(), null, Pageable.from(1, 50));
find = logRepository.findByExecutionIdAndTaskId(MAIN_TENANT, executionId, logEntry2.getTaskId(), null, Pageable.from(1, 50));
assertThat(find.size()).isEqualTo(21);
assertThat(find.getTotal()).isEqualTo(21L);
find = logRepository.findByExecutionIdAndTaskRunId(tenant, executionId, logEntry2.getTaskRunId(), null, Pageable.from(1, 10));
find = logRepository.findByExecutionIdAndTaskRunId(MAIN_TENANT, executionId, logEntry2.getTaskRunId(), null, Pageable.from(1, 10));
assertThat(find.size()).isEqualTo(10);
assertThat(find.getTotal()).isEqualTo(21L);
find = logRepository.findByExecutionIdAndTaskRunIdAndAttempt(tenant, executionId, logEntry2.getTaskRunId(), null, 0, Pageable.from(1, 10));
find = logRepository.findByExecutionIdAndTaskRunIdAndAttempt(MAIN_TENANT, executionId, logEntry2.getTaskRunId(), null, 0, Pageable.from(1, 10));
assertThat(find.size()).isEqualTo(10);
assertThat(find.getTotal()).isEqualTo(21L);
find = logRepository.findByExecutionIdAndTaskRunId(tenant, executionId, logEntry2.getTaskRunId(), null, Pageable.from(10, 10));
find = logRepository.findByExecutionIdAndTaskRunId(MAIN_TENANT, executionId, logEntry2.getTaskRunId(), null, Pageable.from(10, 10));
assertThat(find.size()).isZero();
}
@Test
void shouldFindByExecutionIdTestLogs() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
var builder = logEntry(tenant, Level.INFO).executionId("123").executionKind(ExecutionKind.TEST).build();
var builder = logEntry(Level.INFO).executionId("123").executionKind(ExecutionKind.TEST).build();
logRepository.save(builder);
List<LogEntry> logs = logRepository.findByExecutionId(tenant, builder.getExecutionId(), null);
List<LogEntry> logs = logRepository.findByExecutionId(MAIN_TENANT, builder.getExecutionId(), null);
assertThat(logs).hasSize(1);
}
@Test
void deleteByQuery() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
LogEntry log1 = logEntry(tenant, Level.INFO).build();
LogEntry log1 = logEntry(Level.INFO).build();
logRepository.save(log1);
logRepository.deleteByQuery(tenant, log1.getExecutionId(), null, null, null, null);
logRepository.deleteByQuery(MAIN_TENANT, log1.getExecutionId(), null, null, null, null);
ArrayListTotal<LogEntry> find = logRepository.findByExecutionId(tenant, log1.getExecutionId(), null, Pageable.from(1, 50));
ArrayListTotal<LogEntry> find = logRepository.findByExecutionId(MAIN_TENANT, log1.getExecutionId(), null, Pageable.from(1, 50));
assertThat(find.size()).isZero();
logRepository.save(log1);
logRepository.deleteByQuery(tenant, "io.kestra.unittest", "flowId", null, List.of(Level.TRACE, Level.DEBUG, Level.INFO), null, ZonedDateTime.now().plusMinutes(1));
logRepository.deleteByQuery(MAIN_TENANT, "io.kestra.unittest", "flowId", null, List.of(Level.TRACE, Level.DEBUG, Level.INFO), null, ZonedDateTime.now().plusMinutes(1));
find = logRepository.findByExecutionId(tenant, log1.getExecutionId(), null, Pageable.from(1, 50));
find = logRepository.findByExecutionId(MAIN_TENANT, log1.getExecutionId(), null, Pageable.from(1, 50));
assertThat(find.size()).isZero();
logRepository.save(log1);
logRepository.deleteByQuery(tenant, "io.kestra.unittest", "flowId", null);
logRepository.deleteByQuery(MAIN_TENANT, "io.kestra.unittest", "flowId", null);
find = logRepository.findByExecutionId(tenant, log1.getExecutionId(), null, Pageable.from(1, 50));
find = logRepository.findByExecutionId(MAIN_TENANT, log1.getExecutionId(), null, Pageable.from(1, 50));
assertThat(find.size()).isZero();
logRepository.save(log1);
logRepository.deleteByQuery(tenant, null, null, log1.getExecutionId(), List.of(Level.TRACE, Level.DEBUG, Level.INFO), null, ZonedDateTime.now().plusMinutes(1));
logRepository.deleteByQuery(MAIN_TENANT, null, null, log1.getExecutionId(), List.of(Level.TRACE, Level.DEBUG, Level.INFO), null, ZonedDateTime.now().plusMinutes(1));
find = logRepository.findByExecutionId(tenant, log1.getExecutionId(), null, Pageable.from(1, 50));
find = logRepository.findByExecutionId(MAIN_TENANT, log1.getExecutionId(), null, Pageable.from(1, 50));
assertThat(find.size()).isZero();
}
@Test
void findAllAsync() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
logRepository.save(logEntry(tenant, Level.INFO).build());
logRepository.save(logEntry(tenant, Level.INFO).executionKind(ExecutionKind.TEST).build()); // should be present as it's used for backup
logRepository.save(logEntry(tenant, Level.ERROR).build());
logRepository.save(logEntry(tenant, Level.WARN).build());
logRepository.save(logEntry(Level.INFO).build());
logRepository.save(logEntry(Level.INFO).executionKind(ExecutionKind.TEST).build()); // should be present as it's used for backup
logRepository.save(logEntry(Level.ERROR).build());
logRepository.save(logEntry(Level.WARN).build());
Flux<LogEntry> find = logRepository.findAllAsync(tenant);
Flux<LogEntry> find = logRepository.findAllAsync(MAIN_TENANT);
List<LogEntry> logEntries = find.collectList().block();
assertThat(logEntries).hasSize(4);
}
@Test
void fetchData() throws IOException {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
logRepository.save(logEntry(tenant, Level.INFO).build());
logRepository.save(logEntry(Level.INFO).build());
var results = logRepository.fetchData(tenant,
var results = logRepository.fetchData(MAIN_TENANT,
Logs.builder()
.type(Logs.class.getName())
.columns(Map.of(
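The expectations in pageable() above follow directly from the 80 + 1 + 20 = 101 entries saved for the execution, with Pageable.from(page, size) being 1-indexed. A quick plain-Java sketch just to make the counts explicit:

    int total = 80 + 1 + 20;          // 101 log entries share the same executionId
    int pageSize = 50;
    int fullPages = total / pageSize; // 2 full pages of 50 entries
    int remainder = total % pageSize; // 1 entry left, returned by Pageable.from(3, 50)
    assert fullPages == 2 && remainder == 1;

    int taskRun2Total = 1 + 20;       // 21 entries for taskId2/taskRunId2
    // so Pageable.from(1, 10) yields 10 of them, and Pageable.from(10, 10) is past the last page and yields none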

View File

@@ -7,7 +7,6 @@ import io.kestra.core.models.executions.TaskRun;
import io.kestra.core.models.executions.metrics.Counter;
import io.kestra.core.models.executions.metrics.MetricAggregations;
import io.kestra.core.models.executions.metrics.Timer;
import io.kestra.core.utils.TestsUtils;
import io.micronaut.data.model.Pageable;
import io.kestra.core.junit.annotations.KestraTest;
import jakarta.inject.Inject;
@@ -26,28 +25,27 @@ public abstract class AbstractMetricRepositoryTest {
@Test
void all() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
String executionId = FriendlyId.createFriendlyId();
TaskRun taskRun1 = taskRun(tenant, executionId, "task");
TaskRun taskRun1 = taskRun(executionId, "task");
MetricEntry counter = MetricEntry.of(taskRun1, counter("counter"), null);
MetricEntry testCounter = MetricEntry.of(taskRun1, counter("test"), ExecutionKind.TEST);
TaskRun taskRun2 = taskRun(tenant, executionId, "task");
TaskRun taskRun2 = taskRun(executionId, "task");
MetricEntry timer = MetricEntry.of(taskRun2, timer(), null);
metricRepository.save(counter);
metricRepository.save(testCounter); // should only be retrieved by execution id
metricRepository.save(timer);
List<MetricEntry> results = metricRepository.findByExecutionId(tenant, executionId, Pageable.from(1, 10));
List<MetricEntry> results = metricRepository.findByExecutionId(null, executionId, Pageable.from(1, 10));
assertThat(results.size()).isEqualTo(3);
results = metricRepository.findByExecutionIdAndTaskId(tenant, executionId, taskRun1.getTaskId(), Pageable.from(1, 10));
results = metricRepository.findByExecutionIdAndTaskId(null, executionId, taskRun1.getTaskId(), Pageable.from(1, 10));
assertThat(results.size()).isEqualTo(3);
results = metricRepository.findByExecutionIdAndTaskRunId(tenant, executionId, taskRun1.getId(), Pageable.from(1, 10));
results = metricRepository.findByExecutionIdAndTaskRunId(null, executionId, taskRun1.getId(), Pageable.from(1, 10));
assertThat(results.size()).isEqualTo(2);
MetricAggregations aggregationResults = metricRepository.aggregateByFlowId(
tenant,
null,
"namespace",
"flow",
null,
@@ -61,7 +59,7 @@ public abstract class AbstractMetricRepositoryTest {
assertThat(aggregationResults.getGroupBy()).isEqualTo("day");
aggregationResults = metricRepository.aggregateByFlowId(
tenant,
null,
"namespace",
"flow",
null,
@@ -78,12 +76,11 @@ public abstract class AbstractMetricRepositoryTest {
@Test
void names() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
String executionId = FriendlyId.createFriendlyId();
TaskRun taskRun1 = taskRun(tenant, executionId, "task");
TaskRun taskRun1 = taskRun(executionId, "task");
MetricEntry counter = MetricEntry.of(taskRun1, counter("counter"), null);
TaskRun taskRun2 = taskRun(tenant, executionId, "task2");
TaskRun taskRun2 = taskRun(executionId, "task2");
MetricEntry counter2 = MetricEntry.of(taskRun2, counter("counter2"), null);
MetricEntry test = MetricEntry.of(taskRun2, counter("test"), ExecutionKind.TEST);
@@ -93,9 +90,9 @@ public abstract class AbstractMetricRepositoryTest {
metricRepository.save(test); // should only be retrieved by execution id
List<String> flowMetricsNames = metricRepository.flowMetrics(tenant, "namespace", "flow");
List<String> taskMetricsNames = metricRepository.taskMetrics(tenant, "namespace", "flow", "task");
List<String> tasksWithMetrics = metricRepository.tasksWithMetrics(tenant, "namespace", "flow");
List<String> flowMetricsNames = metricRepository.flowMetrics(null, "namespace", "flow");
List<String> taskMetricsNames = metricRepository.taskMetrics(null, "namespace", "flow", "task");
List<String> tasksWithMetrics = metricRepository.tasksWithMetrics(null, "namespace", "flow");
assertThat(flowMetricsNames.size()).isEqualTo(2);
assertThat(taskMetricsNames.size()).isEqualTo(1);
@@ -104,18 +101,17 @@ public abstract class AbstractMetricRepositoryTest {
@Test
void findAllAsync() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
String executionId = FriendlyId.createFriendlyId();
TaskRun taskRun1 = taskRun(tenant, executionId, "task");
TaskRun taskRun1 = taskRun(executionId, "task");
MetricEntry counter = MetricEntry.of(taskRun1, counter("counter"), null);
TaskRun taskRun2 = taskRun(tenant, executionId, "task");
TaskRun taskRun2 = taskRun(executionId, "task");
MetricEntry timer = MetricEntry.of(taskRun2, timer(), null);
MetricEntry test = MetricEntry.of(taskRun2, counter("test"), ExecutionKind.TEST);
metricRepository.save(counter);
metricRepository.save(timer);
metricRepository.save(test); // should be retrieved as findAllAsync is used for backup
List<MetricEntry> results = metricRepository.findAllAsync(tenant).collectList().block();
List<MetricEntry> results = metricRepository.findAllAsync(null).collectList().block();
assertThat(results).hasSize(3);
}
@@ -127,9 +123,8 @@ public abstract class AbstractMetricRepositoryTest {
return Timer.of("counter", Duration.ofSeconds(5));
}
private TaskRun taskRun(String tenantId, String executionId, String taskId) {
private TaskRun taskRun(String executionId, String taskId) {
return TaskRun.builder()
.tenantId(tenantId)
.flowId("flow")
.namespace("namespace")
.executionId(executionId)
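The counts asserted in all() above come from which identifiers the three metrics share: both task runs use the task id "task", so the lookups by execution id and by task id see all three entries (the TEST-kind counter included, since it is fetched by execution id), while the lookup by task-run id only sees the two metrics recorded against taskRun1. A sketch of that reasoning against the calls used above:

    // counter + testCounter belong to taskRun1; timer belongs to taskRun2 (same taskId, different taskRunId).
    assertThat(metricRepository.findByExecutionId(null, executionId, Pageable.from(1, 10))).hasSize(3);
    assertThat(metricRepository.findByExecutionIdAndTaskId(null, executionId, taskRun1.getTaskId(), Pageable.from(1, 10))).hasSize(3);
    assertThat(metricRepository.findByExecutionIdAndTaskRunId(null, executionId, taskRun1.getId(), Pageable.from(1, 10))).hasSize(2);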

View File

@@ -4,8 +4,6 @@ import io.kestra.core.events.CrudEvent;
import io.kestra.core.events.CrudEventType;
import io.kestra.core.models.property.Property;
import io.kestra.core.models.templates.Template;
import io.kestra.core.utils.Await;
import io.kestra.core.utils.TestsUtils;
import io.kestra.plugin.core.debug.Return;
import io.kestra.core.utils.IdUtils;
import io.micronaut.context.event.ApplicationEventListener;
@@ -13,10 +11,7 @@ import io.micronaut.data.model.Pageable;
import io.kestra.core.junit.annotations.KestraTest;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;
import java.time.Duration;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.TimeoutException;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import java.io.IOException;
@@ -25,8 +20,6 @@ import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Optional;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import static org.assertj.core.api.Assertions.assertThat;
@@ -35,60 +28,55 @@ public abstract class AbstractTemplateRepositoryTest {
@Inject
protected TemplateRepositoryInterface templateRepository;
@BeforeAll
protected static void init() throws IOException, URISyntaxException {
@BeforeEach
protected void init() throws IOException, URISyntaxException {
TemplateListener.reset();
}
protected static Template.TemplateBuilder<?, ?> builder(String tenantId) {
return builder(tenantId, null);
protected static Template.TemplateBuilder<?, ?> builder() {
return builder(null);
}
protected static Template.TemplateBuilder<?, ?> builder(String tenantId, String namespace) {
protected static Template.TemplateBuilder<?, ?> builder(String namespace) {
return Template.builder()
.id(IdUtils.create())
.namespace(namespace == null ? "kestra.test" : namespace)
.tenantId(tenantId)
.tasks(Collections.singletonList(Return.builder().id("test").type(Return.class.getName()).format(Property.ofValue("test")).build()));
}
@Test
void findById() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Template template = builder(tenant).build();
Template template = builder().build();
templateRepository.create(template);
Optional<Template> full = templateRepository.findById(tenant, template.getNamespace(), template.getId());
Optional<Template> full = templateRepository.findById(null, template.getNamespace(), template.getId());
assertThat(full.isPresent()).isTrue();
assertThat(full.get().getId()).isEqualTo(template.getId());
full = templateRepository.findById(tenant, template.getNamespace(), template.getId());
full = templateRepository.findById(null, template.getNamespace(), template.getId());
assertThat(full.isPresent()).isTrue();
assertThat(full.get().getId()).isEqualTo(template.getId());
}
@Test
void findByNamespace() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Template template1 = builder(tenant).build();
Template template1 = builder().build();
Template template2 = Template.builder()
.id(IdUtils.create())
.tenantId(tenant)
.namespace("kestra.test.template").build();
templateRepository.create(template1);
templateRepository.create(template2);
List<Template> templates = templateRepository.findByNamespace(tenant, template1.getNamespace());
List<Template> templates = templateRepository.findByNamespace(null, template1.getNamespace());
assertThat(templates.size()).isGreaterThanOrEqualTo(1);
templates = templateRepository.findByNamespace(tenant, template2.getNamespace());
templates = templateRepository.findByNamespace(null, template2.getNamespace());
assertThat(templates.size()).isEqualTo(1);
}
@Test
void save() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Template template = builder(tenant).build();
Template template = builder().build();
Template save = templateRepository.create(template);
assertThat(save.getId()).isEqualTo(template.getId());
@@ -96,42 +84,41 @@ public abstract class AbstractTemplateRepositoryTest {
@Test
void findAll() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
long saveCount = templateRepository.findAll(tenant).size();
Template template = builder(tenant).build();
long saveCount = templateRepository.findAll(null).size();
Template template = builder().build();
templateRepository.create(template);
long size = templateRepository.findAll(tenant).size();
long size = templateRepository.findAll(null).size();
assertThat(size).isGreaterThan(saveCount);
templateRepository.delete(template);
assertThat((long) templateRepository.findAll(tenant).size()).isEqualTo(saveCount);
assertThat((long) templateRepository.findAll(null).size()).isEqualTo(saveCount);
}
@Test
void findAllForAllTenants() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
long saveCount = templateRepository.findAllForAllTenants().size();
Template template = builder(tenant).build();
Template template = builder().build();
templateRepository.create(template);
long size = templateRepository.findAllForAllTenants().size();
assertThat(size).isGreaterThan(saveCount);
templateRepository.delete(template);
assertThat((long) templateRepository.findAllForAllTenants().size()).isEqualTo(saveCount);
}
@Test
void find() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Template template1 = builder(tenant).build();
Template template1 = builder().build();
templateRepository.create(template1);
Template template2 = builder(tenant).build();
Template template2 = builder().build();
templateRepository.create(template2);
Template template3 = builder(tenant).build();
Template template3 = builder().build();
templateRepository.create(template3);
// with pageable
List<Template> save = templateRepository.find(Pageable.from(1, 10),null, tenant, "kestra.test");
List<Template> save = templateRepository.find(Pageable.from(1, 10),null, null, "kestra.test");
assertThat((long) save.size()).isGreaterThanOrEqualTo(3L);
// without pageable
save = templateRepository.find(null, tenant, "kestra.test");
save = templateRepository.find(null, null, "kestra.test");
assertThat((long) save.size()).isGreaterThanOrEqualTo(3L);
templateRepository.delete(template1);
@@ -139,45 +126,31 @@ public abstract class AbstractTemplateRepositoryTest {
templateRepository.delete(template3);
}
private static final Logger LOG = LoggerFactory.getLogger(AbstractTemplateRepositoryTest.class);
@Test
protected void delete() throws TimeoutException {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Template template = builder(tenant).build();
void delete() {
Template template = builder().build();
Template save = templateRepository.create(template);
templateRepository.delete(save);
assertThat(templateRepository.findById(tenant, template.getNamespace(), template.getId()).isPresent()).isFalse();
assertThat(templateRepository.findById(null, template.getNamespace(), template.getId()).isPresent()).isFalse();
Await.until(() -> {
LOG.info("-------------> number of event: {}", TemplateListener.getEmits(tenant).size());
return TemplateListener.getEmits(tenant).size() == 2;
}, Duration.ofMillis(100), Duration.ofSeconds(5));
assertThat(TemplateListener.getEmits(tenant).stream().filter(r -> r.getType() == CrudEventType.CREATE).count()).isEqualTo(1L);
assertThat(TemplateListener.getEmits(tenant).stream().filter(r -> r.getType() == CrudEventType.DELETE).count()).isEqualTo(1L);
assertThat(TemplateListener.getEmits().size()).isEqualTo(2);
assertThat(TemplateListener.getEmits().stream().filter(r -> r.getType() == CrudEventType.CREATE).count()).isEqualTo(1L);
assertThat(TemplateListener.getEmits().stream().filter(r -> r.getType() == CrudEventType.DELETE).count()).isEqualTo(1L);
}
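With tenant filtering gone from the listener, the delete test relies on TemplateListener being reset before every test (the @BeforeAll to @BeforeEach switch above) and then asserts on the raw event list. A compressed sketch of that lifecycle, using only names from this file (the test name is illustrative):

@Test
void createThenDeleteEmitsTwoEvents() {
    // TemplateListener.reset() already ran in the @BeforeEach init() above
    Template template = builder().build();
    templateRepository.delete(templateRepository.create(template));

    assertThat(TemplateListener.getEmits()).hasSize(2);    // one CREATE event, one DELETE event
}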
@Singleton
public static class TemplateListener implements ApplicationEventListener<CrudEvent<Template>> {
private static List<CrudEvent<Template>> emits = new CopyOnWriteArrayList<>();
private static List<CrudEvent<Template>> emits = new ArrayList<>();
@Override
public void onApplicationEvent(CrudEvent<Template> event) {
//The instanceOf is required because Micronaut may send non Template event via this method
if ((event.getModel() != null && event.getModel() instanceof Template) ||
(event.getPreviousModel() != null && event.getPreviousModel() instanceof Template)) {
emits.add(event);
}
emits.add(event);
}
public static List<CrudEvent<Template>> getEmits(String tenantId){
return emits.stream()
.filter(e -> (e.getModel() != null && e.getModel().getTenantId().equals(tenantId)) ||
(e.getPreviousModel() != null && e.getPreviousModel().getTenantId().equals(tenantId)))
.toList();
public static List<CrudEvent<Template>> getEmits() {
return emits;
}
public static void reset() {


@@ -9,7 +9,6 @@ import io.kestra.core.models.flows.State;
import io.kestra.core.models.triggers.Trigger;
import io.kestra.core.repositories.ExecutionRepositoryInterface.ChildFilter;
import io.kestra.core.utils.IdUtils;
import io.kestra.core.utils.TestsUtils;
import io.micronaut.data.model.Pageable;
import io.micronaut.data.model.Sort;
import jakarta.inject.Inject;
@@ -25,6 +24,7 @@ import java.util.Optional;
import java.util.stream.Stream;
import static io.kestra.core.models.flows.FlowScope.USER;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertThrows;
@@ -35,9 +35,8 @@ public abstract class AbstractTriggerRepositoryTest {
@Inject
protected TriggerRepositoryInterface triggerRepository;
private static Trigger.TriggerBuilder<?, ?> trigger(String tenantId) {
private static Trigger.TriggerBuilder<?, ?> trigger() {
return Trigger.builder()
.tenantId(tenantId)
.flowId(IdUtils.create())
.namespace(TEST_NAMESPACE)
.triggerId(IdUtils.create())
@@ -45,9 +44,9 @@ public abstract class AbstractTriggerRepositoryTest {
.date(ZonedDateTime.now());
}
protected static Trigger generateDefaultTrigger(String tenantId){
protected static Trigger generateDefaultTrigger(){
Trigger trigger = Trigger.builder()
.tenantId(tenantId)
.tenantId(MAIN_TENANT)
.triggerId("triggerId")
.namespace("trigger.namespace")
.flowId("flowId")
@@ -60,10 +59,9 @@ public abstract class AbstractTriggerRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_all(QueryFilter filter){
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
triggerRepository.save(generateDefaultTrigger(tenant));
triggerRepository.save(generateDefaultTrigger());
ArrayListTotal<Trigger> entries = triggerRepository.find(Pageable.UNPAGED, tenant, List.of(filter));
ArrayListTotal<Trigger> entries = triggerRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter));
assertThat(entries).hasSize(1);
}
@@ -71,10 +69,9 @@ public abstract class AbstractTriggerRepositoryTest {
@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_all_async(QueryFilter filter){
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
triggerRepository.save(generateDefaultTrigger(tenant));
triggerRepository.save(generateDefaultTrigger());
List<Trigger> entries = triggerRepository.find(tenant, List.of(filter)).collectList().block();
List<Trigger> entries = triggerRepository.find(MAIN_TENANT, List.of(filter)).collectList().block();
assertThat(entries).hasSize(1);
}
@@ -95,7 +92,7 @@ public abstract class AbstractTriggerRepositoryTest {
@ParameterizedTest
@MethodSource("errorFilterCombinations")
void should_fail_to_find_all(QueryFilter filter){
assertThrows(InvalidQueryFiltersException.class, () -> triggerRepository.find(Pageable.UNPAGED, TestsUtils.randomTenant(this.getClass().getSimpleName()), List.of(filter)));
assertThrows(InvalidQueryFiltersException.class, () -> triggerRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter)));
}
static Stream<QueryFilter> errorFilterCombinations() {
@@ -113,8 +110,7 @@ public abstract class AbstractTriggerRepositoryTest {
@Test
void all() {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Trigger.TriggerBuilder<?, ?> builder = trigger(tenant);
Trigger.TriggerBuilder<?, ?> builder = trigger();
Optional<Trigger> findLast = triggerRepository.findLast(builder.build());
assertThat(findLast.isPresent()).isFalse();
@@ -134,47 +130,47 @@ public abstract class AbstractTriggerRepositoryTest {
assertThat(findLast.get().getExecutionId()).isEqualTo(save.getExecutionId());
triggerRepository.save(trigger(tenant).build());
triggerRepository.save(trigger(tenant).build());
Trigger searchedTrigger = trigger(tenant).build();
triggerRepository.save(trigger().build());
triggerRepository.save(trigger().build());
Trigger searchedTrigger = trigger().build();
triggerRepository.save(searchedTrigger);
List<Trigger> all = triggerRepository.findAllForAllTenants();
assertThat(all.size()).isGreaterThanOrEqualTo(4);
assertThat(all.size()).isEqualTo(4);
all = triggerRepository.findAll(tenant);
all = triggerRepository.findAll(null);
assertThat(all.size()).isEqualTo(4);
String namespacePrefix = "io.kestra.another";
String namespace = namespacePrefix + ".ns";
Trigger trigger = trigger(tenant).namespace(namespace).build();
Trigger trigger = trigger().namespace(namespace).build();
triggerRepository.save(trigger);
List<Trigger> find = triggerRepository.find(Pageable.from(1, 4, Sort.of(Sort.Order.asc("namespace"))), null, tenant, null, null, null);
List<Trigger> find = triggerRepository.find(Pageable.from(1, 4, Sort.of(Sort.Order.asc("namespace"))), null, null, null, null, null);
assertThat(find.size()).isEqualTo(4);
assertThat(find.getFirst().getNamespace()).isEqualTo(namespace);
find = triggerRepository.find(Pageable.from(1, 4, Sort.of(Sort.Order.asc("namespace"))), null, tenant, null, searchedTrigger.getFlowId(), null);
find = triggerRepository.find(Pageable.from(1, 4, Sort.of(Sort.Order.asc("namespace"))), null, null, null, searchedTrigger.getFlowId(), null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getFlowId()).isEqualTo(searchedTrigger.getFlowId());
find = triggerRepository.find(Pageable.from(1, 100, Sort.of(Sort.Order.asc(triggerRepository.sortMapping().apply("triggerId")))), null, tenant, namespacePrefix, null, null);
find = triggerRepository.find(Pageable.from(1, 100, Sort.of(Sort.Order.asc(triggerRepository.sortMapping().apply("triggerId")))), null, null, namespacePrefix, null, null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getTriggerId()).isEqualTo(trigger.getTriggerId());
// Full text search is on namespace, flowId, triggerId, executionId
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), trigger.getNamespace(), tenant, null, null, null);
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), trigger.getNamespace(), null, null, null, null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getTriggerId()).isEqualTo(trigger.getTriggerId());
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), searchedTrigger.getFlowId(), tenant, null, null, null);
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), searchedTrigger.getFlowId(), null, null, null, null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getTriggerId()).isEqualTo(searchedTrigger.getTriggerId());
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), searchedTrigger.getTriggerId(), tenant, null, null, null);
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), searchedTrigger.getTriggerId(), null, null, null, null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getTriggerId()).isEqualTo(searchedTrigger.getTriggerId());
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), searchedTrigger.getExecutionId(), tenant, null, null, null);
find = triggerRepository.find(Pageable.from(1, 100, Sort.UNSORTED), searchedTrigger.getExecutionId(), null, null, null, null);
assertThat(find.size()).isEqualTo(1);
assertThat(find.getFirst().getTriggerId()).isEqualTo(searchedTrigger.getTriggerId());
}
@@ -182,17 +178,15 @@ public abstract class AbstractTriggerRepositoryTest {
@Test
void shouldCountForNullTenant() {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
triggerRepository.save(Trigger
.builder()
.tenantId(tenant)
.triggerId(IdUtils.create())
.flowId(IdUtils.create())
.namespace("io.kestra.unittest")
.build()
);
// When
int count = triggerRepository.count(tenant);
int count = triggerRepository.count(null);
// Then
assertThat(count).isEqualTo(1);
}
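For triggers the tests move to the shared MAIN_TENANT constant (statically imported from TenantService) rather than a null tenant. Since the diff splits the new should_find_all body across interleaved lines, here it is reassembled as one unit; nothing below is new code:

@ParameterizedTest
@MethodSource("filterCombinations")
void should_find_all(QueryFilter filter) {
    triggerRepository.save(generateDefaultTrigger());        // the trigger is created under MAIN_TENANT

    ArrayListTotal<Trigger> entries =
        triggerRepository.find(Pageable.UNPAGED, MAIN_TENANT, List.of(filter));

    assertThat(entries).hasSize(1);
}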


@@ -1,92 +1,88 @@
package io.kestra.core.repositories;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import com.google.common.collect.ImmutableMap;
import io.kestra.core.models.executions.*;
import io.kestra.core.models.flows.State;
import io.kestra.core.utils.IdUtils;
import java.time.Duration;
import java.util.Collections;
class ExecutionFixture {
public static Execution EXECUTION_1(String tenant) {
return Execution.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.tenantId(tenant)
.flowId("full")
.flowRevision(1)
.state(new State())
.inputs(ImmutableMap.of("test", "value"))
.taskRunList(Collections.singletonList(
TaskRun.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.flowId("full")
.state(new State())
.attempts(Collections.singletonList(
TaskRunAttempt.builder()
.build()
))
.outputs(Variables.inMemory(ImmutableMap.of(
"out", "value"
)))
.build()
))
.build();
}
public static final Execution EXECUTION_1 = Execution.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.tenantId(MAIN_TENANT)
.flowId("full")
.flowRevision(1)
.state(new State())
.inputs(ImmutableMap.of("test", "value"))
.taskRunList(Collections.singletonList(
TaskRun.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.flowId("full")
.state(new State())
.attempts(Collections.singletonList(
TaskRunAttempt.builder()
.build()
))
.outputs(Variables.inMemory(ImmutableMap.of(
"out", "value"
)))
.build()
))
.build();
public static Execution EXECUTION_2(String tenant) {
return Execution.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.tenantId(tenant)
.flowId("full")
.flowRevision(1)
.state(new State())
.inputs(ImmutableMap.of("test", 1))
.taskRunList(Collections.singletonList(
TaskRun.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.flowId("full")
.state(new State())
.attempts(Collections.singletonList(
TaskRunAttempt.builder()
.build()
))
.outputs(Variables.inMemory(ImmutableMap.of(
"out", 1
)))
.build()
))
.build();
}
public static final Execution EXECUTION_2 = Execution.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.tenantId(MAIN_TENANT)
.flowId("full")
.flowRevision(1)
.state(new State())
.inputs(ImmutableMap.of("test", 1))
.taskRunList(Collections.singletonList(
TaskRun.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.flowId("full")
.state(new State())
.attempts(Collections.singletonList(
TaskRunAttempt.builder()
.build()
))
.outputs(Variables.inMemory(ImmutableMap.of(
"out", 1
)))
.build()
))
.build();
public static Execution EXECUTION_TEST(String tenant) {
return Execution.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.tenantId(tenant)
.flowId("full")
.flowRevision(1)
.state(new State())
.inputs(ImmutableMap.of("test", 1))
.kind(ExecutionKind.TEST)
.taskRunList(Collections.singletonList(
TaskRun.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.flowId("full")
.state(new State())
.attempts(Collections.singletonList(
TaskRunAttempt.builder()
.build()
))
.outputs(Variables.inMemory(ImmutableMap.of(
"out", 1
)))
.build()
))
.build();
}
}
public static final Execution EXECUTION_TEST = Execution.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.flowId("full")
.flowRevision(1)
.state(new State())
.inputs(ImmutableMap.of("test", 1))
.kind(ExecutionKind.TEST)
.taskRunList(Collections.singletonList(
TaskRun.builder()
.id(IdUtils.create())
.namespace("io.kestra.unittest")
.flowId("full")
.state(new State())
.attempts(Collections.singletonList(
TaskRunAttempt.builder()
.build()
))
.outputs(Variables.inMemory(ImmutableMap.of(
"out", 1
)))
.build()
))
.build();
}
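A side effect of turning the tenant-parameterized factory methods into static final constants: field initializers run once at class load, so a constant fixture keeps the same IdUtils.create() id for every test that uses it, while the removed EXECUTION_1(tenant) style produced a fresh id per call. A small, purely illustrative class (only IdUtils comes from the code above; the class name is hypothetical):

final class FixtureIdDemo {
    static final String FROZEN_ID = IdUtils.create();        // evaluated once, shared by every test

    static String freshId() {
        return IdUtils.create();                              // evaluated again on every call
    }

    private FixtureIdDemo() {
    }
}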


@@ -1,6 +1,10 @@
package io.kestra.core.runners;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import io.kestra.core.junit.annotations.ExecuteFlow;
import io.kestra.core.junit.annotations.FlakyTest;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.junit.annotations.LoadFlows;
import io.kestra.core.models.executions.Execution;
@@ -9,36 +13,36 @@ import io.kestra.core.models.flows.State;
import io.kestra.core.queues.QueueException;
import io.kestra.core.queues.QueueFactoryInterface;
import io.kestra.core.queues.QueueInterface;
import io.kestra.plugin.core.flow.*;
import io.kestra.plugin.core.flow.EachSequentialTest;
import io.kestra.plugin.core.flow.FlowCaseTest;
import io.kestra.plugin.core.flow.ForEachItemCaseTest;
import io.kestra.plugin.core.flow.PauseTest;
import io.kestra.plugin.core.flow.LoopUntilCaseTest;
import io.kestra.plugin.core.flow.WorkingDirectoryTest;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestInstance;
import java.util.Map;
import java.util.concurrent.TimeoutException;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestInstance;
import org.junitpioneer.jupiter.RetryingTest;
@KestraTest(startRunner = true)
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
//@org.junit.jupiter.api.parallel.Execution(org.junit.jupiter.api.parallel.ExecutionMode.CONCURRENT)
// must be per-class so that init(), which takes a lot of time, is called only once
public abstract class AbstractRunnerTest {
public static final String TENANT_1 = "tenant1";
public static final String TENANT_2 = "tenant2";
@Inject
protected TestRunnerUtils runnerUtils;
protected RunnerUtils runnerUtils;
@Inject
@Named(QueueFactoryInterface.WORKERTASKLOG_NAMED)
protected QueueInterface<LogEntry> logsQueue;
@Inject
protected RestartCaseTest restartCaseTest;
private RestartCaseTest restartCaseTest;
@Inject
protected FlowTriggerCaseTest flowTriggerCaseTest;
@@ -50,13 +54,13 @@ public abstract class AbstractRunnerTest {
private PluginDefaultsCaseTest pluginDefaultsCaseTest;
@Inject
protected FlowCaseTest flowCaseTest;
private FlowCaseTest flowCaseTest;
@Inject
private WorkingDirectoryTest.Suite workingDirectoryTest;
@Inject
protected PauseTest.Suite pauseTest;
private PauseTest.Suite pauseTest;
@Inject
private SkipExecutionCaseTest skipExecutionCaseTest;
@@ -68,10 +72,10 @@ public abstract class AbstractRunnerTest {
protected LoopUntilCaseTest loopUntilTestCaseTest;
@Inject
protected FlowConcurrencyCaseTest flowConcurrencyCaseTest;
private FlowConcurrencyCaseTest flowConcurrencyCaseTest;
@Inject
protected ScheduleDateCaseTest scheduleDateCaseTest;
private ScheduleDateCaseTest scheduleDateCaseTest;
@Inject
protected FlowInputOutput flowIO;
@@ -80,7 +84,7 @@ public abstract class AbstractRunnerTest {
private SLATestCase slaTestCase;
@Inject
protected ChangeStateTestCase changeStateTestCase;
private ChangeStateTestCase changeStateTestCase;
@Inject
private AfterExecutionTestCase afterExecutionTestCase;
@@ -111,7 +115,7 @@ public abstract class AbstractRunnerTest {
assertThat(execution.getTaskRunList()).hasSize(8);
}
@Test
@RetryingTest(5)
@ExecuteFlow("flows/valids/parallel-nested.yaml")
void parallelNested(Execution execution) {
assertThat(execution.getTaskRunList()).hasSize(11);
@@ -153,27 +157,27 @@ public abstract class AbstractRunnerTest {
restartCaseTest.restartFailedThenSuccess();
}
@Test
@RetryingTest(5)
@LoadFlows({"flows/valids/restart-each.yaml"})
void replay() throws Exception {
restartCaseTest.replay();
}
@Test
@RetryingTest(5)
@LoadFlows({"flows/valids/failed-first.yaml"})
void restartMultiple() throws Exception {
restartCaseTest.restartMultiple();
}
@Test
@RetryingTest(5) // Flaky on CI but never locally even with 100 repetitions
@LoadFlows({"flows/valids/restart_always_failed.yaml"})
void restartFailedThenFailureWithGlobalErrors() throws Exception {
restartCaseTest.restartFailedThenFailureWithGlobalErrors();
}
@Test
@RetryingTest(5)
@LoadFlows({"flows/valids/restart_local_errors.yaml"})
protected void restartFailedThenFailureWithLocalErrors() throws Exception {
void restartFailedThenFailureWithLocalErrors() throws Exception {
restartCaseTest.restartFailedThenFailureWithLocalErrors();
}
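The recurring change in this runner test class is mechanical: tests that were only flagged as flaky in a comment (or temporarily disabled) now use junit-pioneer's @RetryingTest, and @LoadFlows/@ExecuteFlow lose their tenantId attribute because flows load into the default tenant. A sketch of the two shapes side by side (the "before" comment lines combine patterns seen on several tests above, not literally the replay() test):

// before, on several tests above:
//   @Test                                      // flaky on CI but never fails locally
//   @LoadFlows(value = {"flows/valids/restart-each.yaml"}, tenantId = TENANT_1)
//
// after:
@RetryingTest(5)                                // junit-pioneer: re-run up to 5 times before failing
@LoadFlows({"flows/valids/restart-each.yaml"})
void replay() throws Exception {
    restartCaseTest.replay();
}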
@@ -195,27 +199,29 @@ public abstract class AbstractRunnerTest {
restartCaseTest.restartFailedWithAfterExecution();
}
@Test
@LoadFlows(value = {"flows/valids/trigger-flow-listener-no-inputs.yaml",
@RetryingTest(5)
@LoadFlows({"flows/valids/trigger-flow-listener-no-inputs.yaml",
"flows/valids/trigger-flow-listener.yaml",
"flows/valids/trigger-flow-listener-namespace-condition.yaml",
"flows/valids/trigger-flow.yaml"}, tenantId = "listener-tenant")
"flows/valids/trigger-flow.yaml"})
void flowTrigger() throws Exception {
flowTriggerCaseTest.trigger("listener-tenant");
flowTriggerCaseTest.trigger();
}
@Test // flaky on CI but never fail locally
@RetryingTest(5) // flaky on CI but never fail locally
@LoadFlows({"flows/valids/trigger-flow-listener-with-pause.yaml",
"flows/valids/trigger-flow-with-pause.yaml"})
void flowTriggerWithPause() throws Exception {
flowTriggerCaseTest.triggerWithPause();
}
@FlakyTest
@Disabled
@Test
@LoadFlows(value = {"flows/valids/trigger-flow-listener-with-concurrency-limit.yaml",
"flows/valids/trigger-flow-with-concurrency-limit.yaml"}, tenantId = "trigger-tenant")
@LoadFlows({"flows/valids/trigger-flow-listener-with-concurrency-limit.yaml",
"flows/valids/trigger-flow-with-concurrency-limit.yaml"})
void flowTriggerWithConcurrencyLimit() throws Exception {
flowTriggerCaseTest.triggerWithConcurrencyLimit("trigger-tenant");
flowTriggerCaseTest.triggerWithConcurrencyLimit();
}
@Test
@@ -226,12 +232,12 @@ public abstract class AbstractRunnerTest {
multipleConditionTriggerCaseTest.trigger();
}
@Test // Flaky on CI but never locally even with 100 repetitions
@LoadFlows(value = {"flows/valids/trigger-flow-listener-namespace-condition.yaml",
@RetryingTest(5) // Flaky on CI but never locally even with 100 repetitions
@LoadFlows({"flows/valids/trigger-flow-listener-namespace-condition.yaml",
"flows/valids/trigger-multiplecondition-flow-c.yaml",
"flows/valids/trigger-multiplecondition-flow-d.yaml"}, tenantId = "condition-tenant")
"flows/valids/trigger-multiplecondition-flow-d.yaml"})
void multipleConditionTriggerFailed() throws Exception {
multipleConditionTriggerCaseTest.failed("condition-tenant");
multipleConditionTriggerCaseTest.failed();
}
@Test
@@ -242,13 +248,12 @@ public abstract class AbstractRunnerTest {
multipleConditionTriggerCaseTest.flowTriggerPreconditions();
}
@Disabled
@Test
@LoadFlows(value = {"flows/valids/flow-trigger-preconditions-flow-listen.yaml",
@LoadFlows({"flows/valids/flow-trigger-preconditions-flow-listen.yaml",
"flows/valids/flow-trigger-preconditions-flow-a.yaml",
"flows/valids/flow-trigger-preconditions-flow-b.yaml"}, tenantId = TENANT_1)
"flows/valids/flow-trigger-preconditions-flow-b.yaml"})
void flowTriggerPreconditionsMergeOutputs() throws Exception {
multipleConditionTriggerCaseTest.flowTriggerPreconditionsMergeOutputs(TENANT_1);
multipleConditionTriggerCaseTest.flowTriggerPreconditionsMergeOutputs();
}
@Test
@@ -257,7 +262,7 @@ public abstract class AbstractRunnerTest {
multipleConditionTriggerCaseTest.flowTriggerOnPaused();
}
@Test
@RetryingTest(5)
@LoadFlows({"flows/valids/each-null.yaml"})
void eachWithNull() throws Exception {
EachSequentialTest.eachNullTest(runnerUtils, logsQueue);
@@ -269,7 +274,7 @@ public abstract class AbstractRunnerTest {
pluginDefaultsCaseTest.taskDefaults();
}
@Test
@RetryingTest(5)
@LoadFlows({"flows/valids/switch.yaml",
"flows/valids/task-flow.yaml",
"flows/valids/task-flow-inherited-labels.yaml"})
@@ -278,19 +283,19 @@ public abstract class AbstractRunnerTest {
}
@Test
@LoadFlows(value = {"flows/valids/switch.yaml",
@LoadFlows({"flows/valids/switch.yaml",
"flows/valids/task-flow.yaml",
"flows/valids/task-flow-inherited-labels.yaml"}, tenantId = TENANT_1)
"flows/valids/task-flow-inherited-labels.yaml"})
void flowWaitFailed() throws Exception {
flowCaseTest.waitFailed(TENANT_1);
flowCaseTest.waitFailed();
}
@Test
@LoadFlows(value = {"flows/valids/switch.yaml",
@LoadFlows({"flows/valids/switch.yaml",
"flows/valids/task-flow.yaml",
"flows/valids/task-flow-inherited-labels.yaml"}, tenantId = TENANT_2)
"flows/valids/task-flow-inherited-labels.yaml"})
public void invalidOutputs() throws Exception {
flowCaseTest.invalidOutputs(TENANT_2);
flowCaseTest.invalidOutputs();
}
@Test
@@ -300,9 +305,9 @@ public abstract class AbstractRunnerTest {
}
@Test
@LoadFlows(value = {"flows/valids/working-directory.yaml"}, tenantId = TENANT_1)
@LoadFlows({"flows/valids/working-directory.yaml"})
public void workerFailed() throws Exception {
workingDirectoryTest.failed(TENANT_1, runnerUtils);
workingDirectoryTest.failed(runnerUtils);
}
@Test
@@ -317,7 +322,7 @@ public abstract class AbstractRunnerTest {
workingDirectoryTest.cache(runnerUtils);
}
@Test // flaky on MySQL
@RetryingTest(5) // flaky on MySQL
@LoadFlows({"flows/valids/pause-test.yaml"})
public void pauseRun() throws Exception {
pauseTest.run(runnerUtils);
@@ -353,44 +358,44 @@ public abstract class AbstractRunnerTest {
skipExecutionCaseTest.skipExecution();
}
@Test
@RetryingTest(5)
@LoadFlows({"flows/valids/for-each-item-subflow.yaml",
"flows/valids/for-each-item.yaml"})
protected void forEachItem() throws Exception {
forEachItemCaseTest.forEachItem();
}
@Test
@LoadFlows(value = {"flows/valids/for-each-item.yaml"}, tenantId = TENANT_1)
@RetryingTest(5)
@LoadFlows({"flows/valids/for-each-item.yaml"})
protected void forEachItemEmptyItems() throws Exception {
forEachItemCaseTest.forEachItemEmptyItems(TENANT_1);
forEachItemCaseTest.forEachItemEmptyItems();
}
@Test
@RetryingTest(5)
@LoadFlows({"flows/valids/for-each-item-subflow-failed.yaml",
"flows/valids/for-each-item-failed.yaml"})
protected void forEachItemFailed() throws Exception {
forEachItemCaseTest.forEachItemFailed();
}
@Test
@RetryingTest(5)
@LoadFlows({"flows/valids/for-each-item-outputs-subflow.yaml",
"flows/valids/for-each-item-outputs.yaml"})
protected void forEachItemSubflowOutputs() throws Exception {
forEachItemCaseTest.forEachItemWithSubflowOutputs();
}
@Test // flaky on CI but always pass locally even with 100 iterations
@LoadFlows(value = {"flows/valids/restart-for-each-item.yaml", "flows/valids/restart-child.yaml"}, tenantId = TENANT_1)
@RetryingTest(5) // flaky on CI but always pass locally even with 100 iterations
@LoadFlows({"flows/valids/restart-for-each-item.yaml", "flows/valids/restart-child.yaml"})
void restartForEachItem() throws Exception {
forEachItemCaseTest.restartForEachItem(TENANT_1);
forEachItemCaseTest.restartForEachItem();
}
@Test
@LoadFlows(value = {"flows/valids/for-each-item-subflow.yaml",
"flows/valids/for-each-item-in-if.yaml"}, tenantId = TENANT_1)
@RetryingTest(5)
@LoadFlows({"flows/valids/for-each-item-subflow.yaml",
"flows/valids/for-each-item-in-if.yaml"})
protected void forEachItemInIf() throws Exception {
forEachItemCaseTest.forEachItemInIf(TENANT_1);
forEachItemCaseTest.forEachItemInIf();
}
@Test
@@ -431,9 +436,9 @@ public abstract class AbstractRunnerTest {
}
@Test
@LoadFlows(value = {"flows/valids/flow-concurrency-for-each-item.yaml", "flows/valids/flow-concurrency-queue.yml"}, tenantId = TENANT_1)
@LoadFlows({"flows/valids/flow-concurrency-for-each-item.yaml", "flows/valids/flow-concurrency-queue.yml"})
protected void flowConcurrencyWithForEachItem() throws Exception {
flowConcurrencyCaseTest.flowConcurrencyWithForEachItem(TENANT_1);
flowConcurrencyCaseTest.flowConcurrencyWithForEachItem();
}
@Test
@@ -449,9 +454,9 @@ public abstract class AbstractRunnerTest {
}
@Test
@LoadFlows(value = {"flows/valids/flow-concurrency-subflow.yml", "flows/valids/flow-concurrency-cancel.yml"}, tenantId = TENANT_1)
@LoadFlows({"flows/valids/flow-concurrency-subflow.yml", "flows/valids/flow-concurrency-cancel.yml"})
void flowConcurrencySubflow() throws Exception {
flowConcurrencyCaseTest.flowConcurrencySubflow(TENANT_1);
flowConcurrencyCaseTest.flowConcurrencySubflow();
}
@Test
@@ -506,9 +511,9 @@ public abstract class AbstractRunnerTest {
}
@Test
@LoadFlows(value = {"flows/valids/minimal.yaml"}, tenantId = TENANT_1)
@LoadFlows({"flows/valids/minimal.yaml"})
void shouldScheduleOnDate() throws Exception {
scheduleDateCaseTest.shouldScheduleOnDate(TENANT_1);
scheduleDateCaseTest.shouldScheduleOnDate();
}
@Test
@@ -530,15 +535,15 @@ public abstract class AbstractRunnerTest {
}
@Test
@LoadFlows(value = {"flows/valids/sla-execution-condition.yaml"}, tenantId = TENANT_1)
@LoadFlows({"flows/valids/sla-execution-condition.yaml"})
void executionConditionSLAShouldCancel() throws Exception {
slaTestCase.executionConditionSLAShouldCancel(TENANT_1);
slaTestCase.executionConditionSLAShouldCancel();
}
@Test
@LoadFlows(value = {"flows/valids/sla-execution-condition.yaml"}, tenantId = TENANT_2)
@LoadFlows({"flows/valids/sla-execution-condition.yaml"})
void executionConditionSLAShouldLabel() throws Exception {
slaTestCase.executionConditionSLAShouldLabel(TENANT_2);
slaTestCase.executionConditionSLAShouldLabel();
}
@Test
@@ -558,15 +563,15 @@ public abstract class AbstractRunnerTest {
}
@Test
@ExecuteFlow(value = "flows/valids/failed-first.yaml", tenantId = TENANT_1)
@ExecuteFlow("flows/valids/failed-first.yaml")
public void changeStateShouldEndsInSuccess(Execution execution) throws Exception {
changeStateTestCase.changeStateShouldEndsInSuccess(execution);
}
@Test
@LoadFlows(value = {"flows/valids/failed-first.yaml", "flows/valids/subflow-parent-of-failed.yaml"}, tenantId = TENANT_2)
@LoadFlows({"flows/valids/failed-first.yaml", "flows/valids/subflow-parent-of-failed.yaml"})
public void changeStateInSubflowShouldEndsParentFlowInSuccess() throws Exception {
changeStateTestCase.changeStateInSubflowShouldEndsParentFlowInSuccess(TENANT_2);
changeStateTestCase.changeStateInSubflowShouldEndsParentFlowInSuccess();
}
@Test


@@ -3,18 +3,25 @@ package io.kestra.core.runners;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.State;
import io.kestra.core.models.flows.State.Type;
import io.kestra.core.queues.QueueFactoryInterface;
import io.kestra.core.queues.QueueInterface;
import io.kestra.core.repositories.FlowRepositoryInterface;
import io.kestra.core.services.ExecutionService;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import jakarta.inject.Singleton;
import reactor.core.publisher.Flux;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
@Singleton
public class ChangeStateTestCase {
public static final String NAMESPACE = "io.kestra.tests";
@Inject
private FlowRepositoryInterface flowRepository;
@@ -22,7 +29,11 @@ public class ChangeStateTestCase {
private ExecutionService executionService;
@Inject
private TestRunnerUtils runnerUtils;
@Named(QueueFactoryInterface.EXECUTION_NAMED)
private QueueInterface<Execution> executionQueue;
@Inject
private RunnerUtils runnerUtils;
public void changeStateShouldEndsInSuccess(Execution execution) throws Exception {
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.FAILED);
@@ -30,40 +41,73 @@ public class ChangeStateTestCase {
assertThat(execution.getTaskRunList().getFirst().getState().getCurrent()).isEqualTo(State.Type.FAILED);
// await for the last execution
CountDownLatch latch = new CountDownLatch(1);
AtomicReference<Execution> lastExecution = new AtomicReference<>();
Flux<Execution> receivedExecutions = TestsUtils.receive(executionQueue, either -> {
Execution exec = either.getLeft();
if (execution.getId().equals(exec.getId()) && exec.getState().getCurrent() == State.Type.SUCCESS) {
lastExecution.set(exec);
latch.countDown();
}
});
Flow flow = flowRepository.findByExecution(execution);
Execution markedAs = executionService.markAs(execution, flow, execution.getTaskRunList().getFirst().getId(), State.Type.SUCCESS);
Execution lastExecution = runnerUtils.emitAndAwaitExecution(e -> e.getState().getCurrent().equals(Type.SUCCESS), markedAs);
executionQueue.emit(markedAs);
assertThat(lastExecution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(lastExecution.getTaskRunList()).hasSize(2);
assertThat(lastExecution.getTaskRunList().getFirst().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(latch.await(10, TimeUnit.SECONDS)).isTrue();
receivedExecutions.blockLast();
assertThat(lastExecution.get().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(lastExecution.get().getTaskRunList()).hasSize(2);
assertThat(lastExecution.get().getTaskRunList().getFirst().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
}
public void changeStateInSubflowShouldEndsParentFlowInSuccess(String tenantId) throws Exception {
public void changeStateInSubflowShouldEndsParentFlowInSuccess() throws Exception {
// await for the subflow execution
CountDownLatch latch = new CountDownLatch(1);
AtomicReference<Execution> lastExecution = new AtomicReference<>();
Flux<Execution> receivedExecutions = TestsUtils.receive(executionQueue, either -> {
Execution exec = either.getLeft();
if ("failed-first".equals(exec.getFlowId()) && exec.getState().getCurrent() == State.Type.FAILED) {
lastExecution.set(exec);
latch.countDown();
}
});
// run the parent flow
Execution execution = runnerUtils.runOne(tenantId, NAMESPACE, "subflow-parent-of-failed");
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "subflow-parent-of-failed");
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.FAILED);
assertThat(execution.getTaskRunList()).hasSize(1);
assertThat(execution.getTaskRunList().getFirst().getState().getCurrent()).isEqualTo(State.Type.FAILED);
// assert on the subflow
Execution lastExecution = runnerUtils.awaitFlowExecution(e -> e.getState().getCurrent().equals(Type.FAILED), tenantId, NAMESPACE, "failed-first");
assertThat(lastExecution.getState().getCurrent()).isEqualTo(State.Type.FAILED);
assertThat(lastExecution.getTaskRunList()).hasSize(1);
assertThat(lastExecution.getTaskRunList().getFirst().getState().getCurrent()).isEqualTo(State.Type.FAILED);
assertThat(latch.await(10, TimeUnit.SECONDS)).isTrue();
receivedExecutions.blockLast();
assertThat(lastExecution.get().getState().getCurrent()).isEqualTo(State.Type.FAILED);
assertThat(lastExecution.get().getTaskRunList()).hasSize(1);
assertThat(lastExecution.get().getTaskRunList().getFirst().getState().getCurrent()).isEqualTo(State.Type.FAILED);
// await for the parent execution
CountDownLatch parentLatch = new CountDownLatch(1);
AtomicReference<Execution> lastParentExecution = new AtomicReference<>();
receivedExecutions = TestsUtils.receive(executionQueue, either -> {
Execution exec = either.getLeft();
if (execution.getId().equals(exec.getId()) && exec.getState().isTerminated()) {
lastParentExecution.set(exec);
parentLatch.countDown();
}
});
// restart the subflow
Flow flow = flowRepository.findByExecution(lastExecution);
Execution markedAs = executionService.markAs(lastExecution, flow, lastExecution.getTaskRunList().getFirst().getId(), State.Type.SUCCESS);
runnerUtils.emitAndAwaitExecution(e -> e.getState().isTerminated(), markedAs);
//We wait for the subflow execution to pass from failed to success
Execution lastParentExecution = runnerUtils.awaitFlowExecution(e ->
e.getTaskRunList().getFirst().getState().getCurrent().equals(Type.SUCCESS), tenantId, NAMESPACE, "subflow-parent-of-failed");
Flow flow = flowRepository.findByExecution(lastExecution.get());
Execution markedAs = executionService.markAs(lastExecution.get(), flow, lastExecution.get().getTaskRunList().getFirst().getId(), State.Type.SUCCESS);
executionQueue.emit(markedAs);
// assert for the parent flow
assertThat(lastParentExecution.getState().getCurrent()).isEqualTo(State.Type.FAILED); // FIXME should be success but it's FAILED on unit tests
assertThat(lastParentExecution.getTaskRunList()).hasSize(1);
assertThat(lastParentExecution.getTaskRunList().getFirst().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(parentLatch.await(10, TimeUnit.SECONDS)).isTrue();
receivedExecutions.blockLast();
assertThat(lastParentExecution.get().getState().getCurrent()).isEqualTo(State.Type.FAILED); // FIXME should be success but it's FAILED on unit tests
assertThat(lastParentExecution.get().getTaskRunList()).hasSize(1);
assertThat(lastParentExecution.get().getTaskRunList().getFirst().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
}
}
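The old runnerUtils.emitAndAwaitExecution / awaitFlowExecution helpers are replaced by a queue-based pattern that the diff interleaves with the removed lines, so here it is on its own as a sketch (execution and markedAs refer to the variables of the method above; the 10-second timeout mirrors the code): register the consumer first, then emit, then wait on the latch and stop the Flux.

CountDownLatch latch = new CountDownLatch(1);
AtomicReference<Execution> last = new AtomicReference<>();

// 1. subscribe to the execution queue before emitting, so no state transition is missed
Flux<Execution> received = TestsUtils.receive(executionQueue, either -> {
    Execution exec = either.getLeft();
    if (execution.getId().equals(exec.getId()) && exec.getState().getCurrent() == State.Type.SUCCESS) {
        last.set(exec);
        latch.countDown();
    }
});

// 2. emit the execution that was marked as SUCCESS
executionQueue.emit(markedAs);

// 3. wait for the expected terminal state, then stop consuming
assertThat(latch.await(10, TimeUnit.SECONDS)).isTrue();
received.blockLast();
assertThat(last.get().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);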


@@ -18,7 +18,7 @@ import static org.assertj.core.api.Assertions.assertThat;
public class EmptyVariablesTest {
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
@Inject
private FlowInputOutput flowIO;


@@ -20,7 +20,6 @@ import io.kestra.plugin.core.debug.Return;
import io.kestra.plugin.core.flow.Pause;
import jakarta.inject.Inject;
import lombok.extern.slf4j.Slf4j;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
import org.junitpioneer.jupiter.RetryingTest;
import org.slf4j.event.Level;
@@ -41,10 +40,6 @@ import static org.junit.jupiter.api.Assertions.assertThrows;
@Slf4j
@KestraTest(startRunner = true)
class ExecutionServiceTest {
public static final String TENANT_1 = "tenant1";
public static final String TENANT_2 = "tenant2";
public static final String TENANT_3 = "tenant3";
@Inject
ExecutionService executionService;
@@ -58,7 +53,7 @@ class ExecutionServiceTest {
LogRepositoryInterface logRepository;
@Inject
TestRunnerUtils runnerUtils;
RunnerUtils runnerUtils;
@Test
@LoadFlows({"flows/valids/restart_last_failed.yaml"})
@@ -80,13 +75,13 @@ class ExecutionServiceTest {
}
@Test
@LoadFlows(value = {"flows/valids/restart_last_failed.yaml"}, tenantId = TENANT_1)
@LoadFlows({"flows/valids/restart_last_failed.yaml"})
void restartSimpleRevision() throws Exception {
Execution execution = runnerUtils.runOne(TENANT_1, "io.kestra.tests", "restart_last_failed");
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "restart_last_failed");
assertThat(execution.getTaskRunList()).hasSize(3);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.FAILED);
FlowWithSource flow = flowRepository.findByIdWithSource(TENANT_1, "io.kestra.tests", "restart_last_failed").orElseThrow();
FlowWithSource flow = flowRepository.findByIdWithSource(MAIN_TENANT, "io.kestra.tests", "restart_last_failed").orElseThrow();
flowRepository.update(
GenericFlow.of(flow),
flow.updateTask(
@@ -129,9 +124,9 @@ class ExecutionServiceTest {
}
@RetryingTest(5)
@LoadFlows(value = {"flows/valids/restart-each.yaml"}, tenantId = TENANT_1)
@LoadFlows({"flows/valids/restart-each.yaml"})
void restartFlowable2() throws Exception {
Execution execution = runnerUtils.runOne(TENANT_1, "io.kestra.tests", "restart-each", null, (f, e) -> ImmutableMap.of("failed", "SECOND"));
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "restart-each", null, (f, e) -> ImmutableMap.of("failed", "SECOND"));
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.FAILED);
Execution restart = executionService.restart(execution, null);
@@ -182,9 +177,9 @@ class ExecutionServiceTest {
}
@Test
@LoadFlows(value = {"flows/valids/logs.yaml"}, tenantId = TENANT_1)
@LoadFlows({"flows/valids/logs.yaml"})
void replaySimple() throws Exception {
Execution execution = runnerUtils.runOne(TENANT_1, "io.kestra.tests", "logs");
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "logs");
assertThat(execution.getTaskRunList()).hasSize(5);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
@@ -201,9 +196,9 @@ class ExecutionServiceTest {
}
@Test
@LoadFlows(value = {"flows/valids/restart-each.yaml"}, tenantId = TENANT_2)
@LoadFlows({"flows/valids/restart-each.yaml"})
void replayFlowable() throws Exception {
Execution execution = runnerUtils.runOne(TENANT_2, "io.kestra.tests", "restart-each", null, (f, e) -> ImmutableMap.of("failed", "NO"));
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "restart-each", null, (f, e) -> ImmutableMap.of("failed", "NO"));
assertThat(execution.getTaskRunList()).hasSize(20);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
@@ -218,7 +213,6 @@ class ExecutionServiceTest {
assertThat(restart.getLabels()).contains(new Label(Label.REPLAY, "true"));
}
@Disabled
@Test
@LoadFlows({"flows/valids/parallel-nested.yaml"})
void replayParallel() throws Exception {
@@ -240,7 +234,7 @@ class ExecutionServiceTest {
}
@Test
@ExecuteFlow(value = "flows/valids/each-sequential-nested.yaml", tenantId = TENANT_2)
@ExecuteFlow("flows/valids/each-sequential-nested.yaml")
void replayEachSeq(Execution execution) throws Exception {
assertThat(execution.getTaskRunList()).hasSize(23);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
@@ -259,7 +253,7 @@ class ExecutionServiceTest {
}
@Test
@ExecuteFlow(value = "flows/valids/each-sequential-nested.yaml", tenantId = TENANT_1)
@ExecuteFlow("flows/valids/each-sequential-nested.yaml")
void replayEachSeq2(Execution execution) throws Exception {
assertThat(execution.getTaskRunList()).hasSize(23);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
@@ -318,9 +312,9 @@ class ExecutionServiceTest {
}
@Test
@LoadFlows(value = {"flows/valids/each-parallel-nested.yaml"}, tenantId = TENANT_1)
@LoadFlows({"flows/valids/each-parallel-nested.yaml"})
void markAsEachPara() throws Exception {
Execution execution = runnerUtils.runOne(TENANT_1, "io.kestra.tests", "each-parallel-nested");
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "each-parallel-nested");
Flow flow = flowRepository.findByExecution(execution);
assertThat(execution.getTaskRunList()).hasSize(11);
@@ -370,9 +364,9 @@ class ExecutionServiceTest {
}
@Test
@LoadFlows(value = {"flows/valids/pause-test.yaml"}, tenantId = TENANT_1)
@LoadFlows({"flows/valids/pause-test.yaml"})
void resumePausedToKilling() throws Exception {
Execution execution = runnerUtils.runOneUntilPaused(TENANT_1, "io.kestra.tests", "pause-test");
Execution execution = runnerUtils.runOneUntilPaused(MAIN_TENANT, "io.kestra.tests", "pause-test");
Flow flow = flowRepository.findByExecution(execution);
assertThat(execution.getTaskRunList()).hasSize(1);
@@ -385,7 +379,7 @@ class ExecutionServiceTest {
}
@Test
@ExecuteFlow(value = "flows/valids/logs.yaml", tenantId = TENANT_2)
@ExecuteFlow("flows/valids/logs.yaml")
void deleteExecution(Execution execution) throws IOException, TimeoutException {
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
Await.until(() -> logRepository.findByExecutionId(execution.getTenantId(), execution.getId(), Level.TRACE).size() == 5, Duration.ofMillis(10), Duration.ofSeconds(5));
@@ -397,7 +391,7 @@ class ExecutionServiceTest {
}
@Test
@ExecuteFlow(value = "flows/valids/logs.yaml", tenantId = TENANT_3)
@ExecuteFlow("flows/valids/logs.yaml")
void deleteExecutionKeepLogs(Execution execution) throws IOException, TimeoutException {
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
Await.until(() -> logRepository.findByExecutionId(execution.getTenantId(), execution.getId(), Level.TRACE).size() == 5, Duration.ofMillis(10), Duration.ofSeconds(5));
@@ -437,9 +431,9 @@ class ExecutionServiceTest {
}
@Test
@LoadFlows(value = {"flows/valids/pause_no_tasks.yaml"}, tenantId = TENANT_1)
@LoadFlows({"flows/valids/pause_no_tasks.yaml"})
void killToState() throws Exception {
Execution execution = runnerUtils.runOneUntilPaused(TENANT_1, "io.kestra.tests", "pause_no_tasks");
Execution execution = runnerUtils.runOneUntilPaused(MAIN_TENANT, "io.kestra.tests", "pause_no_tasks");
Flow flow = flowRepository.findByExecution(execution);
Execution killed = executionService.kill(execution, flow, Optional.of(State.Type.CANCELLED));


@@ -18,14 +18,11 @@ import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.Map;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest(rebuildContext = true)
@Execution(ExecutionMode.SAME_THREAD)
class FilesServiceTest {
@Inject
private TestRunContextFactory runContextFactory;


@@ -3,15 +3,20 @@ package io.kestra.core.runners;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.State;
import io.kestra.core.models.flows.State.History;
import io.kestra.core.models.flows.State.Type;
import io.kestra.core.queues.QueueException;
import io.kestra.core.queues.QueueFactoryInterface;
import io.kestra.core.queues.QueueInterface;
import io.kestra.core.reporter.model.Count;
import io.kestra.core.repositories.FlowRepositoryInterface;
import io.kestra.core.services.ExecutionService;
import io.kestra.core.storages.StorageInterface;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import jakarta.inject.Singleton;
import org.apache.commons.lang3.StringUtils;
import reactor.core.publisher.Flux;
import java.io.File;
import java.io.FileInputStream;
@@ -21,21 +26,24 @@ import java.net.URISyntaxException;
import java.nio.file.Files;
import java.time.Duration;
import java.util.*;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicReference;
import java.util.stream.IntStream;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertTrue;
@Singleton
public class FlowConcurrencyCaseTest {
public static final String NAMESPACE = "io.kestra.tests";
@Inject
private StorageInterface storageInterface;
@Inject
protected TestRunnerUtils runnerUtils;
protected RunnerUtils runnerUtils;
@Inject
private FlowInputOutput flowIO;
@@ -43,168 +51,400 @@ public class FlowConcurrencyCaseTest {
@Inject
private FlowRepositoryInterface flowRepository;
@Inject
@Named(QueueFactoryInterface.EXECUTION_NAMED)
protected QueueInterface<Execution> executionQueue;
@Inject
private ExecutionService executionService;
public void flowConcurrencyCancel() throws TimeoutException, QueueException {
Execution execution1 = runnerUtils.runOneUntilRunning(MAIN_TENANT, NAMESPACE, "flow-concurrency-cancel", null, null, Duration.ofSeconds(30));
Execution execution2 = runnerUtils.runOne(MAIN_TENANT, NAMESPACE, "flow-concurrency-cancel");
public void flowConcurrencyCancel() throws TimeoutException, QueueException, InterruptedException {
Execution execution1 = runnerUtils.runOneUntilRunning(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-cancel", null, null, Duration.ofSeconds(30));
Execution execution2 = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-cancel");
assertThat(execution1.getState().isRunning()).isTrue();
assertThat(execution2.getState().getCurrent()).isEqualTo(State.Type.CANCELLED);
runnerUtils.awaitExecution(e -> e.getState().getCurrent().equals(Type.SUCCESS), execution1);
CountDownLatch latch1 = new CountDownLatch(1);
Flux<Execution> receive = TestsUtils.receive(executionQueue, e -> {
if (e.getLeft().getId().equals(execution1.getId())) {
if (e.getLeft().getState().getCurrent() == State.Type.SUCCESS) {
latch1.countDown();
}
}
// FIXME we should fail if we receive the cancel execution again but on Kafka it happens
});
assertTrue(latch1.await(1, TimeUnit.MINUTES));
receive.blockLast();
}
public void flowConcurrencyFail() throws TimeoutException, QueueException {
Execution execution1 = runnerUtils.runOneUntilRunning(MAIN_TENANT, NAMESPACE, "flow-concurrency-fail", null, null, Duration.ofSeconds(30));
Execution execution2 = runnerUtils.runOne(MAIN_TENANT, NAMESPACE, "flow-concurrency-fail");
public void flowConcurrencyFail() throws TimeoutException, QueueException, InterruptedException {
Execution execution1 = runnerUtils.runOneUntilRunning(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-fail", null, null, Duration.ofSeconds(30));
Execution execution2 = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-fail");
assertThat(execution1.getState().isRunning()).isTrue();
assertThat(execution2.getState().getCurrent()).isEqualTo(State.Type.FAILED);
runnerUtils.awaitExecution(e -> e.getState().getCurrent().equals(Type.SUCCESS), execution1);
CountDownLatch latch1 = new CountDownLatch(1);
Flux<Execution> receive = TestsUtils.receive(executionQueue, e -> {
if (e.getLeft().getId().equals(execution1.getId())) {
if (e.getLeft().getState().getCurrent() == State.Type.SUCCESS) {
latch1.countDown();
}
}
// FIXME we should fail if we receive the cancel execution again but on Kafka it happens
});
assertTrue(latch1.await(1, TimeUnit.MINUTES));
receive.blockLast();
}
public void flowConcurrencyQueue() throws QueueException {
Execution execution1 = runnerUtils.runOneUntilRunning(MAIN_TENANT, NAMESPACE, "flow-concurrency-queue", null, null, Duration.ofSeconds(30));
public void flowConcurrencyQueue() throws TimeoutException, QueueException, InterruptedException {
Execution execution1 = runnerUtils.runOneUntilRunning(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-queue", null, null, Duration.ofSeconds(30));
Flow flow = flowRepository
.findById(MAIN_TENANT, NAMESPACE, "flow-concurrency-queue", Optional.empty())
.findById(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-queue", Optional.empty())
.orElseThrow();
Execution execution2 = Execution.newExecution(flow, null, null, Optional.empty());
Execution executionResult2 = runnerUtils.emitAndAwaitExecution(e -> e.getState().getCurrent().equals(Type.SUCCESS), execution2);
Execution executionResult1 = runnerUtils.awaitExecution(e -> e.getState().getCurrent().equals(Type.SUCCESS), execution1);
executionQueue.emit(execution2);
assertThat(execution1.getState().isRunning()).isTrue();
assertThat(execution2.getState().getCurrent()).isEqualTo(State.Type.CREATED);
assertThat(executionResult1.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(executionResult2.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(executionResult2.getState().getHistories().getFirst().getState()).isEqualTo(State.Type.CREATED);
assertThat(executionResult2.getState().getHistories().get(1).getState()).isEqualTo(State.Type.QUEUED);
assertThat(executionResult2.getState().getHistories().get(2).getState()).isEqualTo(State.Type.RUNNING);
var executionResult1 = new AtomicReference<Execution>();
var executionResult2 = new AtomicReference<Execution>();
CountDownLatch latch1 = new CountDownLatch(1);
CountDownLatch latch2 = new CountDownLatch(1);
CountDownLatch latch3 = new CountDownLatch(1);
Flux<Execution> receive = TestsUtils.receive(executionQueue, e -> {
if (e.getLeft().getId().equals(execution1.getId())) {
executionResult1.set(e.getLeft());
if (e.getLeft().getState().getCurrent() == State.Type.SUCCESS) {
latch1.countDown();
}
}
if (e.getLeft().getId().equals(execution2.getId())) {
executionResult2.set(e.getLeft());
if (e.getLeft().getState().getCurrent() == State.Type.RUNNING) {
latch2.countDown();
}
if (e.getLeft().getState().getCurrent() == State.Type.SUCCESS) {
latch3.countDown();
}
}
});
assertTrue(latch1.await(1, TimeUnit.MINUTES));
assertTrue(latch2.await(1, TimeUnit.MINUTES));
assertTrue(latch3.await(1, TimeUnit.MINUTES));
receive.blockLast();
assertThat(executionResult1.get().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(executionResult2.get().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(executionResult2.get().getState().getHistories().getFirst().getState()).isEqualTo(State.Type.CREATED);
assertThat(executionResult2.get().getState().getHistories().get(1).getState()).isEqualTo(State.Type.QUEUED);
assertThat(executionResult2.get().getState().getHistories().get(2).getState()).isEqualTo(State.Type.RUNNING);
}
public void flowConcurrencyQueuePause() throws QueueException {
Execution execution1 = runnerUtils.runOneUntilPaused(MAIN_TENANT, NAMESPACE, "flow-concurrency-queue-pause");
public void flowConcurrencyQueuePause() throws TimeoutException, QueueException, InterruptedException {
AtomicReference<String> firstExecutionId = new AtomicReference<>();
var firstExecutionResult = new AtomicReference<Execution>();
var secondExecutionResult = new AtomicReference<Execution>();
CountDownLatch firstExecutionLatch = new CountDownLatch(1);
CountDownLatch secondExecutionLatch = new CountDownLatch(1);
Flux<Execution> receive = TestsUtils.receive(executionQueue, e -> {
if (!"flow-concurrency-queue-pause".equals(e.getLeft().getFlowId())){
return;
}
String currentId = e.getLeft().getId();
Type currentState = e.getLeft().getState().getCurrent();
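// the first id observed on the queue is assumed to be the first execution; any other id belongs to the queued second one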
if (firstExecutionId.get() == null) {
firstExecutionId.set(currentId);
}
if (currentId.equals(firstExecutionId.get())) {
if (currentState == State.Type.SUCCESS) {
firstExecutionResult.set(e.getLeft());
firstExecutionLatch.countDown();
}
} else {
if (currentState == State.Type.SUCCESS) {
secondExecutionResult.set(e.getLeft());
secondExecutionLatch.countDown();
}
}
});
Execution execution1 = runnerUtils.runOneUntilPaused(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-queue-pause");
Flow flow = flowRepository
.findById(MAIN_TENANT, NAMESPACE, "flow-concurrency-queue-pause", Optional.empty())
.findById(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-queue-pause", Optional.empty())
.orElseThrow();
Execution execution2 = Execution.newExecution(flow, null, null, Optional.empty());
Execution secondExecutionResult = runnerUtils.emitAndAwaitExecution(e -> e.getState().getCurrent().equals(Type.SUCCESS), execution2);
Execution firstExecutionResult = runnerUtils.awaitExecution(e -> e.getState().getCurrent().equals(Type.SUCCESS), execution1);
executionQueue.emit(execution2);
assertThat(execution1.getState().isPaused()).isTrue();
assertThat(execution2.getState().getCurrent()).isEqualTo(State.Type.CREATED);
assertThat(firstExecutionResult.getId()).isEqualTo(execution1.getId());
assertThat(firstExecutionResult.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(secondExecutionResult.getId()).isEqualTo(execution2.getId());
assertThat(secondExecutionResult.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(secondExecutionResult.getState().getHistories().getFirst().getState()).isEqualTo(State.Type.CREATED);
assertThat(secondExecutionResult.getState().getHistories().get(1).getState()).isEqualTo(State.Type.QUEUED);
assertThat(secondExecutionResult.getState().getHistories().get(2).getState()).isEqualTo(State.Type.RUNNING);
assertTrue(firstExecutionLatch.await(10, TimeUnit.SECONDS));
assertTrue(secondExecutionLatch.await(10, TimeUnit.SECONDS));
receive.blockLast();
assertThat(firstExecutionResult.get().getId()).isEqualTo(execution1.getId());
assertThat(firstExecutionResult.get().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(secondExecutionResult.get().getId()).isEqualTo(execution2.getId());
assertThat(secondExecutionResult.get().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(secondExecutionResult.get().getState().getHistories().getFirst().getState()).isEqualTo(State.Type.CREATED);
assertThat(secondExecutionResult.get().getState().getHistories().get(1).getState()).isEqualTo(State.Type.QUEUED);
assertThat(secondExecutionResult.get().getState().getHistories().get(2).getState()).isEqualTo(State.Type.RUNNING);
}
public void flowConcurrencyCancelPause() throws QueueException {
Execution execution1 = runnerUtils.runOneUntilPaused(MAIN_TENANT, NAMESPACE, "flow-concurrency-cancel-pause");
public void flowConcurrencyCancelPause() throws TimeoutException, QueueException, InterruptedException {
AtomicReference<String> firstExecutionId = new AtomicReference<>();
var firstExecutionResult = new AtomicReference<Execution>();
var secondExecutionResult = new AtomicReference<Execution>();
CountDownLatch firstExecLatch = new CountDownLatch(1);
CountDownLatch secondExecLatch = new CountDownLatch(1);
Flux<Execution> receive = TestsUtils.receive(executionQueue, e -> {
if (!"flow-concurrency-cancel-pause".equals(e.getLeft().getFlowId())){
return;
}
String currentId = e.getLeft().getId();
Type currentState = e.getLeft().getState().getCurrent();
if (firstExecutionId.get() == null) {
firstExecutionId.set(currentId);
}
if (currentId.equals(firstExecutionId.get())) {
if (currentState == State.Type.SUCCESS) {
firstExecutionResult.set(e.getLeft());
firstExecLatch.countDown();
}
} else {
if (currentState == State.Type.CANCELLED) {
secondExecutionResult.set(e.getLeft());
secondExecLatch.countDown();
}
}
});
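// start the first execution and leave it paused; the second one should then be cancelled by the concurrency behaviour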
Execution execution1 = runnerUtils.runOneUntilPaused(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-cancel-pause");
Flow flow = flowRepository
.findById(MAIN_TENANT, NAMESPACE, "flow-concurrency-cancel-pause", Optional.empty())
.findById(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-cancel-pause", Optional.empty())
.orElseThrow();
Execution execution2 = Execution.newExecution(flow, null, null, Optional.empty());
Execution secondExecutionResult = runnerUtils.emitAndAwaitExecution(e -> e.getState().getCurrent().equals(Type.CANCELLED), execution2);
Execution firstExecutionResult = runnerUtils.awaitExecution(e -> e.getState().getCurrent().equals(Type.SUCCESS), execution1);
executionQueue.emit(execution2);
assertThat(execution1.getState().isPaused()).isTrue();
assertThat(execution2.getState().getCurrent()).isEqualTo(State.Type.CREATED);
assertThat(firstExecutionResult.getId()).isEqualTo(execution1.getId());
assertThat(firstExecutionResult.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(secondExecutionResult.getId()).isEqualTo(execution2.getId());
assertThat(secondExecutionResult.getState().getCurrent()).isEqualTo(State.Type.CANCELLED);
assertThat(secondExecutionResult.getState().getHistories().getFirst().getState()).isEqualTo(State.Type.CREATED);
assertThat(secondExecutionResult.getState().getHistories().get(1).getState()).isEqualTo(State.Type.CANCELLED);
assertTrue(firstExecLatch.await(10, TimeUnit.SECONDS));
assertTrue(secondExecLatch.await(10, TimeUnit.SECONDS));
receive.blockLast();
assertThat(firstExecutionResult.get().getId()).isEqualTo(execution1.getId());
assertThat(firstExecutionResult.get().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(secondExecutionResult.get().getId()).isEqualTo(execution2.getId());
assertThat(secondExecutionResult.get().getState().getCurrent()).isEqualTo(State.Type.CANCELLED);
assertThat(secondExecutionResult.get().getState().getHistories().getFirst().getState()).isEqualTo(State.Type.CREATED);
assertThat(secondExecutionResult.get().getState().getHistories().get(1).getState()).isEqualTo(State.Type.CANCELLED);
}
public void flowConcurrencyWithForEachItem(String tenantId) throws QueueException, URISyntaxException, IOException {
URI file = storageUpload(tenantId);
public void flowConcurrencyWithForEachItem() throws TimeoutException, QueueException, InterruptedException, URISyntaxException, IOException {
URI file = storageUpload();
Map<String, Object> inputs = Map.of("file", file.toString(), "batch", 4);
Execution forEachItem = runnerUtils.runOneUntilRunning(tenantId, NAMESPACE, "flow-concurrency-for-each-item", null,
Execution forEachItem = runnerUtils.runOneUntilRunning(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-for-each-item", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, inputs), Duration.ofSeconds(5));
assertThat(forEachItem.getState().getCurrent()).isEqualTo(Type.RUNNING);
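// track which 'flow-concurrency-queue' subflow executions reach RUNNING; with the concurrency limit only one should be running at first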
Set<String> executionIds = new HashSet<>();
Flux<Execution> receive = TestsUtils.receive(executionQueue, e -> {
if ("flow-concurrency-queue".equals(e.getLeft().getFlowId()) && e.getLeft().getState().isRunning()) {
executionIds.add(e.getLeft().getId());
}
});
Execution terminated = runnerUtils.awaitExecution(e -> e.getState().isTerminated(),forEachItem);
// wait a little to be sure that not too many executions are started
Thread.sleep(500);
assertThat(executionIds).hasSize(1);
receive.blockLast();
Execution terminated = runnerUtils.awaitExecution(e -> e.getId().equals(forEachItem.getId()) && e.getState().isTerminated(), () -> {}, Duration.ofSeconds(10));
assertThat(terminated.getState().getCurrent()).isEqualTo(Type.SUCCESS);
List<Execution> executions = runnerUtils.awaitFlowExecutionNumber(2, tenantId, NAMESPACE, "flow-concurrency-queue");
assertThat(executions).extracting(e -> e.getState().getCurrent()).containsOnly(Type.SUCCESS);
assertThat(executions.stream()
.map(e -> e.getState().getHistories())
.flatMap(List::stream)
.map(History::getState)
.toList()).contains(Type.QUEUED);
}
public void flowConcurrencyQueueRestarted() throws Exception {
Execution execution1 = runnerUtils.runOneUntilRunning(MAIN_TENANT, NAMESPACE,
"flow-concurrency-queue-fail", null, null, Duration.ofSeconds(30));
Execution execution1 = runnerUtils.runOneUntilRunning(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-queue-fail", null, null, Duration.ofSeconds(30));
Flow flow = flowRepository
.findById(MAIN_TENANT, NAMESPACE, "flow-concurrency-queue-fail", Optional.empty())
.findById(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-queue-fail", Optional.empty())
.orElseThrow();
Execution execution2 = Execution.newExecution(flow, null, null, Optional.empty());
runnerUtils.emitAndAwaitExecution(e -> e.getState().getCurrent().equals(Type.RUNNING), execution2);
executionQueue.emit(execution2);
assertThat(execution1.getState().isRunning()).isTrue();
assertThat(execution2.getState().getCurrent()).isEqualTo(State.Type.CREATED);
var executionResult1 = new AtomicReference<Execution>();
var executionResult2 = new AtomicReference<Execution>();
CountDownLatch latch1 = new CountDownLatch(2);
AtomicReference<Execution> failedExecution = new AtomicReference<>();
CountDownLatch latch2 = new CountDownLatch(1);
CountDownLatch latch3 = new CountDownLatch(1);
Flux<Execution> receive = TestsUtils.receive(executionQueue, e -> {
if (e.getLeft().getId().equals(execution1.getId())) {
executionResult1.set(e.getLeft());
if (e.getLeft().getState().getCurrent() == Type.FAILED) {
failedExecution.set(e.getLeft());
latch1.countDown();
}
}
if (e.getLeft().getId().equals(execution2.getId())) {
executionResult2.set(e.getLeft());
if (e.getLeft().getState().getCurrent() == State.Type.RUNNING) {
latch2.countDown();
}
if (e.getLeft().getState().getCurrent() == Type.FAILED) {
latch3.countDown();
}
}
});
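// wait for the queued execution to start running once the first one has failed and released the slot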
assertTrue(latch2.await(1, TimeUnit.MINUTES));
assertThat(failedExecution.get()).isNotNull();
// here the first execution has failed and the second is now running.
// we restart the first one; it should be queued, then fail again.
Execution failedExecution = runnerUtils.awaitExecution(e -> e.getState().getCurrent().equals(Type.FAILED), execution1);
Execution restarted = executionService.restart(failedExecution, null);
Execution executionResult1 = runnerUtils.emitAndAwaitExecution(e -> e.getState().getCurrent().equals(Type.FAILED), restarted);
Execution executionResult2 = runnerUtils.awaitExecution(e -> e.getState().getCurrent().equals(Type.FAILED), execution2);
Execution restarted = executionService.restart(failedExecution.get(), null);
executionQueue.emit(restarted);
assertThat(executionResult1.getState().getCurrent()).isEqualTo(Type.FAILED);
assertTrue(latch3.await(1, TimeUnit.MINUTES));
assertTrue(latch1.await(1, TimeUnit.MINUTES));
receive.blockLast();
assertThat(executionResult1.get().getState().getCurrent()).isEqualTo(Type.FAILED);
// it should have been queued after being restarted
assertThat(executionResult1.getState().getHistories().stream().anyMatch(history -> history.getState() == Type.RESTARTED)).isTrue();
assertThat(executionResult1.getState().getHistories().stream().anyMatch(history -> history.getState() == Type.QUEUED)).isTrue();
assertThat(executionResult2.getState().getCurrent()).isEqualTo(Type.FAILED);
assertThat(executionResult2.getState().getHistories().getFirst().getState()).isEqualTo(State.Type.CREATED);
assertThat(executionResult2.getState().getHistories().get(1).getState()).isEqualTo(State.Type.QUEUED);
assertThat(executionResult2.getState().getHistories().get(2).getState()).isEqualTo(State.Type.RUNNING);
assertThat(executionResult1.get().getState().getHistories().stream().anyMatch(history -> history.getState() == Type.RESTARTED)).isTrue();
assertThat(executionResult1.get().getState().getHistories().stream().anyMatch(history -> history.getState() == Type.QUEUED)).isTrue();
assertThat(executionResult2.get().getState().getCurrent()).isEqualTo(Type.FAILED);
assertThat(executionResult2.get().getState().getHistories().getFirst().getState()).isEqualTo(State.Type.CREATED);
assertThat(executionResult2.get().getState().getHistories().get(1).getState()).isEqualTo(State.Type.QUEUED);
assertThat(executionResult2.get().getState().getHistories().get(2).getState()).isEqualTo(State.Type.RUNNING);
}
public void flowConcurrencyQueueAfterExecution() throws QueueException {
Execution execution1 = runnerUtils.runOneUntilRunning(MAIN_TENANT, NAMESPACE, "flow-concurrency-queue-after-execution", null, null, Duration.ofSeconds(30));
public void flowConcurrencyQueueAfterExecution() throws TimeoutException, QueueException, InterruptedException {
Execution execution1 = runnerUtils.runOneUntilRunning(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-queue-after-execution", null, null, Duration.ofSeconds(30));
Flow flow = flowRepository
.findById(MAIN_TENANT, NAMESPACE, "flow-concurrency-queue-after-execution", Optional.empty())
.findById(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-queue-after-execution", Optional.empty())
.orElseThrow();
Execution execution2 = Execution.newExecution(flow, null, null, Optional.empty());
Execution executionResult2 = runnerUtils.emitAndAwaitExecution(e -> e.getState().getCurrent().equals(Type.SUCCESS), execution2);
Execution executionResult1 = runnerUtils.awaitExecution(e -> e.getState().getCurrent().equals(Type.SUCCESS), execution1);
executionQueue.emit(execution2);
assertThat(executionResult1.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(executionResult2.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(executionResult2.getState().getHistories().getFirst().getState()).isEqualTo(State.Type.CREATED);
assertThat(executionResult2.getState().getHistories().get(1).getState()).isEqualTo(State.Type.QUEUED);
assertThat(executionResult2.getState().getHistories().get(2).getState()).isEqualTo(State.Type.RUNNING);
assertThat(execution1.getState().isRunning()).isTrue();
assertThat(execution2.getState().getCurrent()).isEqualTo(State.Type.CREATED);
var executionResult1 = new AtomicReference<Execution>();
var executionResult2 = new AtomicReference<Execution>();
CountDownLatch latch1 = new CountDownLatch(1);
CountDownLatch latch2 = new CountDownLatch(1);
CountDownLatch latch3 = new CountDownLatch(1);
Flux<Execution> receive = TestsUtils.receive(executionQueue, e -> {
if (e.getLeft().getId().equals(execution1.getId())) {
executionResult1.set(e.getLeft());
if (e.getLeft().getState().getCurrent() == State.Type.SUCCESS) {
latch1.countDown();
}
}
if (e.getLeft().getId().equals(execution2.getId())) {
executionResult2.set(e.getLeft());
if (e.getLeft().getState().getCurrent() == State.Type.RUNNING) {
latch2.countDown();
}
if (e.getLeft().getState().getCurrent() == State.Type.SUCCESS) {
latch3.countDown();
}
}
});
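// both executions should end in SUCCESS, with the second passing through QUEUED before RUNNING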
assertTrue(latch1.await(1, TimeUnit.MINUTES));
assertTrue(latch2.await(1, TimeUnit.MINUTES));
assertTrue(latch3.await(1, TimeUnit.MINUTES));
receive.blockLast();
assertThat(executionResult1.get().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(executionResult2.get().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(executionResult2.get().getState().getHistories().getFirst().getState()).isEqualTo(State.Type.CREATED);
assertThat(executionResult2.get().getState().getHistories().get(1).getState()).isEqualTo(State.Type.QUEUED);
assertThat(executionResult2.get().getState().getHistories().get(2).getState()).isEqualTo(State.Type.RUNNING);
}
public void flowConcurrencySubflow(String tenantId) throws TimeoutException, QueueException {
runnerUtils.runOneUntilRunning(tenantId, NAMESPACE, "flow-concurrency-subflow", null, null, Duration.ofSeconds(30));
runnerUtils.runOne(tenantId, NAMESPACE, "flow-concurrency-subflow");
public void flowConcurrencySubflow() throws TimeoutException, QueueException, InterruptedException {
CountDownLatch successLatch = new CountDownLatch(1);
CountDownLatch canceledLatch = new CountDownLatch(1);
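// watch the 'flow-concurrency-cancel' subflow: one run should succeed while the concurrent one is cancelled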
Flux<Execution> receive = TestsUtils.receive(executionQueue, e -> {
if (e.getLeft().getFlowId().equals("flow-concurrency-cancel")) {
if (e.getLeft().getState().getCurrent() == State.Type.SUCCESS) {
successLatch.countDown();
}
if (e.getLeft().getState().getCurrent() == Type.CANCELLED) {
canceledLatch.countDown();
}
}
List<Execution> subFlowExecs = runnerUtils.awaitFlowExecutionNumber(2, tenantId, NAMESPACE, "flow-concurrency-cancel");
assertThat(subFlowExecs).extracting(e -> e.getState().getCurrent()).containsExactlyInAnyOrder(Type.SUCCESS, Type.CANCELLED);
// FIXME we should fail if we receive the cancelled execution again, but on Kafka it happens
});
Execution execution1 = runnerUtils.runOneUntilRunning(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-subflow", null, null, Duration.ofSeconds(30));
Execution execution2 = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-subflow");
assertThat(execution1.getState().isRunning()).isTrue();
assertThat(execution2.getState().getCurrent()).isEqualTo(Type.SUCCESS);
// assert we have one cancelled subflow execution and one successful one
assertTrue(canceledLatch.await(1, TimeUnit.MINUTES));
assertTrue(successLatch.await(1, TimeUnit.MINUTES));
receive.blockLast();
// run another execution to be sure that everything works (the purge was correctly done)
Execution execution3 = runnerUtils.runOne(tenantId, NAMESPACE, "flow-concurrency-subflow");
CountDownLatch newSuccessLatch = new CountDownLatch(1);
Flux<Execution> secondReceive = TestsUtils.receive(executionQueue, e -> {
if (e.getLeft().getFlowId().equals("flow-concurrency-cancel")) {
if (e.getLeft().getState().getCurrent() == State.Type.SUCCESS) {
newSuccessLatch.countDown();
}
}
// FIXME we should fail if we receive the cancelled execution again, but on Kafka it happens
});
Execution execution3 = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "flow-concurrency-subflow");
assertThat(execution3.getState().getCurrent()).isEqualTo(Type.SUCCESS);
runnerUtils.awaitFlowExecution(e -> e.getState().getCurrent().equals(Type.SUCCESS), tenantId, NAMESPACE, "flow-concurrency-cancel");
// assert we have two successful subflow executions
assertTrue(newSuccessLatch.await(1, TimeUnit.MINUTES));
secondReceive.blockLast();
}
private URI storageUpload(String tenantId) throws URISyntaxException, IOException {
private URI storageUpload() throws URISyntaxException, IOException {
File tempFile = File.createTempFile("file", ".txt");
Files.write(tempFile.toPath(), content());
return storageInterface.put(
tenantId,
MAIN_TENANT,
null,
new URI("/file/storage/file.txt"),
new FileInputStream(tempFile)
);
}


@@ -4,22 +4,19 @@ import io.kestra.core.models.flows.FlowWithSource;
import io.kestra.core.models.flows.GenericFlow;
import io.kestra.core.models.property.Property;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.utils.Await;
import io.kestra.core.utils.TestsUtils;
import java.time.Duration;
import java.util.List;
import java.util.concurrent.TimeoutException;
import lombok.SneakyThrows;
import io.kestra.core.repositories.FlowRepositoryInterface;
import io.kestra.core.services.FlowListenersInterface;
import io.kestra.plugin.core.debug.Return;
import io.kestra.core.utils.IdUtils;
import java.util.Collections;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;
import jakarta.inject.Inject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
@@ -27,11 +24,11 @@ abstract public class FlowListenersTest {
@Inject
protected FlowRepositoryInterface flowRepository;
protected static FlowWithSource create(String tenantId, String flowId, String taskId) {
protected static FlowWithSource create(String flowId, String taskId) {
FlowWithSource flow = FlowWithSource.builder()
.id(flowId)
.namespace("io.kestra.unittest")
.tenantId(tenantId)
.tenantId(MAIN_TENANT)
.revision(1)
.tasks(Collections.singletonList(Return.builder()
.id(taskId)
@@ -42,65 +39,88 @@ abstract public class FlowListenersTest {
return flow.toBuilder().source(flow.sourceOrGenerateIfNull()).build();
}
private static final Logger LOG = LoggerFactory.getLogger(FlowListenersTest.class);
public void suite(FlowListenersInterface flowListenersService) throws TimeoutException {
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
public void suite(FlowListenersInterface flowListenersService) {
flowListenersService.run();
AtomicInteger count = new AtomicInteger();
var ref = new Ref();
flowListenersService.listen(flows -> count.set(getFlowsForTenant(flowListenersService, tenant).size()));
flowListenersService.listen(flows -> {
count.set(flows.size());
ref.countDownLatch.countDown();
});
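// each repository change below should be reflected in the listener's flow count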
// initial state
LOG.info("-----------> wait for zero");
Await.until(() -> count.get() == 0, Duration.ofMillis(10), Duration.ofSeconds(5));
assertThat(getFlowsForTenant(flowListenersService, tenant).size()).isZero();
wait(ref, () -> {
assertThat(count.get()).isZero();
assertThat(flowListenersService.flows().size()).isZero();
});
// resend on startup is done for Kafka
LOG.info("-----------> wait for zero kafka");
if (flowListenersService.getClass().getName().equals("io.kestra.ee.runner.kafka.KafkaFlowListeners")) {
Await.until(() -> count.get() == 0, Duration.ofMillis(10), Duration.ofSeconds(5));
assertThat(getFlowsForTenant(flowListenersService, tenant).size()).isZero();
wait(ref, () -> {
assertThat(count.get()).isZero();
assertThat(flowListenersService.flows().size()).isZero();
});
}
// create first
LOG.info("-----------> create fist flow");
FlowWithSource first = create(tenant, "first_" + IdUtils.create(), "test");
FlowWithSource firstUpdated = create(tenant, first.getId(), "test2");
FlowWithSource first = create("first_" + IdUtils.create(), "test");
FlowWithSource firstUpdated = create(first.getId(), "test2");
flowRepository.create(GenericFlow.of(first));
Await.until(() -> count.get() == 1, Duration.ofMillis(10), Duration.ofSeconds(5));
assertThat(getFlowsForTenant(flowListenersService, tenant).size()).isEqualTo(1);
wait(ref, () -> {
assertThat(count.get()).isEqualTo(1);
assertThat(flowListenersService.flows().size()).isEqualTo(1);
});
// create the same id as the first, no additional flows
first = flowRepository.update(GenericFlow.of(firstUpdated), first);
Await.until(() -> count.get() == 1, Duration.ofMillis(10), Duration.ofSeconds(5));
assertThat(getFlowsForTenant(flowListenersService, tenant).size()).isEqualTo(1);
wait(ref, () -> {
assertThat(count.get()).isEqualTo(1);
assertThat(flowListenersService.flows().size()).isEqualTo(1);
//assertThat(flowListenersService.flows().getFirst().getFirst().getId(), is("test2"));
});
FlowWithSource second = create(tenant, "second_" + IdUtils.create(), "test");
FlowWithSource second = create("second_" + IdUtils.create(), "test");
// create a new one
flowRepository.create(GenericFlow.of(second));
Await.until(() -> count.get() == 2, Duration.ofMillis(10), Duration.ofSeconds(5));
assertThat(getFlowsForTenant(flowListenersService, tenant).size()).isEqualTo(2);
wait(ref, () -> {
assertThat(count.get()).isEqualTo(2);
assertThat(flowListenersService.flows().size()).isEqualTo(2);
});
// delete first
FlowWithSource deleted = flowRepository.delete(first);
Await.until(() -> count.get() == 1, Duration.ofMillis(10), Duration.ofSeconds(5));
assertThat(getFlowsForTenant(flowListenersService, tenant).size()).isEqualTo(1);
wait(ref, () -> {
assertThat(count.get()).isEqualTo(1);
assertThat(flowListenersService.flows().size()).isEqualTo(1);
});
// restore must work
flowRepository.create(GenericFlow.of(first));
Await.until(() -> count.get() == 2, Duration.ofMillis(10), Duration.ofSeconds(5));
assertThat(getFlowsForTenant(flowListenersService, tenant).size()).isEqualTo(2);
wait(ref, () -> {
assertThat(count.get()).isEqualTo(2);
assertThat(flowListenersService.flows().size()).isEqualTo(2);
});
FlowWithSource withTenant = first.toBuilder().tenantId("some-tenant").build();
flowRepository.create(GenericFlow.of(withTenant));
wait(ref, () -> {
assertThat(count.get()).isEqualTo(3);
assertThat(flowListenersService.flows().size()).isEqualTo(3);
});
}
public List<FlowWithSource> getFlowsForTenant(FlowListenersInterface flowListenersService, String tenantId){
return flowListenersService.flows().stream()
.filter(f -> tenantId.equals(f.getTenantId()))
.toList();
public static class Ref {
CountDownLatch countDownLatch = new CountDownLatch(1);
}
@SneakyThrows
private void wait(Ref ref, Runnable run) {
ref.countDownLatch.await(60, TimeUnit.SECONDS);
run.run();
ref.countDownLatch = new CountDownLatch(1);
}
}


@@ -2,61 +2,82 @@ package io.kestra.core.runners;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.State;
import io.kestra.core.models.flows.State.Type;
import io.kestra.core.queues.QueueException;
import io.kestra.core.queues.QueueFactoryInterface;
import io.kestra.core.queues.QueueInterface;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import jakarta.inject.Singleton;
import reactor.core.publisher.Flux;
import java.time.Instant;
import java.util.Comparator;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import java.util.concurrent.atomic.AtomicReference;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertTrue;
@Singleton
public class FlowTriggerCaseTest {
public static final String NAMESPACE = "io.kestra.tests.trigger";
@Inject
@Named(QueueFactoryInterface.EXECUTION_NAMED)
protected QueueInterface<Execution> executionQueue;
@Inject
protected TestRunnerUtils runnerUtils;
protected RunnerUtils runnerUtils;
public void trigger(String tenantId) throws InterruptedException, TimeoutException, QueueException {
Execution execution = runnerUtils.runOne(tenantId, NAMESPACE, "trigger-flow");
public void trigger() throws InterruptedException, TimeoutException, QueueException {
CountDownLatch countDownLatch = new CountDownLatch(3);
AtomicReference<Execution> flowListener = new AtomicReference<>();
AtomicReference<Execution> flowListenerNoInput = new AtomicReference<>();
AtomicReference<Execution> flowListenerNamespace = new AtomicReference<>();
Flux<Execution> receive = TestsUtils.receive(executionQueue, either -> {
Execution execution = either.getLeft();
if (execution.getState().getCurrent() == State.Type.SUCCESS) {
if (flowListenerNoInput.get() == null && execution.getFlowId().equals("trigger-flow-listener-no-inputs")) {
flowListenerNoInput.set(execution);
countDownLatch.countDown();
} else if (flowListener.get() == null && execution.getFlowId().equals("trigger-flow-listener")) {
flowListener.set(execution);
countDownLatch.countDown();
} else if (flowListenerNamespace.get() == null && execution.getFlowId().equals("trigger-flow-listener-namespace-condition")) {
flowListenerNamespace.set(execution);
countDownLatch.countDown();
}
}
});
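// running 'trigger-flow' should trigger the three listener flows watched above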
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests.trigger", "trigger-flow");
assertThat(execution.getTaskRunList().size()).isEqualTo(1);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
Execution flowListenerNoInput = runnerUtils.awaitFlowExecution(
e -> e.getState().getCurrent().equals(Type.SUCCESS), tenantId, NAMESPACE,
"trigger-flow-listener-no-inputs");
Execution flowListener = runnerUtils.awaitFlowExecution(
e -> e.getState().getCurrent().equals(Type.SUCCESS), tenantId, NAMESPACE,
"trigger-flow-listener");
Execution flowListenerNamespace = runnerUtils.awaitFlowExecution(
e -> e.getState().getCurrent().equals(Type.SUCCESS), tenantId, NAMESPACE,
"trigger-flow-listener-namespace-condition");
assertTrue(countDownLatch.await(15, TimeUnit.SECONDS));
receive.blockLast();
assertThat(flowListener.get().getTaskRunList().size()).isEqualTo(1);
assertThat(flowListener.get().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(flowListener.get().getTaskRunList().getFirst().getOutputs().get("value")).isEqualTo("childs: from parents: " + execution.getId());
assertThat(flowListener.get().getTrigger().getVariables().get("executionId")).isEqualTo(execution.getId());
assertThat(flowListener.get().getTrigger().getVariables().get("namespace")).isEqualTo("io.kestra.tests.trigger");
assertThat(flowListener.get().getTrigger().getVariables().get("flowId")).isEqualTo("trigger-flow");
assertThat(flowListener.getTaskRunList().size()).isEqualTo(1);
assertThat(flowListener.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(flowListener.getTaskRunList().getFirst().getOutputs().get("value")).isEqualTo("childs: from parents: " + execution.getId());
assertThat(flowListener.getTrigger().getVariables().get("executionId")).isEqualTo(execution.getId());
assertThat(flowListener.getTrigger().getVariables().get("namespace")).isEqualTo(NAMESPACE);
assertThat(flowListener.getTrigger().getVariables().get("flowId")).isEqualTo("trigger-flow");
assertThat(flowListenerNoInput.get().getTaskRunList().size()).isEqualTo(1);
assertThat(flowListenerNoInput.get().getTrigger().getVariables().get("executionId")).isEqualTo(execution.getId());
assertThat(flowListenerNoInput.get().getTrigger().getVariables().get("namespace")).isEqualTo("io.kestra.tests.trigger");
assertThat(flowListenerNoInput.get().getTrigger().getVariables().get("flowId")).isEqualTo("trigger-flow");
assertThat(flowListenerNoInput.get().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(flowListenerNoInput.getTaskRunList().size()).isEqualTo(1);
assertThat(flowListenerNoInput.getTrigger().getVariables().get("executionId")).isEqualTo(execution.getId());
assertThat(flowListenerNoInput.getTrigger().getVariables().get("namespace")).isEqualTo(NAMESPACE);
assertThat(flowListenerNoInput.getTrigger().getVariables().get("flowId")).isEqualTo("trigger-flow");
assertThat(flowListenerNoInput.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(flowListenerNamespace.getTaskRunList().size()).isEqualTo(1);
assertThat(flowListenerNamespace.getTrigger().getVariables().get("namespace")).isEqualTo(NAMESPACE);
assertThat(flowListenerNamespace.get().getTaskRunList().size()).isEqualTo(1);
assertThat(flowListenerNamespace.get().getTrigger().getVariables().get("namespace")).isEqualTo("io.kestra.tests.trigger");
// it will be triggered for 'trigger-flow' or any of the 'trigger-flow-listener*', so we only assert that it's one of them
assertThat(flowListenerNamespace.getTrigger().getVariables().get("flowId"))
assertThat(flowListenerNamespace.get().getTrigger().getVariables().get("flowId"))
.satisfiesAnyOf(
arg -> assertThat(arg).isEqualTo("trigger-flow"),
arg -> assertThat(arg).isEqualTo("trigger-flow-listener-no-inputs"),
@@ -64,43 +85,56 @@ public class FlowTriggerCaseTest {
);
}
public void triggerWithPause() throws TimeoutException, QueueException {
public void triggerWithPause() throws InterruptedException, TimeoutException, QueueException {
CountDownLatch countDownLatch = new CountDownLatch(4);
List<Execution> flowListeners = new ArrayList<>();
Flux<Execution> receive = TestsUtils.receive(executionQueue, either -> {
Execution execution = either.getLeft();
if (execution.getState().getCurrent() == State.Type.SUCCESS && execution.getFlowId().equals("trigger-flow-listener-with-pause")) {
flowListeners.add(execution);
countDownLatch.countDown();
}
});
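// the parent flow goes through a pause, so the listener runs should report the RUNNING, PAUSED, RUNNING and SUCCESS statuses in that order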
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests.trigger.pause", "trigger-flow-with-pause");
assertThat(execution.getTaskRunList().size()).isEqualTo(3);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
List<Execution> triggeredExec = runnerUtils.awaitFlowExecutionNumber(
4,
MAIN_TENANT,
"io.kestra.tests.trigger.pause",
"trigger-flow-listener-with-pause");
assertTrue(countDownLatch.await(15, TimeUnit.SECONDS));
receive.blockLast();
assertThat(triggeredExec.size()).isEqualTo(4);
List<Execution> sortedExecs = triggeredExec.stream()
.sorted(Comparator.comparing(e -> e.getState().getEndDate().orElse(Instant.now())))
.toList();
assertThat(sortedExecs.get(0).getOutputs().get("status")).isEqualTo("RUNNING");
assertThat(sortedExecs.get(1).getOutputs().get("status")).isEqualTo("PAUSED");
assertThat(sortedExecs.get(2).getOutputs().get("status")).isEqualTo("RUNNING");
assertThat(sortedExecs.get(3).getOutputs().get("status")).isEqualTo("SUCCESS");
assertThat(flowListeners.size()).isEqualTo(4);
assertThat(flowListeners.get(0).getOutputs().get("status")).isEqualTo("RUNNING");
assertThat(flowListeners.get(1).getOutputs().get("status")).isEqualTo("PAUSED");
assertThat(flowListeners.get(2).getOutputs().get("status")).isEqualTo("RUNNING");
assertThat(flowListeners.get(3).getOutputs().get("status")).isEqualTo("SUCCESS");
}
public void triggerWithConcurrencyLimit(String tenantId) throws QueueException, TimeoutException {
Execution execution1 = runnerUtils.runOneUntilRunning(tenantId, "io.kestra.tests.trigger.concurrency", "trigger-flow-with-concurrency-limit");
Execution execution2 = runnerUtils.runOne(tenantId, "io.kestra.tests.trigger.concurrency", "trigger-flow-with-concurrency-limit");
public void triggerWithConcurrencyLimit() throws QueueException, TimeoutException, InterruptedException {
CountDownLatch countDownLatch = new CountDownLatch(5);
List<Execution> flowListeners = new ArrayList<>();
List<Execution> triggeredExec = runnerUtils.awaitFlowExecutionNumber(
5,
tenantId,
"io.kestra.tests.trigger.concurrency",
"trigger-flow-listener-with-concurrency-limit");
Flux<Execution> receive = TestsUtils.receive(executionQueue, either -> {
Execution execution = either.getLeft();
if (execution.getState().getCurrent() == State.Type.SUCCESS && execution.getFlowId().equals("trigger-flow-listener-with-concurrency-limit")) {
flowListeners.add(execution);
countDownLatch.countDown();
}
});
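// two executions of the concurrency-limited flow are expected to produce five listener runs: RUNNING and SUCCESS for the first, QUEUED, RUNNING and SUCCESS for the second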
assertThat(triggeredExec.size()).isEqualTo(5);
assertThat(triggeredExec.stream().anyMatch(e -> e.getOutputs().get("status").equals("RUNNING") && e.getOutputs().get("executionId").equals(execution1.getId()))).isTrue();
assertThat(triggeredExec.stream().anyMatch(e -> e.getOutputs().get("status").equals("SUCCESS") && e.getOutputs().get("executionId").equals(execution1.getId()))).isTrue();
assertThat(triggeredExec.stream().anyMatch(e -> e.getOutputs().get("status").equals("QUEUED") && e.getOutputs().get("executionId").equals(execution2.getId()))).isTrue();
assertThat(triggeredExec.stream().anyMatch(e -> e.getOutputs().get("status").equals("RUNNING") && e.getOutputs().get("executionId").equals(execution2.getId()))).isTrue();
assertThat(triggeredExec.stream().anyMatch(e -> e.getOutputs().get("status").equals("SUCCESS") && e.getOutputs().get("executionId").equals(execution2.getId()))).isTrue();
Execution execution1 = runnerUtils.runOneUntilRunning(MAIN_TENANT, "io.kestra.tests.trigger.concurrency", "trigger-flow-with-concurrency-limit");
Execution execution2 = runnerUtils.runOneUntilRunning(MAIN_TENANT, "io.kestra.tests.trigger.concurrency", "trigger-flow-with-concurrency-limit");
assertTrue(countDownLatch.await(15, TimeUnit.SECONDS));
receive.blockLast();
assertThat(flowListeners.size()).isEqualTo(5);
assertThat(flowListeners.stream().anyMatch(e -> e.getOutputs().get("status").equals("RUNNING") && e.getOutputs().get("executionId").equals(execution1.getId()))).isTrue();
assertThat(flowListeners.stream().anyMatch(e -> e.getOutputs().get("status").equals("SUCCESS") && e.getOutputs().get("executionId").equals(execution1.getId()))).isTrue();
assertThat(flowListeners.stream().anyMatch(e -> e.getOutputs().get("status").equals("QUEUED") && e.getOutputs().get("executionId").equals(execution2.getId()))).isTrue();
assertThat(flowListeners.stream().anyMatch(e -> e.getOutputs().get("status").equals("RUNNING") && e.getOutputs().get("executionId").equals(execution2.getId()))).isTrue();
assertThat(flowListeners.stream().anyMatch(e -> e.getOutputs().get("status").equals("SUCCESS") && e.getOutputs().get("executionId").equals(execution2.getId()))).isTrue();
}
}


@@ -17,9 +17,6 @@ import io.kestra.core.storages.StorageInterface;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;
import org.junit.jupiter.api.Test;
import jakarta.validation.ConstraintViolationException;
@@ -39,7 +36,6 @@ import java.util.concurrent.TimeoutException;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertThrows;
import static org.junit.jupiter.api.Assertions.assertTrue;
@KestraTest(startRunner = true)
public class InputsTest {
@@ -48,7 +44,7 @@ public class InputsTest {
private QueueInterface<LogEntry> logQueue;
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
public static Map<String, Object> inputs = ImmutableMap.<String, Object>builder()
.put("string", "myString")
@@ -94,8 +90,8 @@ public class InputsTest {
@Inject
private FlowInputOutput flowInputOutput;
private Map<String, Object> typedInputs(Map<String, Object> map, String tenantId) {
return typedInputs(map, flowRepository.findById(tenantId, "io.kestra.tests", "inputs").get());
private Map<String, Object> typedInputs(Map<String, Object> map) {
return typedInputs(map, flowRepository.findById(MAIN_TENANT, "io.kestra.tests", "inputs").get());
}
private Map<String, Object> typedInputs(Map<String, Object> map, Flow flow) {
@@ -104,7 +100,7 @@ public class InputsTest {
Execution.builder()
.id("test")
.namespace(flow.getNamespace())
.tenantId(flow.getTenantId())
.tenantId(MAIN_TENANT)
.flowRevision(1)
.flowId(flow.getId())
.build(),
@@ -117,25 +113,25 @@ public class InputsTest {
void missingRequired() {
HashMap<String, Object> inputs = new HashMap<>(InputsTest.inputs);
inputs.put("string", null);
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(inputs, MAIN_TENANT));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(inputs));
assertThat(e.getMessage()).contains("Invalid input for `string`, missing required input, but received `null`");
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant")
@LoadFlows({"flows/valids/inputs.yaml"})
void nonRequiredNoDefaultNoValueIsNull() {
HashMap<String, Object> inputsWithMissingOptionalInput = new HashMap<>(inputs);
inputsWithMissingOptionalInput.remove("bool");
assertThat(typedInputs(inputsWithMissingOptionalInput, "tenant").containsKey("bool")).isTrue();
assertThat(typedInputs(inputsWithMissingOptionalInput, "tenant").get("bool")).isNull();
assertThat(typedInputs(inputsWithMissingOptionalInput).containsKey("bool")).isTrue();
assertThat(typedInputs(inputsWithMissingOptionalInput).get("bool")).isNull();
}
@SuppressWarnings("unchecked")
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant1")
@LoadFlows({"flows/valids/inputs.yaml"})
void allValidInputs() throws URISyntaxException, IOException {
Map<String, Object> typeds = typedInputs(inputs, "tenant1");
Map<String, Object> typeds = typedInputs(inputs);
assertThat(typeds.get("string")).isEqualTo("myString");
assertThat(typeds.get("int")).isEqualTo(42);
@@ -147,7 +143,7 @@ public class InputsTest {
assertThat(typeds.get("time")).isEqualTo(LocalTime.parse("18:27:49"));
assertThat(typeds.get("duration")).isEqualTo(Duration.parse("PT5M6S"));
assertThat((URI) typeds.get("file")).isEqualTo(new URI("kestra:///io/kestra/tests/inputs/executions/test/inputs/file/application-test.yml"));
assertThat(CharStreams.toString(new InputStreamReader(storageInterface.get("tenant1", null, (URI) typeds.get("file"))))).isEqualTo(CharStreams.toString(new InputStreamReader(new FileInputStream((String) inputs.get("file")))));
assertThat(CharStreams.toString(new InputStreamReader(storageInterface.get(MAIN_TENANT, null, (URI) typeds.get("file"))))).isEqualTo(CharStreams.toString(new InputStreamReader(new FileInputStream((String) inputs.get("file")))));
assertThat(typeds.get("json")).isEqualTo(Map.of("a", "b"));
assertThat(typeds.get("uri")).isEqualTo("https://www.google.com");
assertThat(((Map<String, Object>) typeds.get("nested")).get("string")).isEqualTo("a string");
@@ -170,9 +166,9 @@ public class InputsTest {
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant2")
@LoadFlows({"flows/valids/inputs.yaml"})
void allValidTypedInputs() {
Map<String, Object> typeds = typedInputs(inputs, "tenant2");
Map<String, Object> typeds = typedInputs(inputs);
typeds.put("int", 42);
typeds.put("float", 42.42F);
typeds.put("bool", false);
@@ -185,10 +181,10 @@ public class InputsTest {
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant3")
@LoadFlows({"flows/valids/inputs.yaml"})
void inputFlow() throws TimeoutException, QueueException {
Execution execution = runnerUtils.runOne(
"tenant3",
MAIN_TENANT,
"io.kestra.tests",
"inputs",
null,
@@ -205,165 +201,165 @@ public class InputsTest {
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant4")
@LoadFlows({"flows/valids/inputs.yaml"})
void inputValidatedStringBadValue() {
HashMap<String, Object> map = new HashMap<>(inputs);
map.put("validatedString", "foo");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map, "tenant4"));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map));
assertThat(e.getMessage()).contains("Invalid input for `validatedString`, it must match the pattern");
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant5")
@LoadFlows({"flows/valids/inputs.yaml"})
void inputValidatedIntegerBadValue() {
HashMap<String, Object> mapMin = new HashMap<>(inputs);
mapMin.put("validatedInt", "9");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin, "tenant5"));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin));
assertThat(e.getMessage()).contains("Invalid input for `validatedInt`, it must be more than `10`, but received `9`");
HashMap<String, Object> mapMax = new HashMap<>(inputs);
mapMax.put("validatedInt", "21");
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax, "tenant5"));
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax));
assertThat(e.getMessage()).contains("Invalid input for `validatedInt`, it must be less than `20`, but received `21`");
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant6")
@LoadFlows({"flows/valids/inputs.yaml"})
void inputValidatedDateBadValue() {
HashMap<String, Object> mapMin = new HashMap<>(inputs);
mapMin.put("validatedDate", "2022-01-01");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin, "tenant6"));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin));
assertThat(e.getMessage()).contains("Invalid input for `validatedDate`, it must be after `2023-01-01`, but received `2022-01-01`");
HashMap<String, Object> mapMax = new HashMap<>(inputs);
mapMax.put("validatedDate", "2024-01-01");
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax, "tenant6"));
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax));
assertThat(e.getMessage()).contains("Invalid input for `validatedDate`, it must be before `2023-12-31`, but received `2024-01-01`");
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant7")
@LoadFlows({"flows/valids/inputs.yaml"})
void inputValidatedDateTimeBadValue() {
HashMap<String, Object> mapMin = new HashMap<>(inputs);
mapMin.put("validatedDateTime", "2022-01-01T00:00:00Z");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin, "tenant7"));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin));
assertThat(e.getMessage()).contains("Invalid input for `validatedDateTime`, it must be after `2023-01-01T00:00:00Z`, but received `2022-01-01T00:00:00Z`");
HashMap<String, Object> mapMax = new HashMap<>(inputs);
mapMax.put("validatedDateTime", "2024-01-01T00:00:00Z");
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax, "tenant7"));
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax));
assertThat(e.getMessage()).contains("Invalid input for `validatedDateTime`, it must be before `2023-12-31T23:59:59Z`");
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant8")
@LoadFlows({"flows/valids/inputs.yaml"})
void inputValidatedDurationBadValue() {
HashMap<String, Object> mapMin = new HashMap<>(inputs);
mapMin.put("validatedDuration", "PT1S");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin, "tenant8"));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin));
assertThat(e.getMessage()).contains("Invalid input for `validatedDuration`, It must be more than `PT10S`, but received `PT1S`");
HashMap<String, Object> mapMax = new HashMap<>(inputs);
mapMax.put("validatedDuration", "PT30S");
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax, "tenant8"));
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax));
assertThat(e.getMessage()).contains("Invalid input for `validatedDuration`, It must be less than `PT20S`, but received `PT30S`");
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant9")
@LoadFlows({"flows/valids/inputs.yaml"})
void inputValidatedFloatBadValue() {
HashMap<String, Object> mapMin = new HashMap<>(inputs);
mapMin.put("validatedFloat", "0.01");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin, "tenant9"));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin));
assertThat(e.getMessage()).contains("Invalid input for `validatedFloat`, it must be more than `0.1`, but received `0.01`");
HashMap<String, Object> mapMax = new HashMap<>(inputs);
mapMax.put("validatedFloat", "1.01");
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax, "tenant9"));
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax));
assertThat(e.getMessage()).contains("Invalid input for `validatedFloat`, it must be less than `0.5`, but received `1.01`");
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant10")
@LoadFlows({"flows/valids/inputs.yaml"})
void inputValidatedTimeBadValue() {
HashMap<String, Object> mapMin = new HashMap<>(inputs);
mapMin.put("validatedTime", "00:00:01");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin, "tenant10"));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMin));
assertThat(e.getMessage()).contains("Invalid input for `validatedTime`, it must be after `01:00`, but received `00:00:01`");
HashMap<String, Object> mapMax = new HashMap<>(inputs);
mapMax.put("validatedTime", "14:00:00");
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax, "tenant10"));
e = assertThrows(ConstraintViolationException.class, () -> typedInputs(mapMax));
assertThat(e.getMessage()).contains("Invalid input for `validatedTime`, it must be before `11:59:59`, but received `14:00:00`");
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant11")
@LoadFlows({"flows/valids/inputs.yaml"})
void inputFailed() {
HashMap<String, Object> map = new HashMap<>(inputs);
map.put("uri", "http:/bla");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map, "tenant11"));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map));
assertThat(e.getMessage()).contains("Invalid input for `uri`, Expected `URI` but received `http:/bla`, but received `http:/bla`");
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant12")
@LoadFlows({"flows/valids/inputs.yaml"})
void inputEnumFailed() {
HashMap<String, Object> map = new HashMap<>(inputs);
map.put("enum", "INVALID");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map, "tenant12"));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map));
assertThat(e.getMessage()).isEqualTo("enum: Invalid input for `enum`, it must match the values `[ENUM_VALUE, OTHER_ONE]`, but received `INVALID`");
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant13")
@LoadFlows({"flows/valids/inputs.yaml"})
void inputArrayFailed() {
HashMap<String, Object> map = new HashMap<>(inputs);
map.put("array", "[\"s1\", \"s2\"]");
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map, "tenant13"));
ConstraintViolationException e = assertThrows(ConstraintViolationException.class, () -> typedInputs(map));
assertThat(e.getMessage()).contains("Invalid input for `array`, Unable to parse array element as `INT` on `s1`, but received `[\"s1\", \"s2\"]`");
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant14")
@LoadFlows({"flows/valids/inputs.yaml"})
void inputEmptyJson() {
HashMap<String, Object> map = new HashMap<>(inputs);
map.put("json", "{}");
Map<String, Object> typeds = typedInputs(map, "tenant14");
Map<String, Object> typeds = typedInputs(map);
assertThat(typeds.get("json")).isInstanceOf(Map.class);
assertThat(((Map<?, ?>) typeds.get("json")).size()).isZero();
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant15")
@LoadFlows({"flows/valids/inputs.yaml"})
void inputEmptyJsonFlow() throws TimeoutException, QueueException {
HashMap<String, Object> map = new HashMap<>(inputs);
map.put("json", "{}");
Execution execution = runnerUtils.runOne(
"tenant15",
MAIN_TENANT,
"io.kestra.tests",
"inputs",
null,
@@ -379,20 +375,12 @@ public class InputsTest {
}
@Test
@LoadFlows(value = {"flows/valids/input-log-secret.yaml"}, tenantId = "tenant16")
void shouldNotLogSecretInput() throws TimeoutException, QueueException, InterruptedException {
AtomicReference<LogEntry> logEntry = new AtomicReference<>();
CountDownLatch countDownLatch = new CountDownLatch(1);
Flux<LogEntry> receive = TestsUtils.receive(logQueue, l -> {
LogEntry left = l.getLeft();
if (left.getTenantId().equals("tenant16")){
logEntry.set(left);
countDownLatch.countDown();
}
});
@LoadFlows({"flows/valids/input-log-secret.yaml"})
void shouldNotLogSecretInput() throws TimeoutException, QueueException {
Flux<LogEntry> receive = TestsUtils.receive(logQueue, l -> {});
Execution execution = runnerUtils.runOne(
"tenant16",
MAIN_TENANT,
"io.kestra.tests",
"input-log-secret",
null,
@@ -402,21 +390,20 @@ public class InputsTest {
assertThat(execution.getTaskRunList()).hasSize(1);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
receive.blockLast();
assertTrue(countDownLatch.await(10, TimeUnit.SECONDS));
assertThat(logEntry.get()).isNotNull();
assertThat(logEntry.get().getMessage()).isEqualTo("These are my secrets: ****** - ******");
var logEntry = receive.blockLast();
assertThat(logEntry).isNotNull();
assertThat(logEntry.getMessage()).isEqualTo("These are my secrets: ****** - ******");
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant17")
@LoadFlows({"flows/valids/inputs.yaml"})
void fileInputWithFileDefault() throws IOException, QueueException, TimeoutException {
HashMap<String, Object> newInputs = new HashMap<>(InputsTest.inputs);
URI file = createFile();
newInputs.put("file", file);
Execution execution = runnerUtils.runOne(
"tenant17",
MAIN_TENANT,
"io.kestra.tests",
"inputs",
null,
@@ -428,14 +415,14 @@ public class InputsTest {
}
@Test
@LoadFlows(value = {"flows/valids/inputs.yaml"}, tenantId = "tenant18")
@LoadFlows({"flows/valids/inputs.yaml"})
void fileInputWithNsfile() throws IOException, QueueException, TimeoutException {
HashMap<String, Object> inputs = new HashMap<>(InputsTest.inputs);
URI file = createNsFile(false, "tenant18");
URI file = createNsFile(false);
inputs.put("file", file);
Execution execution = runnerUtils.runOne(
"tenant18",
MAIN_TENANT,
"io.kestra.tests",
"inputs",
null,
@@ -452,11 +439,11 @@ public class InputsTest {
return tempFile.toPath().toUri();
}
private URI createNsFile(boolean nsInAuthority, String tenantId) throws IOException {
private URI createNsFile(boolean nsInAuthority) throws IOException {
String namespace = "io.kestra.tests";
String filePath = "file.txt";
storageInterface.createDirectory(tenantId, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace)));
storageInterface.put(tenantId, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace) + "/" + filePath), new ByteArrayInputStream("Hello World".getBytes()));
storageInterface.createDirectory(MAIN_TENANT, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace)));
storageInterface.put(MAIN_TENANT, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace) + "/" + filePath), new ByteArrayInputStream("Hello World".getBytes()));
return URI.create("nsfile://" + (nsInAuthority ? namespace : "") + "/" + filePath);
}
}


@@ -14,17 +14,15 @@ import java.io.IOException;
import java.net.URISyntaxException;
import java.util.Objects;
import java.util.concurrent.TimeoutException;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
@org.junit.jupiter.api.parallel.Execution(ExecutionMode.SAME_THREAD)
@KestraTest(startRunner = true)
class ListenersTest {
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
@Inject
private LocalFlowRepositoryLoader repositoryLoader;


@@ -1,168 +1,243 @@
package io.kestra.core.runners;
import io.kestra.core.models.flows.State.Type;
import io.kestra.core.queues.QueueException;
import io.kestra.core.repositories.ArrayListTotal;
import io.kestra.core.repositories.ExecutionRepositoryInterface;
import io.kestra.core.utils.TestsUtils;
import io.micronaut.context.ApplicationContext;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.State;
import io.kestra.core.queues.QueueFactoryInterface;
import io.kestra.core.queues.QueueInterface;
import io.kestra.core.repositories.FlowRepositoryInterface;
import io.micronaut.data.model.Pageable;
import java.time.Duration;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import java.util.concurrent.atomic.AtomicReference;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import jakarta.inject.Singleton;
import reactor.core.publisher.Flux;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertTrue;
@Singleton
public class MultipleConditionTriggerCaseTest {
public static final String NAMESPACE = "io.kestra.tests.trigger";
@Inject
@Named(QueueFactoryInterface.EXECUTION_NAMED)
protected QueueInterface<Execution> executionQueue;
@Inject
protected TestRunnerUtils runnerUtils;
protected RunnerUtils runnerUtils;
@Inject
protected FlowRepositoryInterface flowRepository;
@Inject
protected ExecutionRepositoryInterface executionRepository;
@Inject
protected ApplicationContext applicationContext;
public void trigger() throws InterruptedException, TimeoutException, QueueException {
CountDownLatch countDownLatch = new CountDownLatch(3);
ConcurrentHashMap<String, Execution> ended = new ConcurrentHashMap<>();
List<String> watchedExecutions = List.of("trigger-multiplecondition-flow-a",
"trigger-multiplecondition-flow-b",
"trigger-multiplecondition-listener"
);
Flux<Execution> receive = TestsUtils.receive(executionQueue, either -> {
Execution execution = either.getLeft();
if (watchedExecutions.contains(execution.getFlowId()) && execution.getState().getCurrent() == State.Type.SUCCESS) {
ended.put(execution.getId(), execution);
countDownLatch.countDown();
}
});
// first one
Execution execution = runnerUtils.runOne(MAIN_TENANT, NAMESPACE, "trigger-multiplecondition-flow-a");
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests.trigger",
"trigger-multiplecondition-flow-a", Duration.ofSeconds(60));
assertThat(execution.getTaskRunList().size()).isEqualTo(1);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
// wait a little to be sure that the trigger is not launching execution
Thread.sleep(1000);
ArrayListTotal<Execution> flowBExecutions = executionRepository.findByFlowId(MAIN_TENANT,
NAMESPACE, "trigger-multiplecondition-flow-b", Pageable.UNPAGED);
ArrayListTotal<Execution> listenerExecutions = executionRepository.findByFlowId(MAIN_TENANT,
NAMESPACE, "trigger-multiplecondition-listener", Pageable.UNPAGED);
assertThat(flowBExecutions).isEmpty();
assertThat(listenerExecutions).isEmpty();
assertThat(ended.size()).isEqualTo(1);
// second one
execution = runnerUtils.runOne(MAIN_TENANT, NAMESPACE, "trigger-multiplecondition-flow-b");
execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests.trigger",
"trigger-multiplecondition-flow-b", Duration.ofSeconds(60));
assertThat(execution.getTaskRunList().size()).isEqualTo(1);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
// trigger is done
Execution triggerExecution = runnerUtils.awaitFlowExecution(
e -> e.getState().getCurrent().equals(Type.SUCCESS),
MAIN_TENANT, NAMESPACE, "trigger-multiplecondition-listener");
assertTrue(countDownLatch.await(10, TimeUnit.SECONDS));
receive.blockLast();
assertThat(ended.size()).isEqualTo(3);
Flow flow = flowRepository.findById(MAIN_TENANT, "io.kestra.tests.trigger",
"trigger-multiplecondition-listener").orElseThrow();
Execution triggerExecution = ended.entrySet()
.stream()
.filter(e -> e.getValue().getFlowId().equals(flow.getId()))
.findFirst()
.map(Map.Entry::getValue)
.orElseThrow();
assertThat(triggerExecution.getTaskRunList().size()).isEqualTo(1);
assertThat(triggerExecution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(triggerExecution.getTrigger().getVariables().get("executionId")).isEqualTo(execution.getId());
assertThat(triggerExecution.getTrigger().getVariables().get("namespace")).isEqualTo(
NAMESPACE);
assertThat(triggerExecution.getTrigger().getVariables().get("namespace")).isEqualTo("io.kestra.tests.trigger");
assertThat(triggerExecution.getTrigger().getVariables().get("flowId")).isEqualTo("trigger-multiplecondition-flow-b");
}
public void failed(String tenantId) throws InterruptedException, TimeoutException, QueueException {
public void failed() throws InterruptedException, TimeoutException, QueueException {
CountDownLatch countDownLatch = new CountDownLatch(1);
AtomicReference<Execution> listener = new AtomicReference<>();
Flux<Execution> receive = TestsUtils.receive(executionQueue, either -> {
Execution execution = either.getLeft();
if (execution.getFlowId().equals("trigger-flow-listener-namespace-condition")
&& execution.getState().getCurrent().isTerminated()) {
listener.set(execution);
countDownLatch.countDown();
}
});
// first one
Execution execution = runnerUtils.runOne(tenantId, NAMESPACE,
"trigger-multiplecondition-flow-c");
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests.trigger",
"trigger-multiplecondition-flow-c", Duration.ofSeconds(60));
assertThat(execution.getTaskRunList().size()).isEqualTo(1);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.FAILED);
// wait a little to be sure that the trigger is not launching execution
Thread.sleep(1000);
ArrayListTotal<Execution> byFlowId = executionRepository.findByFlowId(tenantId, NAMESPACE,
"trigger-multiplecondition-flow-d", Pageable.UNPAGED);
assertThat(byFlowId).isEmpty();
assertThat(listener.get()).isNull();
// second one
execution = runnerUtils.runOne(tenantId, NAMESPACE,
"trigger-multiplecondition-flow-d");
execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests.trigger",
"trigger-multiplecondition-flow-d", Duration.ofSeconds(60));
assertThat(execution.getTaskRunList().size()).isEqualTo(1);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
Execution triggerExecution = runnerUtils.awaitFlowExecution(
e -> e.getState().getCurrent().equals(Type.SUCCESS),
tenantId, NAMESPACE, "trigger-flow-listener-namespace-condition");
// trigger was not done
assertThat(triggerExecution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertTrue(countDownLatch.await(10, TimeUnit.SECONDS));
receive.blockLast();
assertThat(listener.get()).isNotNull();
assertThat(listener.get().getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
}
public void flowTriggerPreconditions() throws TimeoutException, QueueException {
public void flowTriggerPreconditions()
throws InterruptedException, TimeoutException, QueueException {
CountDownLatch countDownLatch = new CountDownLatch(1);
AtomicReference<Execution> flowTrigger = new AtomicReference<>();
Flux<Execution> receive = TestsUtils.receive(executionQueue, either -> {
Execution execution = either.getLeft();
if (execution.getState().getCurrent() == State.Type.SUCCESS && execution.getFlowId()
.equals("flow-trigger-preconditions-flow-listen")) {
flowTrigger.set(execution);
countDownLatch.countDown();
}
});
// flowA
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests.trigger.preconditions",
"flow-trigger-preconditions-flow-a");
"flow-trigger-preconditions-flow-a", Duration.ofSeconds(60));
assertThat(execution.getTaskRunList().size()).isEqualTo(1);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
// flowB: we trigger it two times, as flow-trigger-flow-preconditions-flow-listen is configured with resetOnSuccess: false it should be triggered two times
execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests.trigger.preconditions",
"flow-trigger-preconditions-flow-a");
"flow-trigger-preconditions-flow-a", Duration.ofSeconds(60));
assertThat(execution.getTaskRunList().size()).isEqualTo(1);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests.trigger.preconditions",
"flow-trigger-preconditions-flow-b");
"flow-trigger-preconditions-flow-b", Duration.ofSeconds(60));
assertThat(execution.getTaskRunList().size()).isEqualTo(1);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
// trigger is done
Execution triggerExecution = runnerUtils.awaitFlowExecution(
e -> e.getState().getCurrent().equals(Type.SUCCESS),
MAIN_TENANT, "io.kestra.tests.trigger.preconditions", "flow-trigger-preconditions-flow-listen");
assertTrue(countDownLatch.await(1, TimeUnit.SECONDS));
receive.blockLast();
assertThat(flowTrigger.get()).isNotNull();
Execution triggerExecution = flowTrigger.get();
assertThat(triggerExecution.getTaskRunList().size()).isEqualTo(1);
assertThat(triggerExecution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(triggerExecution.getTrigger().getVariables().get("outputs")).isNotNull();
assertThat((Map<String, Object>) triggerExecution.getTrigger().getVariables().get("outputs")).containsEntry("some", "value");
}
public void flowTriggerPreconditionsMergeOutputs(String tenantId) throws QueueException, TimeoutException {
public void flowTriggerPreconditionsMergeOutputs() throws QueueException, TimeoutException, InterruptedException {
// we do the same as in flowTriggerPreconditions() but we trigger flows in the opposite order to be sure that outputs are merged
CountDownLatch countDownLatch = new CountDownLatch(1);
AtomicReference<Execution> flowTrigger = new AtomicReference<>();
Flux<Execution> receive = TestsUtils.receive(executionQueue, either -> {
Execution execution = either.getLeft();
if (execution.getState().getCurrent() == State.Type.SUCCESS && execution.getFlowId()
.equals("flow-trigger-preconditions-flow-listen")) {
flowTrigger.set(execution);
countDownLatch.countDown();
}
});
// flowB
Execution execution = runnerUtils.runOne(tenantId, "io.kestra.tests.trigger.preconditions",
"flow-trigger-preconditions-flow-b");
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests.trigger.preconditions",
"flow-trigger-preconditions-flow-b", Duration.ofSeconds(60));
assertThat(execution.getTaskRunList().size()).isEqualTo(1);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
// flowA
execution = runnerUtils.runOne(tenantId, "io.kestra.tests.trigger.preconditions",
"flow-trigger-preconditions-flow-a");
execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests.trigger.preconditions",
"flow-trigger-preconditions-flow-a", Duration.ofSeconds(60));
assertThat(execution.getTaskRunList().size()).isEqualTo(1);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
// trigger is done
Execution triggerExecution = runnerUtils.awaitFlowExecution(
e -> e.getState().getCurrent().equals(Type.SUCCESS),
tenantId, "io.kestra.tests.trigger.preconditions", "flow-trigger-preconditions-flow-listen");
assertTrue(countDownLatch.await(1, TimeUnit.SECONDS));
receive.blockLast();
assertThat(flowTrigger.get()).isNotNull();
Execution triggerExecution = flowTrigger.get();
assertThat(triggerExecution.getTaskRunList().size()).isEqualTo(1);
assertThat(triggerExecution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(triggerExecution.getTrigger().getVariables().get("outputs")).isNotNull();
assertThat((Map<String, Object>) triggerExecution.getTrigger().getVariables().get("outputs")).containsEntry("some", "value");
}
public void flowTriggerOnPaused() throws TimeoutException, QueueException {
public void flowTriggerOnPaused()
throws InterruptedException, TimeoutException, QueueException {
CountDownLatch countDownLatch = new CountDownLatch(1);
AtomicReference<Execution> flowTrigger = new AtomicReference<>();
Flux<Execution> receive = TestsUtils.receive(executionQueue, either -> {
Execution execution = either.getLeft();
if (execution.getState().getCurrent() == State.Type.SUCCESS && execution.getFlowId()
.equals("flow-trigger-paused-listen")) {
flowTrigger.set(execution);
countDownLatch.countDown();
}
});
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests.trigger.paused",
"flow-trigger-paused-flow");
"flow-trigger-paused-flow", Duration.ofSeconds(60));
assertThat(execution.getTaskRunList().size()).isEqualTo(2);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
// trigger is done
Execution triggerExecution = runnerUtils.awaitFlowExecution(
e -> e.getState().getCurrent().equals(Type.SUCCESS),
MAIN_TENANT, "io.kestra.tests.trigger.paused", "flow-trigger-paused-listen");
assertTrue(countDownLatch.await(1, TimeUnit.SECONDS));
receive.blockLast();
assertThat(flowTrigger.get()).isNotNull();
Execution triggerExecution = flowTrigger.get();
assertThat(triggerExecution.getTaskRunList().size()).isEqualTo(1);
assertThat(triggerExecution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
}
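Note: throughout this file the diff swaps between two wait styles — a one-call runnerUtils.awaitFlowExecution(predicate, tenant, namespace, flowId) and an explicit execution-queue listener. A condensed sketch of the listener style exactly as it appears in the hunks above (the flow id is one of the listeners used in this class; flow-specific assertions are omitted):
CountDownLatch latch = new CountDownLatch(1);
AtomicReference<Execution> listener = new AtomicReference<>();
// subscribe before producing, so the terminal state cannot be missed
Flux<Execution> receive = TestsUtils.receive(executionQueue, either -> {
    Execution e = either.getLeft();
    if (e.getFlowId().equals("trigger-multiplecondition-listener")
        && e.getState().getCurrent() == State.Type.SUCCESS) {
        listener.set(e);
        latch.countDown();
    }
});
// ... run the producer flows with runnerUtils.runOne(...) ...
assertTrue(latch.await(10, TimeUnit.SECONDS));
receive.blockLast();                             // drain the subscription once the expected execution was seen
assertThat(listener.get()).isNotNull();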

View File

@@ -30,7 +30,7 @@ import static org.assertj.core.api.Assertions.assertThat;
@Singleton
public class PluginDefaultsCaseTest {
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
public void taskDefaults() throws TimeoutException, QueueException {
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "plugin-defaults", Duration.ofSeconds(60));

View File

@@ -4,19 +4,29 @@ import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.executions.TaskRun;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.State;
import io.kestra.core.models.flows.State.Type;
import io.kestra.core.queues.QueueException;
import io.kestra.core.queues.QueueFactoryInterface;
import io.kestra.core.queues.QueueInterface;
import io.kestra.core.repositories.FlowRepositoryInterface;
import io.kestra.core.services.ExecutionService;
import java.time.Duration;
import java.util.List;
import java.util.Optional;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import java.util.concurrent.atomic.AtomicReference;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import jakarta.inject.Singleton;
import reactor.core.publisher.Flux;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static io.kestra.core.utils.Rethrow.throwRunnable;
import static org.junit.jupiter.api.Assertions.assertTrue;
@Singleton
@@ -25,30 +35,38 @@ public class RestartCaseTest {
private FlowRepositoryInterface flowRepository;
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
@Inject
private ExecutionService executionService;
@Inject
@Named(QueueFactoryInterface.EXECUTION_NAMED)
private QueueInterface<Execution> executionQueue;
public void restartFailedThenSuccess() throws Exception {
Flow flow = flowRepository.findById(MAIN_TENANT, "io.kestra.tests", "restart_last_failed").orElseThrow();
Execution firstExecution = runnerUtils.runOne(MAIN_TENANT, flow.getNamespace(), flow.getId());
Execution firstExecution = runnerUtils.runOne(MAIN_TENANT, flow.getNamespace(), flow.getId(), Duration.ofSeconds(60));
assertThat(firstExecution.getState().getCurrent()).isEqualTo(State.Type.FAILED);
assertThat(firstExecution.getTaskRunList()).hasSize(3);
assertThat(firstExecution.getTaskRunList().get(2).getState().getCurrent()).isEqualTo(State.Type.FAILED);
// wait
Execution restartedExec = executionService.restart(firstExecution, null);
assertThat(restartedExec).isNotNull();
assertThat(restartedExec.getId()).isEqualTo(firstExecution.getId());
assertThat(restartedExec.getParentId()).isNull();
assertThat(restartedExec.getTaskRunList().size()).isEqualTo(3);
assertThat(restartedExec.getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
Execution finishedRestartedExecution = runnerUtils.emitAndAwaitExecution(
Execution finishedRestartedExecution = runnerUtils.awaitExecution(
execution -> execution.getState().getCurrent() == State.Type.SUCCESS && execution.getId().equals(firstExecution.getId()),
restartedExec
throwRunnable(() -> {
Execution restartedExec = executionService.restart(firstExecution, null);
assertThat(restartedExec).isNotNull();
assertThat(restartedExec.getId()).isEqualTo(firstExecution.getId());
assertThat(restartedExec.getParentId()).isNull();
assertThat(restartedExec.getTaskRunList().size()).isEqualTo(3);
assertThat(restartedExec.getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
executionQueue.emit(restartedExec);
}),
Duration.ofSeconds(60)
);
assertThat(finishedRestartedExecution).isNotNull();
@@ -75,16 +93,19 @@ public class RestartCaseTest {
assertThat(firstExecution.getTaskRunList().getFirst().getState().getCurrent()).isEqualTo(State.Type.FAILED);
// wait
Execution restartedExec = executionService.restart(firstExecution, null);
assertThat(restartedExec).isNotNull();
assertThat(restartedExec.getId()).isEqualTo(firstExecution.getId());
assertThat(restartedExec.getParentId()).isNull();
assertThat(restartedExec.getTaskRunList().size()).isEqualTo(1);
assertThat(restartedExec.getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
Execution finishedRestartedExecution = runnerUtils.emitAndAwaitExecution(
Execution finishedRestartedExecution = runnerUtils.awaitExecution(
execution -> execution.getState().getCurrent() == State.Type.FAILED && execution.getTaskRunList().getFirst().getAttempts().size() == 2,
restartedExec
throwRunnable(() -> {
Execution restartedExec = executionService.restart(firstExecution, null);
executionQueue.emit(restartedExec);
assertThat(restartedExec).isNotNull();
assertThat(restartedExec.getId()).isEqualTo(firstExecution.getId());
assertThat(restartedExec.getParentId()).isNull();
assertThat(restartedExec.getTaskRunList().size()).isEqualTo(1);
assertThat(restartedExec.getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
}),
Duration.ofSeconds(60)
);
assertThat(finishedRestartedExecution).isNotNull();
@@ -107,16 +128,19 @@ public class RestartCaseTest {
assertThat(firstExecution.getTaskRunList().get(3).getState().getCurrent()).isEqualTo(State.Type.FAILED);
// wait
Execution restartedExec = executionService.restart(firstExecution, null);
assertThat(restartedExec).isNotNull();
assertThat(restartedExec.getId()).isEqualTo(firstExecution.getId());
assertThat(restartedExec.getParentId()).isNull();
assertThat(restartedExec.getTaskRunList().size()).isEqualTo(4);
assertThat(restartedExec.getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
Execution finishedRestartedExecution = runnerUtils.emitAndAwaitExecution(
Execution finishedRestartedExecution = runnerUtils.awaitExecution(
execution -> execution.getState().getCurrent() == State.Type.FAILED && execution.findTaskRunsByTaskId("failStep").stream().findFirst().get().getAttempts().size() == 2,
restartedExec
throwRunnable(() -> {
Execution restartedExec = executionService.restart(firstExecution, null);
executionQueue.emit(restartedExec);
assertThat(restartedExec).isNotNull();
assertThat(restartedExec.getId()).isEqualTo(firstExecution.getId());
assertThat(restartedExec.getParentId()).isNull();
assertThat(restartedExec.getTaskRunList().size()).isEqualTo(4);
assertThat(restartedExec.getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
}),
Duration.ofSeconds(60)
);
assertThat(finishedRestartedExecution).isNotNull();
@@ -139,19 +163,21 @@ public class RestartCaseTest {
assertThat(firstExecution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
// wait
Execution restartedExec = executionService.replay(firstExecution, firstExecution.findTaskRunByTaskIdAndValue("2_end", List.of()).getId(), null);
assertThat(restartedExec.getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
assertThat(restartedExec.getState().getHistories()).hasSize(4);
assertThat(restartedExec.getTaskRunList()).hasSize(20);
assertThat(restartedExec.getTaskRunList().get(19).getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
assertThat(restartedExec.getId()).isNotEqualTo(firstExecution.getId());
assertThat(restartedExec.getTaskRunList().get(1).getId()).isNotEqualTo(firstExecution.getTaskRunList().get(1).getId());
Execution finishedRestartedExecution = runnerUtils.awaitChildExecution(
flow,
firstExecution,
restartedExec,
throwRunnable(() -> {
Execution restartedExec = executionService.replay(firstExecution, firstExecution.findTaskRunByTaskIdAndValue("2_end", List.of()).getId(), null);
executionQueue.emit(restartedExec);
assertThat(restartedExec.getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
assertThat(restartedExec.getState().getHistories()).hasSize(4);
assertThat(restartedExec.getTaskRunList()).hasSize(20);
assertThat(restartedExec.getTaskRunList().get(19).getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
assertThat(restartedExec.getId()).isNotEqualTo(firstExecution.getId());
assertThat(restartedExec.getTaskRunList().get(1).getId()).isNotEqualTo(firstExecution.getTaskRunList().get(1).getId());
}),
Duration.ofSeconds(60)
);
@@ -169,58 +195,71 @@ public class RestartCaseTest {
Execution restart = executionService.restart(execution, null);
assertThat(restart.getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
Execution restartEnded = runnerUtils.emitAndAwaitExecution(
Execution restartEnded = runnerUtils.awaitExecution(
e -> e.getState().getCurrent() == State.Type.FAILED,
restart,
Duration.ofSeconds(60)
throwRunnable(() -> executionQueue.emit(restart)),
Duration.ofSeconds(120)
);
assertThat(restartEnded.getState().getCurrent()).isEqualTo(State.Type.FAILED);
Execution newRestart = executionService.restart(restartEnded, null);
restartEnded = runnerUtils.emitAndAwaitExecution(
restartEnded = runnerUtils.awaitExecution(
e -> e.getState().getCurrent() == State.Type.FAILED,
newRestart,
Duration.ofSeconds(60)
throwRunnable(() -> executionQueue.emit(newRestart)),
Duration.ofSeconds(120)
);
assertThat(restartEnded.getState().getCurrent()).isEqualTo(State.Type.FAILED);
}
public void restartSubflow() throws Exception {
CountDownLatch countDownLatch = new CountDownLatch(1);
Flux<Execution> receiveSubflows = TestsUtils.receive(executionQueue, either -> {
Execution subflowExecution = either.getLeft();
if (subflowExecution.getFlowId().equals("restart-child") && subflowExecution.getState().getCurrent().isFailed()) {
countDownLatch.countDown();
}
});
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "restart-parent");
assertThat(execution.getTaskRunList()).hasSize(3);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.FAILED);
// here we must have 1 failed subflows
runnerUtils.awaitFlowExecution(e -> e.getState().getCurrent().isFailed(), MAIN_TENANT, "io.kestra.tests", "restart-child");
assertTrue(countDownLatch.await(1, TimeUnit.MINUTES));
receiveSubflows.blockLast();
// there is 3 values so we must restart it 3 times to end the 3 subflows
CountDownLatch successLatch = new CountDownLatch(3);
receiveSubflows = TestsUtils.receive(executionQueue, either -> {
Execution subflowExecution = either.getLeft();
if (subflowExecution.getFlowId().equals("restart-child") && subflowExecution.getState().getCurrent().isSuccess()) {
successLatch.countDown();
}
});
Execution restarted1 = executionService.restart(execution, null);
execution = runnerUtils.emitAndAwaitExecution(
execution = runnerUtils.awaitExecution(
e -> e.getState().getCurrent() == State.Type.FAILED && e.getFlowId().equals("restart-parent"),
restarted1,
throwRunnable(() -> executionQueue.emit(restarted1)),
Duration.ofSeconds(10)
);
Execution restarted2 = executionService.restart(execution, null);
execution = runnerUtils.emitAndAwaitExecution(
execution = runnerUtils.awaitExecution(
e -> e.getState().getCurrent() == State.Type.FAILED && e.getFlowId().equals("restart-parent"),
restarted2,
throwRunnable(() -> executionQueue.emit(restarted2)),
Duration.ofSeconds(10)
);
Execution restarted3 = executionService.restart(execution, null);
execution = runnerUtils.emitAndAwaitExecution(
execution = runnerUtils.awaitExecution(
e -> e.getState().getCurrent() == State.Type.SUCCESS && e.getFlowId().equals("restart-parent"),
restarted3,
throwRunnable(() -> executionQueue.emit(restarted3)),
Duration.ofSeconds(10)
);
assertThat(execution.getTaskRunList()).hasSize(6);
List<Execution> childExecutions = runnerUtils.awaitFlowExecutionNumber(3, MAIN_TENANT, "io.kestra.tests", "restart-child");
List<Execution> successfulRestart = childExecutions.stream()
.filter(e -> e.getState().getCurrent().equals(Type.SUCCESS)).toList();
assertThat(successfulRestart).hasSize(3);
assertTrue(successLatch.await(1, TimeUnit.MINUTES));
receiveSubflows.blockLast();
}
public void restartFailedWithFinally() throws Exception {
@@ -233,15 +272,18 @@ public class RestartCaseTest {
assertThat(firstExecution.getTaskRunList().get(1).getState().getCurrent()).isEqualTo(State.Type.FAILED);
// wait
Execution restartedExec = executionService.restart(firstExecution, null);
assertThat(restartedExec).isNotNull();
assertThat(restartedExec.getId()).isEqualTo(firstExecution.getId());
assertThat(restartedExec.getParentId()).isNull();
assertThat(restartedExec.getTaskRunList().size()).isEqualTo(2);
assertThat(restartedExec.getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
Execution finishedRestartedExecution = runnerUtils.emitAndAwaitExecution(
execution -> executionService.isTerminated(flow, execution) && execution.getState().isSuccess(),
restartedExec,
Execution finishedRestartedExecution = runnerUtils.awaitExecution(
execution -> executionService.isTerminated(flow, execution) && execution.getState().isSuccess() && execution.getId().equals(firstExecution.getId()),
throwRunnable(() -> {
Execution restartedExec = executionService.restart(firstExecution, null);
assertThat(restartedExec).isNotNull();
assertThat(restartedExec.getId()).isEqualTo(firstExecution.getId());
assertThat(restartedExec.getParentId()).isNull();
assertThat(restartedExec.getTaskRunList().size()).isEqualTo(2);
assertThat(restartedExec.getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
executionQueue.emit(restartedExec);
}),
Duration.ofSeconds(60)
);
@@ -267,18 +309,21 @@ public class RestartCaseTest {
assertThat(firstExecution.getTaskRunList().get(1).getState().getCurrent()).isEqualTo(State.Type.FAILED);
// wait
Execution restartedExec = executionService.restart(firstExecution, null);
assertThat(restartedExec).isNotNull();
assertThat(restartedExec.getId()).isEqualTo(firstExecution.getId());
assertThat(restartedExec.getParentId()).isNull();
assertThat(restartedExec.getTaskRunList().size()).isEqualTo(2);
assertThat(restartedExec.getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
Execution finishedRestartedExecution = runnerUtils.awaitExecution(
execution -> executionService.isTerminated(flow, execution) && execution.getState().isSuccess() && execution.getId().equals(firstExecution.getId()),
throwRunnable(() -> {
Execution restartedExec = executionService.restart(firstExecution, null);
assertThat(restartedExec).isNotNull();
assertThat(restartedExec.getId()).isEqualTo(firstExecution.getId());
assertThat(restartedExec.getParentId()).isNull();
assertThat(restartedExec.getTaskRunList().size()).isEqualTo(2);
assertThat(restartedExec.getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
Execution finishedRestartedExecution = runnerUtils.emitAndAwaitExecution(
execution -> executionService.isTerminated(flow, execution) && execution.getState().isSuccess(),
restartedExec,
executionQueue.emit(restartedExec);
}),
Duration.ofSeconds(60)
);
assertThat(finishedRestartedExecution).isNotNull();
assertThat(finishedRestartedExecution.getId()).isEqualTo(firstExecution.getId());
assertThat(finishedRestartedExecution.getParentId()).isNull();
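Note: the recurring change in RestartCaseTest is the wait primitive — emitAndAwaitExecution(predicate, execution, timeout) on one side, awaitExecution(predicate, action, timeout) with the restart wrapped in throwRunnable(...) on the other. The shape of the latter, condensed from the hunks above (flow-specific assertions omitted):
Execution finished = runnerUtils.awaitExecution(
    e -> e.getState().getCurrent() == State.Type.SUCCESS && e.getId().equals(firstExecution.getId()),
    throwRunnable(() -> {
        // the restart is created and emitted only after the await subscription is in place
        Execution restarted = executionService.restart(firstExecution, null);
        assertThat(restarted.getState().getCurrent()).isEqualTo(State.Type.RESTARTED);
        executionQueue.emit(restarted);
    }),
    Duration.ofSeconds(60)
);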

View File

@@ -11,7 +11,6 @@ import jakarta.inject.Inject;
import jakarta.inject.Named;
import org.junit.jupiter.api.RepeatedTest;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.parallel.ExecutionMode;
import org.slf4j.Logger;
import org.slf4j.event.Level;
import reactor.core.publisher.Flux;
@@ -26,7 +25,6 @@ import java.util.concurrent.CopyOnWriteArrayList;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
@org.junit.jupiter.api.parallel.Execution(ExecutionMode.SAME_THREAD)
class RunContextLoggerTest {
@Inject
@Named(QueueFactoryInterface.WORKERTASKLOG_NAMED)

View File

@@ -98,7 +98,7 @@ class RunContextTest {
private FlowInputOutput flowIO;
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
@Inject
protected LocalFlowRepositoryLoader repositoryLoader;

View File

@@ -16,7 +16,7 @@ import static org.assertj.core.api.Assertions.assertThat;
@Singleton
public class SLATestCase {
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
public void maxDurationSLAShouldFail() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "sla-max-duration-fail");
@@ -36,14 +36,14 @@ public class SLATestCase {
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
}
public void executionConditionSLAShouldCancel(String tenantId) throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(tenantId, "io.kestra.tests", "sla-execution-condition", null, (f, e) -> Map.of("string", "CANCEL"));
public void executionConditionSLAShouldCancel() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "sla-execution-condition", null, (f, e) -> Map.of("string", "CANCEL"));
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.CANCELLED);
}
public void executionConditionSLAShouldLabel(String tenantId) throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(tenantId, "io.kestra.tests", "sla-execution-condition", null, (f, e) -> Map.of("string", "LABEL"));
public void executionConditionSLAShouldLabel() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "sla-execution-condition", null, (f, e) -> Map.of("string", "LABEL"));
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertThat(execution.getLabels()).contains(new Label("sla", "violated"));

View File

@@ -3,31 +3,54 @@ package io.kestra.core.runners;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.State;
import io.kestra.core.models.flows.State.Type;
import io.kestra.core.queues.QueueException;
import io.kestra.core.queues.QueueFactoryInterface;
import io.kestra.core.queues.QueueInterface;
import io.kestra.core.repositories.FlowRepositoryInterface;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import jakarta.inject.Singleton;
import reactor.core.publisher.Flux;
import java.time.ZonedDateTime;
import java.util.Optional;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertTrue;
@Singleton
public class ScheduleDateCaseTest {
@Inject
private FlowRepositoryInterface flowRepository;
@Inject
private TestRunnerUtils runnerUtils;
@Named(QueueFactoryInterface.EXECUTION_NAMED)
protected QueueInterface<Execution> executionQueue;
public void shouldScheduleOnDate(String tenantId) throws QueueException {
public void shouldScheduleOnDate() throws QueueException, InterruptedException {
ZonedDateTime scheduleOn = ZonedDateTime.now().plusSeconds(1);
Flow flow = flowRepository.findById(tenantId, "io.kestra.tests", "minimal").orElseThrow();
Flow flow = flowRepository.findById(MAIN_TENANT, "io.kestra.tests", "minimal").orElseThrow();
Execution execution = Execution.newExecution(flow, null, null, Optional.of(scheduleOn));
assertThat(execution.getScheduleDate()).isEqualTo(scheduleOn.toInstant());
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.CREATED);
this.executionQueue.emit(execution);
runnerUtils.emitAndAwaitExecution(e -> e.getState().getCurrent().equals(Type.SUCCESS), execution);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.CREATED);
assertThat(execution.getScheduleDate()).isEqualTo(scheduleOn.toInstant());
CountDownLatch latch1 = new CountDownLatch(1);
Flux<Execution> receive = TestsUtils.receive(executionQueue, e -> {
if (e.getLeft().getId().equals(execution.getId())) {
if (e.getLeft().getState().getCurrent() == State.Type.SUCCESS) {
latch1.countDown();
}
}
});
assertTrue(latch1.await(1, TimeUnit.MINUTES));
receive.blockLast();
}
}
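Note: the two variants of shouldScheduleOnDate above differ only in how they wait for the scheduled execution — a single emitAndAwaitExecution(predicate, execution) call versus an explicit queue emit plus latch. The one-call form, condensed from the hunk (the name and the surrounding change suggest it emits the CREATED execution and blocks until the predicate matches):
ZonedDateTime scheduleOn = ZonedDateTime.now().plusSeconds(1);
Flow flow = flowRepository.findById(MAIN_TENANT, "io.kestra.tests", "minimal").orElseThrow();
Execution execution = Execution.newExecution(flow, null, null, Optional.of(scheduleOn));
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.CREATED);
assertThat(execution.getScheduleDate()).isEqualTo(scheduleOn.toInstant());
runnerUtils.emitAndAwaitExecution(e -> e.getState().getCurrent().equals(State.Type.SUCCESS), execution);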

View File

@@ -32,7 +32,7 @@ public class SkipExecutionCaseTest {
protected QueueInterface<Execution> executionQueue;
@Inject
protected TestRunnerUtils runnerUtils;
protected RunnerUtils runnerUtils;
@Inject
private ExecutionRepositoryInterface executionRepository;

View File

@@ -30,7 +30,7 @@ public class TaskCacheTest {
static final AtomicInteger COUNTER = new AtomicInteger(0);
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
@BeforeEach
void resetCounter() {

View File

@@ -33,7 +33,7 @@ public class TaskWithAllowFailureTest {
private FlowInputOutput flowIO;
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
@Test
@ExecuteFlow("flows/valids/task-allow-failure-runnable.yml")
@@ -44,10 +44,10 @@ public class TaskWithAllowFailureTest {
}
@Test
@LoadFlows(value = {"flows/valids/task-allow-failure-executable-flow.yml",
"flows/valids/for-each-item-subflow-failed.yaml"}, tenantId = "tenant1")
@LoadFlows({"flows/valids/task-allow-failure-executable-flow.yml",
"flows/valids/for-each-item-subflow-failed.yaml"})
void executableTask_Flow() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne("tenant1", "io.kestra.tests", "task-allow-failure-executable-flow");
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "task-allow-failure-executable-flow");
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.WARNING);
assertThat(execution.getTaskRunList()).hasSize(2);
}

View File

@@ -9,7 +9,6 @@ import io.kestra.core.queues.QueueException;
import io.kestra.core.storages.StorageInterface;
import jakarta.inject.Inject;
import org.apache.commons.lang3.StringUtils;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
import java.io.File;
@@ -35,7 +34,7 @@ public class TaskWithAllowWarningTest {
private FlowInputOutput flowIO;
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
@Test
@ExecuteFlow("flows/valids/task-allow-warning-runnable.yml")
@@ -55,7 +54,6 @@ public class TaskWithAllowWarningTest {
}
@Test
@Disabled("This test does not test failing in subflow foreach as the subflow is not called, needs to be rework before reactivation")
@LoadFlows({"flows/valids/task-allow-warning-executable-foreachitem.yml"})
void executableTask_ForEachItem() throws TimeoutException, QueueException, URISyntaxException, IOException {
URI file = storageUpload();

View File

@@ -18,7 +18,6 @@ import org.assertj.core.api.AbstractObjectAssert;
import org.assertj.core.api.ObjectAssert;
import org.junit.jupiter.api.Test;
import java.time.Duration;
import java.time.ZonedDateTime;
import java.time.temporal.ChronoUnit;
import java.util.List;
@@ -38,7 +37,7 @@ class TestSuiteTest {
protected QueueInterface<Execution> executionQueue;
@Inject
protected TestRunnerUtils runnerUtils;
protected RunnerUtils runnerUtils;
@Inject
protected FlowRepositoryInterface flowRepository;
@@ -51,7 +50,7 @@ class TestSuiteTest {
void withoutAnyTaskFixture() throws QueueException, TimeoutException {
var fixtures = List.<TaskFixture>of();
var executionResult = runReturnFlow(fixtures, MAIN_TENANT);
var executionResult = runReturnFlow(fixtures);
assertThat(executionResult.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertOutputForTask(executionResult, "task-id")
@@ -66,7 +65,7 @@ class TestSuiteTest {
}
@Test
@LoadFlows(value = {"flows/valids/return.yaml"}, tenantId = "tenant1")
@LoadFlows({"flows/valids/return.yaml"})
void taskFixture() throws TimeoutException, QueueException {
var fixtures = List.of(
TaskFixture.builder()
@@ -74,7 +73,7 @@ class TestSuiteTest {
.build()
);
var executionResult = runReturnFlow(fixtures, "tenant1");
var executionResult = runReturnFlow(fixtures);
assertThat(executionResult.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertOutputForTask(executionResult, "task-id")
@@ -86,7 +85,7 @@ class TestSuiteTest {
}
@Test
@LoadFlows(value = {"flows/valids/return.yaml"}, tenantId = "tenant2")
@LoadFlows({"flows/valids/return.yaml"})
void twoTaskFixturesOverridingOutput() throws QueueException, TimeoutException {
var fixtures = List.of(
TaskFixture.builder()
@@ -99,7 +98,7 @@ class TestSuiteTest {
.build()
);
var executionResult = runReturnFlow(fixtures, "tenant2");
var executionResult = runReturnFlow(fixtures);
assertThat(executionResult.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
assertOutputForTask(executionResult, "task-id")
@@ -111,7 +110,7 @@ class TestSuiteTest {
}
@Test
@LoadFlows(value = {"flows/valids/return.yaml"}, tenantId = "tenant3")
@LoadFlows({"flows/valids/return.yaml"})
void taskFixturesWithWarningState() throws QueueException, TimeoutException {
var fixtures = List.of(
TaskFixture.builder()
@@ -120,7 +119,7 @@ class TestSuiteTest {
.build()
);
var executionResult = runReturnFlow(fixtures, "tenant3");
var executionResult = runReturnFlow(fixtures);
assertThat(executionResult.getState().getCurrent()).isEqualTo(State.Type.WARNING);
assertTask(executionResult, "task-id")
@@ -134,8 +133,8 @@ class TestSuiteTest {
.isEqualTo(State.Type.WARNING);
}
private Execution runReturnFlow(List<TaskFixture> fixtures, String tenantId) throws TimeoutException, QueueException {
var flow = flowRepository.findById(tenantId, "io.kestra.tests", "return", Optional.empty()).orElseThrow();
private Execution runReturnFlow(List<TaskFixture> fixtures) throws TimeoutException, QueueException {
var flow = flowRepository.findById(MAIN_TENANT, "io.kestra.tests", "return", Optional.empty()).orElseThrow();
var execution = Execution.builder()
.id(IdUtils.create())
@@ -147,7 +146,7 @@ class TestSuiteTest {
.state(new State())
.build();
return runnerUtils.runOne(execution, flow, Duration.ofSeconds(10));
return runnerUtils.runOne(execution, flow, null);
}
private static AbstractObjectAssert<?, Object> assertOutputForTask(Execution executionResult, String taskId) {

View File

@@ -14,13 +14,10 @@ import java.util.Date;
import java.util.Map;
import jakarta.inject.Inject;
import org.junit.jupiter.api.TestInstance;
import org.junit.jupiter.api.TestInstance.Lifecycle;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
@TestInstance(Lifecycle.PER_CLASS)
class DateFilterTest {
public static final ZonedDateTime NOW = ZonedDateTime.parse("2013-09-08T16:19:12.123456+01");
@@ -147,7 +144,7 @@ class DateFilterTest {
)
);
assertThat(render).isEqualTo("1378653552123456");
assertThat(render).isEqualTo("1378653552000123456");
}
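Note: the only functional change in this hunk is the expected timestamp literal. For reference, with NOW = 2013-09-08T16:19:12.123456+01 (epoch second 1378653552) the two candidate values decompose as below; this only spells out the arithmetic and does not assert which rendering the filter is meant to produce:
long seconds = 1_378_653_552L;                                    // epoch second of NOW
long micros  = seconds * 1_000_000L     + 123_456L;               // 1378653552123456
long nanos   = seconds * 1_000_000_000L + 123_456_000L;           // 1378653552123456000
long mixed   = seconds * 1_000_000_000L + 123_456L;               // 1378653552000123456 (fractional digits kept as-is)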
@Test

View File

@@ -9,8 +9,6 @@ import io.micronaut.context.annotation.Property;
import jakarta.inject.Inject;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import org.slf4j.event.Level;
import java.time.Instant;
@@ -20,7 +18,6 @@ import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
@Property(name = "kestra.server-type", value = "WORKER")
@Execution(ExecutionMode.SAME_THREAD)
class ErrorLogsFunctionTest {
@Inject
private LogRepositoryInterface logRepository;

View File

@@ -17,14 +17,11 @@ import java.io.IOException;
import java.net.URI;
import java.nio.file.Files;
import java.util.Map;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.*;
@Execution(ExecutionMode.SAME_THREAD)
@KestraTest(rebuildContext = true)
class FileExistsFunctionTest {
@@ -194,7 +191,7 @@ class FileExistsFunctionTest {
}
private URI createFile() throws IOException {
File tempFile = File.createTempFile("%s-file".formatted(IdUtils.create()), ".txt");
File tempFile = File.createTempFile("file", ".txt");
Files.write(tempFile.toPath(), "Hello World".getBytes());
return tempFile.toPath().toUri();
}

View File

@@ -17,8 +17,6 @@ import java.io.IOException;
import java.net.URI;
import java.nio.file.Files;
import java.util.Map;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
@@ -26,7 +24,6 @@ import static org.hibernate.validator.internal.util.Contracts.assertTrue;
import static org.junit.jupiter.api.Assertions.assertThrows;
@KestraTest(rebuildContext = true)
@Execution(ExecutionMode.SAME_THREAD)
public class FileSizeFunctionTest {
private static final String NAMESPACE = "my.namespace";
@@ -278,14 +275,14 @@ public class FileSizeFunctionTest {
private URI createNsFile(boolean nsInAuthority) throws IOException {
String namespace = "io.kestra.tests";
String filePath = "%sfile.txt".formatted(IdUtils.create());
String filePath = "file.txt";
storageInterface.createDirectory(MAIN_TENANT, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace)));
storageInterface.put(MAIN_TENANT, namespace, URI.create(StorageContext.namespaceFilePrefix(namespace) + "/" + filePath), new ByteArrayInputStream("Hello World".getBytes()));
return URI.create("nsfile://" + (nsInAuthority ? namespace : "") + "/" + filePath);
}
private URI createFile() throws IOException {
File tempFile = File.createTempFile("%sfile".formatted(IdUtils.create()), ".txt");
File tempFile = File.createTempFile("file", ".txt");
Files.write(tempFile.toPath(), "Hello World".getBytes());
return tempFile.toPath().toUri();
}
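Note: one side of this hunk randomizes the namespace-file and temp-file names with IdUtils while the other uses fixed names; with a fixed name, every test in the class reads and writes the same object under MAIN_TENANT. The randomized variant, assembled from the lines shown above:
private URI createNsFile(boolean nsInAuthority) throws IOException {
    String namespace = "io.kestra.tests";
    String filePath = "%sfile.txt".formatted(IdUtils.create());   // unique per test run
    storageInterface.createDirectory(MAIN_TENANT, namespace,
        URI.create(StorageContext.namespaceFilePrefix(namespace)));
    storageInterface.put(MAIN_TENANT, namespace,
        URI.create(StorageContext.namespaceFilePrefix(namespace) + "/" + filePath),
        new ByteArrayInputStream("Hello World".getBytes()));
    return URI.create("nsfile://" + (nsInAuthority ? namespace : "") + "/" + filePath);
}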

View File

@@ -14,14 +14,10 @@ public class FunctionTestUtils {
}
public static Map<String, Object> getVariables(String namespace) {
return getVariables(MAIN_TENANT, namespace);
}
public static Map<String, Object> getVariables(String tenantId, String namespace) {
return Map.of(
"flow", Map.of(
"id", "kv",
"tenantId", tenantId,
"tenantId", MAIN_TENANT,
"namespace", namespace)
);
}

View File

@@ -8,7 +8,6 @@ import io.kestra.core.exceptions.IllegalVariableEvaluationException;
import io.kestra.core.runners.VariableRenderer;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.serializers.JacksonMapper;
import io.pebbletemplates.pebble.error.PebbleException;
import jakarta.inject.Inject;
import org.apache.hc.client5.http.utils.Base64;
import org.junit.jupiter.api.Assertions;
@@ -17,17 +16,12 @@ import org.junit.jupiter.api.Test;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.Map;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static com.github.tomakehurst.wiremock.client.WireMock.*;
import static com.github.tomakehurst.wiremock.client.WireMock.aResponse;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.Assert.assertThrows;
@KestraTest
@WireMockTest(httpPort = 28182)
@Execution(ExecutionMode.SAME_THREAD)
class HttpFunctionTest {
@Inject
private VariableRenderer variableRenderer;
@@ -51,13 +45,6 @@ class HttpFunctionTest {
Assertions.assertTrue(rendered.contains("\"todo\":\"New todo\""));
}
@Test
void wrongMethod() {
var exception = assertThrows(IllegalVariableEvaluationException.class, () -> variableRenderer.render("{{ http(url) }}", Map.of("url", "https://dummyjson.com/todos/add")));
assertThat(exception.getCause()).isInstanceOf(PebbleException.class);
assertThat(exception.getCause().getMessage()).isEqualTo("Failed to execute HTTP Request, server respond with status 404 : Not Found ({{ http(url) }}:1)");
}
@Test
void getWithQueryHttpCall() throws IllegalVariableEvaluationException, JsonProcessingException {
String rendered = variableRenderer.render("""

View File

@@ -17,15 +17,12 @@ import java.io.IOException;
import java.net.URI;
import java.nio.file.Files;
import java.util.Map;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.*;
@KestraTest(rebuildContext = true)
@Execution(ExecutionMode.SAME_THREAD)
class IsFileEmptyFunctionTest {
private static final String NAMESPACE = "my.namespace";

View File

@@ -12,7 +12,6 @@ import io.kestra.core.storages.StorageInterface;
import io.kestra.core.storages.kv.InternalKVStore;
import io.kestra.core.storages.kv.KVStore;
import io.kestra.core.storages.kv.KVValueAndMetadata;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import java.io.IOException;
import java.net.URI;
@@ -30,14 +29,18 @@ public class KvFunctionTest {
@Inject
VariableRenderer variableRenderer;
@BeforeEach
void reset() throws IOException {
storageInterface.deleteByPrefix(MAIN_TENANT, null, URI.create(StorageContext.kvPrefix("io.kestra.tests")));
}
@Test
void shouldGetValueFromKVGivenExistingKey() throws IllegalVariableEvaluationException, IOException {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
KVStore kv = new InternalKVStore(tenant, "io.kestra.tests", storageInterface);
KVStore kv = new InternalKVStore(MAIN_TENANT, "io.kestra.tests", storageInterface);
kv.put("my-key", new KVValueAndMetadata(null, Map.of("field", "value")));
Map<String, Object> variables = getVariables(tenant, "io.kestra.tests");
Map<String, Object> variables = getVariables("io.kestra.tests");
// When
String rendered = variableRenderer.render("{{ kv('my-key') }}", variables);
@@ -49,14 +52,13 @@ public class KvFunctionTest {
@Test
void shouldGetValueFromKVGivenExistingKeyWithInheritance() throws IllegalVariableEvaluationException, IOException {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
KVStore kv = new InternalKVStore(tenant, "my.company", storageInterface);
KVStore kv = new InternalKVStore(MAIN_TENANT, "my.company", storageInterface);
kv.put("my-key", new KVValueAndMetadata(null, Map.of("field", "value")));
KVStore firstKv = new InternalKVStore(tenant, "my", storageInterface);
KVStore firstKv = new InternalKVStore(MAIN_TENANT, "my", storageInterface);
firstKv.put("my-key", new KVValueAndMetadata(null, Map.of("field", "firstValue")));
Map<String, Object> variables = getVariables(tenant, "my.company.team");
Map<String, Object> variables = getVariables("my.company.team");
// When
String rendered = variableRenderer.render("{{ kv('my-key') }}", variables);
@@ -68,11 +70,10 @@ public class KvFunctionTest {
@Test
void shouldNotGetValueFromKVWithGivenNamespaceAndInheritance() throws IOException {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
KVStore kv = new InternalKVStore(tenant, "kv", storageInterface);
KVStore kv = new InternalKVStore(MAIN_TENANT, "kv", storageInterface);
kv.put("my-key", new KVValueAndMetadata(null, Map.of("field", "value")));
Map<String, Object> variables = getVariables(tenant, "my.company.team");
Map<String, Object> variables = getVariables("my.company.team");
// When
Assertions.assertThrows(IllegalVariableEvaluationException.class, () ->
@@ -82,11 +83,10 @@ public class KvFunctionTest {
@Test
void shouldGetValueFromKVGivenExistingAndNamespace() throws IllegalVariableEvaluationException, IOException {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
KVStore kv = new InternalKVStore(tenant, "kv", storageInterface);
KVStore kv = new InternalKVStore(MAIN_TENANT, "kv", storageInterface);
kv.put("my-key", new KVValueAndMetadata(null, Map.of("field", "value")));
Map<String, Object> variables = getVariables(tenant, "io.kestra.tests");
Map<String, Object> variables = getVariables("io.kestra.tests");
// When
String rendered = variableRenderer.render("{{ kv('my-key', namespace='kv') }}", variables);
@@ -98,8 +98,7 @@ public class KvFunctionTest {
@Test
void shouldGetEmptyGivenNonExistingKeyAndErrorOnMissingFalse() throws IllegalVariableEvaluationException {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Map<String, Object> variables = getVariables(tenant, "io.kestra.tests");
Map<String, Object> variables = getVariables("io.kestra.tests");
// When
String rendered = variableRenderer.render("{{ kv('my-key', errorOnMissing=false) }}", variables);
@@ -111,8 +110,7 @@ public class KvFunctionTest {
@Test
void shouldFailGivenNonExistingKeyAndErrorOnMissingTrue() {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Map<String, Object> variables = getVariables(tenant, "io.kestra.tests");
Map<String, Object> variables = getVariables("io.kestra.tests");
// When
IllegalVariableEvaluationException exception = Assertions.assertThrows(IllegalVariableEvaluationException.class, () -> {
@@ -126,8 +124,7 @@ public class KvFunctionTest {
@Test
void shouldFailGivenNonExistingKeyUsingDefaults() {
// Given
String tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
Map<String, Object> variables = getVariables(tenant, "io.kestra.tests");
Map<String, Object> variables = getVariables("io.kestra.tests");
// When
IllegalVariableEvaluationException exception = Assertions.assertThrows(IllegalVariableEvaluationException.class, () -> {
variableRenderer.render("{{ kv('my-key') }}", variables);

View File

@@ -1,52 +0,0 @@
package io.kestra.core.runners.pebble.functions;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.runners.VariableRenderer;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;
import java.util.Collections;
import java.util.Map;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
public class NanoIDFuntionTest {
@Inject
VariableRenderer variableRenderer;
@Test
void checkStandardNanoId() throws Exception {
String rendered =
variableRenderer.render(
"{{ nanoId() }}", Collections.emptyMap());
assertThat(!rendered.isEmpty()).as(rendered).isTrue();
assertThat(rendered.length()).isEqualTo(21L);
}
@Test
void checkDifferentLength() throws Exception {
String rendered =
variableRenderer.render(
"{{ nanoId(length) }}", Map.of("length", 8L));
assertThat(!rendered.isEmpty()).as(rendered).isTrue();
assertThat(rendered.length()).isEqualTo(8L);
}
@Test
void checkDifferentAlphabet() throws Exception {
String rendered =
variableRenderer.render(
"{{ nanoId(length,alphabet) }}", Map.of("length", 21L, "alphabet", ":;<=>?@"));
assertThat(!rendered.isEmpty()).as(rendered).isTrue();
assertThat(rendered.length()).isEqualTo(21L);
for (char c : rendered.toCharArray()) {
assertThat(c).isGreaterThanOrEqualTo(':');
assertThat(c).isLessThanOrEqualTo('@');
}
}
}

View File

@@ -19,8 +19,6 @@ import java.io.IOException;
import java.net.URI;
import java.nio.file.Files;
import java.util.Map;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import static io.kestra.core.runners.pebble.functions.FunctionTestUtils.NAMESPACE;
import static io.kestra.core.runners.pebble.functions.FunctionTestUtils.getVariables;
@@ -31,7 +29,6 @@ import static org.junit.jupiter.api.Assertions.assertThrows;
@KestraTest(rebuildContext = true)
@Property(name="kestra.server-type", value="WORKER")
@Execution(ExecutionMode.SAME_THREAD)
class ReadFileFunctionTest {
@Inject
VariableRenderer variableRenderer;

View File

@@ -12,7 +12,7 @@ import io.kestra.core.models.flows.State;
import io.kestra.core.queues.QueueException;
import io.kestra.core.queues.QueueFactoryInterface;
import io.kestra.core.queues.QueueInterface;
import io.kestra.core.runners.TestRunnerUtils;
import io.kestra.core.runners.RunnerUtils;
import io.kestra.core.runners.VariableRenderer;
import io.kestra.core.utils.TestsUtils;
import io.micronaut.test.annotation.MockBean;
@@ -41,7 +41,7 @@ public class SecretFunctionTest {
QueueInterface<LogEntry> logQueue;
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
@Inject
private SecretService secretService;

View File

@@ -1,7 +1,6 @@
package io.kestra.core.server;
import io.kestra.core.contexts.KestraContext;
import io.kestra.core.models.ServerType;
import io.kestra.core.utils.IdUtils;
import io.kestra.core.utils.Network;
import org.junit.jupiter.api.Assertions;
@@ -26,7 +25,6 @@ import java.util.Set;
import static io.kestra.core.server.ServiceStateTransition.Result.ABORTED;
import static io.kestra.core.server.ServiceStateTransition.Result.FAILED;
import static io.kestra.core.server.ServiceStateTransition.Result.SUCCEEDED;
import static org.mockito.Mockito.when;
@ExtendWith({MockitoExtension.class})
@MockitoSettings(strictness = Strictness.LENIENT)
@@ -61,8 +59,6 @@ public class ServiceLivenessManagerTest {
);
KestraContext context = Mockito.mock(KestraContext.class);
KestraContext.setContext(context);
when(context.getServerType()).thenReturn(ServerType.INDEXER);
this.serviceLivenessManager = new ServiceLivenessManager(
config,
new ServiceRegistry(),
@@ -104,7 +100,8 @@ public class ServiceLivenessManagerTest {
);
// mock the state transition result
when(serviceLivenessUpdater.update(Mockito.any(ServiceInstance.class), Mockito.any(Service.ServiceState.class)))
Mockito
.when(serviceLivenessUpdater.update(Mockito.any(ServiceInstance.class), Mockito.any(Service.ServiceState.class)))
.thenReturn(response);
// When
@@ -130,7 +127,8 @@ public class ServiceLivenessManagerTest {
);
// mock the state transition result
when(serviceLivenessUpdater.update(Mockito.any(ServiceInstance.class), Mockito.any(Service.ServiceState.class)))
Mockito
.when(serviceLivenessUpdater.update(Mockito.any(ServiceInstance.class), Mockito.any(Service.ServiceState.class)))
.thenReturn(response);
// When
@@ -149,7 +147,8 @@ public class ServiceLivenessManagerTest {
serviceLivenessManager.updateServiceInstance(running, serviceInstanceFor(running));
// mock the state transition result
when(serviceLivenessUpdater.update(Mockito.any(ServiceInstance.class), Mockito.any(Service.ServiceState.class)))
Mockito
.when(serviceLivenessUpdater.update(Mockito.any(ServiceInstance.class), Mockito.any(Service.ServiceState.class)))
.thenReturn(new ServiceStateTransition.Response(ABORTED));
// When
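Note: the remaining churn in this file is purely syntactic — one side stubs with the statically imported when (import static org.mockito.Mockito.when), the other with the fully qualified Mockito.when. Both produce the same stub; a minimal side-by-side, assuming the mocks declared earlier in the class:
// statically imported form
when(serviceLivenessUpdater.update(Mockito.any(ServiceInstance.class), Mockito.any(Service.ServiceState.class)))
    .thenReturn(response);
// fully qualified form
Mockito
    .when(serviceLivenessUpdater.update(Mockito.any(ServiceInstance.class), Mockito.any(Service.ServiceState.class)))
    .thenReturn(response);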

View File

@@ -10,10 +10,6 @@ import io.kestra.core.runners.RunContextFactory;
import io.kestra.plugin.core.trigger.Schedule;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;
import io.kestra.plugin.core.execution.Labels;
import io.kestra.core.models.executions.Execution;
import static org.assertj.core.api.Assertions.assertThatThrownBy;
import java.util.Collections;
import java.util.List;
@@ -83,24 +79,4 @@ class LabelServiceTest {
assertTrue(LabelService.containsAll(List.of(new Label("key1", "value1")), List.of(new Label("key1", "value1"))));
assertTrue(LabelService.containsAll(List.of(new Label("key1", "value1"), new Label("key2", "value2")), List.of(new Label("key1", "value1"))));
}
@Test
void shouldThrowExceptionOnEmptyLabelValueInLabelsTask() throws Exception {
Labels task = Labels.builder()
.id("test")
.type(Labels.class.getName())
.labels(Map.of("invalidLabel", "")) // empty value
.build();
RunContext runContext = runContextFactory.of();
Execution execution = Execution.builder()
.id("execId")
.namespace("test.ns")
.build();
assertThatThrownBy(() -> task.update(execution, runContext))
.isInstanceOf(IllegalArgumentException.class)
.hasMessageContaining("Label values cannot be empty");
}
}

View File

@@ -1,79 +0,0 @@
package io.kestra.core.services;
import com.google.common.collect.ImmutableMap;
import io.kestra.core.exceptions.FlowProcessingException;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.PluginDefault;
import io.kestra.core.services.PluginDefaultServiceTest.DefaultPrecedenceTester;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import lombok.extern.slf4j.Slf4j;
import org.junit.jupiter.api.parallel.ExecutionMode;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.Arguments;
import org.junit.jupiter.params.provider.MethodSource;
import java.util.Collections;
import java.util.List;
import java.util.stream.Stream;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.is;
@Slf4j
@KestraTest
class PluginDefaultServiceOverrideTest {
@Inject
private PluginDefaultService pluginDefaultService;
@org.junit.jupiter.api.parallel.Execution(ExecutionMode.SAME_THREAD)
@ParameterizedTest
@MethodSource
void flowDefaultsOverrideGlobalDefaults(boolean flowDefaultForced, boolean globalDefaultForced, String fooValue, String barValue, String bazValue) throws FlowProcessingException {
final DefaultPrecedenceTester task = DefaultPrecedenceTester.builder()
.id("test")
.type(DefaultPrecedenceTester.class.getName())
.propBaz("taskValue")
.build();
final PluginDefault flowDefault = new PluginDefault(DefaultPrecedenceTester.class.getName(), flowDefaultForced, ImmutableMap.of(
"propBar", "flowValue",
"propBaz", "flowValue"
));
final PluginDefault globalDefault = new PluginDefault(DefaultPrecedenceTester.class.getName(), globalDefaultForced, ImmutableMap.of(
"propFoo", "globalValue",
"propBar", "globalValue",
"propBaz", "globalValue"
));
var tenant = TestsUtils.randomTenant(PluginDefaultServiceOverrideTest.class.getSimpleName());
final Flow flowWithPluginDefault = Flow.builder()
.tenantId(tenant)
.tasks(Collections.singletonList(task))
.pluginDefaults(List.of(flowDefault))
.build();
final PluginGlobalDefaultConfiguration pluginGlobalDefaultConfiguration = new PluginGlobalDefaultConfiguration();
pluginGlobalDefaultConfiguration.defaults = List.of(globalDefault);
var previousGlobalDefault = pluginDefaultService.pluginGlobalDefault;
pluginDefaultService.pluginGlobalDefault = pluginGlobalDefaultConfiguration;
final Flow injected = pluginDefaultService.injectAllDefaults(flowWithPluginDefault, true);
pluginDefaultService.pluginGlobalDefault = previousGlobalDefault;
assertThat(((DefaultPrecedenceTester) injected.getTasks().getFirst()).getPropFoo(), is(fooValue));
assertThat(((DefaultPrecedenceTester) injected.getTasks().getFirst()).getPropBar(), is(barValue));
assertThat(((DefaultPrecedenceTester) injected.getTasks().getFirst()).getPropBaz(), is(bazValue));
}
private static Stream<Arguments> flowDefaultsOverrideGlobalDefaults() {
return Stream.of(
Arguments.of(false, false, "globalValue", "flowValue", "taskValue"),
Arguments.of(false, true, "globalValue", "globalValue", "globalValue"),
Arguments.of(true, false, "globalValue", "flowValue", "flowValue"),
Arguments.of(true, true, "globalValue", "flowValue", "flowValue")
);
}
}

View File

@@ -1,11 +1,12 @@
package io.kestra.core.services;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.google.common.collect.ImmutableMap;
import io.kestra.core.exceptions.FlowProcessingException;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.annotations.Plugin;
import io.kestra.core.models.conditions.ConditionContext;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.FlowInterface;
import io.kestra.core.models.flows.FlowWithSource;
import io.kestra.core.models.flows.GenericFlow;
@@ -18,7 +19,6 @@ import io.kestra.core.models.triggers.PollingTriggerInterface;
import io.kestra.core.models.triggers.TriggerContext;
import io.kestra.core.models.triggers.TriggerOutput;
import io.kestra.core.runners.RunContext;
import io.kestra.core.utils.TestsUtils;
import io.kestra.plugin.core.condition.Expression;
import io.kestra.plugin.core.log.Log;
import io.kestra.plugin.core.trigger.Schedule;
@@ -31,13 +31,19 @@ import lombok.ToString;
import lombok.experimental.SuperBuilder;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.Arguments;
import org.junit.jupiter.params.provider.MethodSource;
import org.slf4j.event.Level;
import java.time.Duration;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.stream.Stream;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.containsInAnyOrder;
import static org.hamcrest.Matchers.is;
@@ -65,8 +71,7 @@ class PluginDefaultServiceTest {
@Test
void shouldInjectGivenFlowWithNullSource() throws FlowProcessingException {
// Given
var tenant = TestsUtils.randomTenant(PluginDefaultServiceTest.class.getSimpleName());
FlowInterface flow = GenericFlow.fromYaml(tenant, TEST_LOG_FLOW_SOURCE);
FlowInterface flow = GenericFlow.fromYaml(MAIN_TENANT, TEST_LOG_FLOW_SOURCE);
// When
FlowWithSource result = pluginDefaultService.injectAllDefaults(flow, true);
@@ -126,8 +131,55 @@ class PluginDefaultServiceTest {
), result);
}
@ParameterizedTest
@MethodSource
void flowDefaultsOverrideGlobalDefaults(boolean flowDefaultForced, boolean globalDefaultForced, String fooValue, String barValue, String bazValue) throws FlowProcessingException {
final DefaultPrecedenceTester task = DefaultPrecedenceTester.builder()
.id("test")
.type(DefaultPrecedenceTester.class.getName())
.propBaz("taskValue")
.build();
final PluginDefault flowDefault = new PluginDefault(DefaultPrecedenceTester.class.getName(), flowDefaultForced, ImmutableMap.of(
"propBar", "flowValue",
"propBaz", "flowValue"
));
final PluginDefault globalDefault = new PluginDefault(DefaultPrecedenceTester.class.getName(), globalDefaultForced, ImmutableMap.of(
"propFoo", "globalValue",
"propBar", "globalValue",
"propBaz", "globalValue"
));
final Flow flowWithPluginDefault = Flow.builder()
.tasks(Collections.singletonList(task))
.pluginDefaults(List.of(flowDefault))
.build();
final PluginGlobalDefaultConfiguration pluginGlobalDefaultConfiguration = new PluginGlobalDefaultConfiguration();
pluginGlobalDefaultConfiguration.defaults = List.of(globalDefault);
var previousGlobalDefault = pluginDefaultService.pluginGlobalDefault;
pluginDefaultService.pluginGlobalDefault = pluginGlobalDefaultConfiguration;
final Flow injected = pluginDefaultService.injectAllDefaults(flowWithPluginDefault, true);
pluginDefaultService.pluginGlobalDefault = previousGlobalDefault;
assertThat(((DefaultPrecedenceTester) injected.getTasks().getFirst()).getPropFoo(), is(fooValue));
assertThat(((DefaultPrecedenceTester) injected.getTasks().getFirst()).getPropBar(), is(barValue));
assertThat(((DefaultPrecedenceTester) injected.getTasks().getFirst()).getPropBaz(), is(bazValue));
}
private static Stream<Arguments> flowDefaultsOverrideGlobalDefaults() {
return Stream.of(
Arguments.of(false, false, "globalValue", "flowValue", "taskValue"),
Arguments.of(false, true, "globalValue", "globalValue", "globalValue"),
Arguments.of(true, false, "globalValue", "flowValue", "flowValue"),
Arguments.of(true, true, "globalValue", "flowValue", "flowValue")
);
}
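    // Precedence inferred from the Arguments above (reading aid only):
    //   1. Nothing forced: explicit task value > flow default > global default.
    //   2. A forced flow default overrides the explicit task value.
    //   3. A forced global default overrides task values and non-forced flow defaults.
    //   4. When both are forced, the flow default still wins over the global default.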
@Test
public void injectFlowAndGlobals() throws FlowProcessingException, JsonProcessingException {
public void injectFlowAndGlobals() throws FlowProcessingException {
String source = String.format("""
id: default-test
namespace: io.kestra.tests
@@ -163,8 +215,8 @@ class PluginDefaultServiceTest {
DefaultTriggerTester.class.getName(),
Expression.class.getName()
);
var tenant = TestsUtils.randomTenant(PluginDefaultServiceTest.class.getSimpleName());
FlowWithSource injected = pluginDefaultService.parseFlowWithAllDefaults(tenant, source, false);
FlowWithSource injected = pluginDefaultService.parseFlowWithAllDefaults(null, source, false);
assertThat(((DefaultTester) injected.getTasks().getFirst()).getValue(), is(1));
assertThat(((DefaultTester) injected.getTasks().getFirst()).getSet(), is(666));
@@ -209,8 +261,7 @@ class PluginDefaultServiceTest {
""";
// When
var tenant = TestsUtils.randomTenant(PluginDefaultServiceTest.class.getSimpleName());
FlowWithSource injected = pluginDefaultService.parseFlowWithAllDefaults(tenant, source, false);
FlowWithSource injected = pluginDefaultService.parseFlowWithAllDefaults(null, source, false);
// Then
assertThat(((DefaultTester) injected.getTasks().getFirst()).getSet(), is(2));
@@ -248,8 +299,7 @@ class PluginDefaultServiceTest {
""";
// When
var tenant = TestsUtils.randomTenant(PluginDefaultServiceTest.class.getSimpleName());
FlowWithSource injected = pluginDefaultService.parseFlowWithAllDefaults(tenant, source, false);
FlowWithSource injected = pluginDefaultService.parseFlowWithAllDefaults(null, source, false);
// Then
assertThat(((DefaultTester) injected.getTasks().getFirst()).getSet(), is(666));
@@ -259,8 +309,7 @@ class PluginDefaultServiceTest {
@Test
void shouldInjectFlowDefaultsGivenAlias() throws FlowProcessingException {
// Given
var tenant = TestsUtils.randomTenant(PluginDefaultServiceTest.class.getSimpleName());
GenericFlow flow = GenericFlow.fromYaml(tenant, """
GenericFlow flow = GenericFlow.fromYaml(MAIN_TENANT, """
id: default-test
namespace: io.kestra.tests
@@ -284,8 +333,7 @@ class PluginDefaultServiceTest {
@Test
void shouldInjectFlowDefaultsGivenType() throws FlowProcessingException {
var tenant = TestsUtils.randomTenant(PluginDefaultServiceTest.class.getSimpleName());
GenericFlow flow = GenericFlow.fromYaml(tenant, """
GenericFlow flow = GenericFlow.fromYaml(MAIN_TENANT, """
id: default-test
namespace: io.kestra.tests
@@ -308,8 +356,7 @@ class PluginDefaultServiceTest {
@Test
public void shouldNotInjectDefaultsGivenExistingTaskValue() throws FlowProcessingException {
// Given
var tenant = TestsUtils.randomTenant(PluginDefaultServiceTest.class.getSimpleName());
GenericFlow flow = GenericFlow.fromYaml(tenant, """
GenericFlow flow = GenericFlow.fromYaml(MAIN_TENANT, """
id: default-test
namespace: io.kestra.tests

View File

@@ -5,7 +5,6 @@ import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.executions.TaskRun;
import io.kestra.core.models.flows.State;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
import static org.assertj.core.api.Assertions.assertThat;
@@ -33,7 +32,6 @@ class SanityCheckTest {
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
}
@Disabled
@Test
@ExecuteFlow("sanity-checks/kv.yaml")
void qaKv(Execution execution) {
@@ -113,11 +111,4 @@ class SanityCheckTest {
assertThat(execution.getTaskRunList()).hasSize(6);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
}
@Test
@ExecuteFlow("sanity-checks/output_values.yaml")
void qaOutputValues(Execution execution) {
assertThat(execution.getTaskRunList()).hasSize(2);
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.SUCCESS);
}
}

View File

@@ -17,8 +17,6 @@ import jakarta.inject.Named;
import java.nio.file.Path;
import java.util.stream.Collectors;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.parallel.Execution;
import org.junit.jupiter.api.parallel.ExecutionMode;
import reactor.core.publisher.Flux;
import java.io.ByteArrayInputStream;
@@ -33,7 +31,6 @@ import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
@Execution(ExecutionMode.SAME_THREAD)
class NamespaceFilesUtilsTest {
@Inject
RunContextFactory runContextFactory;

View File

@@ -5,17 +5,8 @@ import org.junit.jupiter.api.Test;
import java.util.List;
import static org.assertj.core.api.Assertions.assertThatObject;
import static org.hamcrest.MatcherAssert.assertThat;
class VersionTest {
@Test
void shouldCreateVersionFromIntegerGivenMajorVersion() {
Version version = Version.of(1);
Assertions.assertEquals(1, version.majorVersion());
}
@Test
void shouldCreateVersionFromStringGivenMajorVersion() {
Version version = Version.of("1");
@@ -30,27 +21,27 @@ class VersionTest {
}
@Test
void shouldCreateVersionFromStringGivenMajorMinorPatchVersion() {
void shouldCreateVersionFromStringGivenMajorMinorIncrementVersion() {
Version version = Version.of("1.2.3");
Assertions.assertEquals(1, version.majorVersion());
Assertions.assertEquals(2, version.minorVersion());
Assertions.assertEquals(3, version.patchVersion());
Assertions.assertEquals(3, version.incrementalVersion());
}
@Test
void shouldCreateVersionFromPrefixedStringGivenMajorMinorPatchVersion() {
void shouldCreateVersionFromPrefixedStringGivenMajorMinorIncrementVersion() {
Version version = Version.of("v1.2.3");
Assertions.assertEquals(1, version.majorVersion());
Assertions.assertEquals(2, version.minorVersion());
Assertions.assertEquals(3, version.patchVersion());
Assertions.assertEquals(3, version.incrementalVersion());
}
@Test
void shouldCreateVersionFromStringGivenMajorMinorPatchAndQualifierVersion() {
void shouldCreateVersionFromStringGivenMajorMinorIncrementAndQualifierVersion() {
Version version = Version.of("1.2.3-SNAPSHOT");
Assertions.assertEquals(1, version.majorVersion());
Assertions.assertEquals(2, version.minorVersion());
Assertions.assertEquals(3, version.patchVersion());
Assertions.assertEquals(3, version.incrementalVersion());
Assertions.assertEquals("SNAPSHOT", version.qualifier().toString());
}
@@ -59,7 +50,7 @@ class VersionTest {
Version version = Version.of("1.2.3-RC0-SNAPSHOT");
Assertions.assertEquals(1, version.majorVersion());
Assertions.assertEquals(2, version.minorVersion());
Assertions.assertEquals(3, version.patchVersion());
Assertions.assertEquals(3, version.incrementalVersion());
Assertions.assertEquals("RC0-SNAPSHOT", version.qualifier().toString());
}
@@ -85,13 +76,13 @@ class VersionTest {
}
@Test
void shouldGetLatestVersionGivenMajorMinorPatchVersions() {
void shouldGetLatestVersionGivenMajorMinorIncrementalVersions() {
Version result = Version.getLatest(Version.of("1.0.9"), Version.of("1.0.10"), Version.of("1.0.11"));
Assertions.assertEquals(Version.of("1.0.11"), result);
}
@Test
public void shouldGetOldestVersionGivenMajorMinorPatchVersions() {
public void shouldGetOldestVersionGivenMajorMinorIncrementalVersions() {
Version result = Version.getOldest(Version.of("1.0.9"), Version.of("1.0.10"), Version.of("1.0.11"));
Assertions.assertEquals(Version.of("1.0.9"), result);
}
@@ -144,50 +135,14 @@ class VersionTest {
}
@Test
public void shouldGetStableVersionGivenMajorMinorPatchVersion() {
// Given
List<Version> versions = List.of(Version.of("1.2.1"), Version.of("1.2.3"), Version.of("0.99.0"));
// When - Then
assertThatObject(Version.getStable(Version.of("1.2.1"), versions)).isEqualTo(Version.of("1.2.1"));
assertThatObject(Version.getStable(Version.of("1.2.0"), versions)).isNull();
assertThatObject(Version.getStable(Version.of("1.2.4"), versions)).isNull();
}
@Test
public void shouldGetStableGivenMajorAndMinorVersionOnly() {
// Given
List<Version> versions = List.of(Version.of("1.2.1"), Version.of("1.2.3"), Version.of("0.99.0"));
// When - Then
assertThatObject(Version.getStable(Version.of("1.2"), versions)).isEqualTo(Version.of("1.2.3"));
}
@Test
public void shouldGetStableGivenMajorVersionOnly() {
// Given
List<Version> versions = List.of(Version.of("1.2.1"), Version.of("1.2.3"), Version.of("0.99.0"));
// When - Then
assertThatObject(Version.getStable(Version.of("1"), versions)).isEqualTo(Version.of("1.2.3"));
public void shouldGetStableVersionGivenMajorMinorVersions() {
Version result = Version.getStable(Version.of("1.2.0"), List.of(Version.of("1.2.1"), Version.of("1.2.2"), Version.of("0.99.0")));
Assertions.assertEquals(Version.of("1.2.2"), result);
}
@Test
public void shouldGetNullForStableGivenMajorAndMinorVersionOnly() {
// Given
List<Version> versions = List.of(Version.of("1.2.1"), Version.of("1.2.3"), Version.of("0.99.0"));
// When - Then
assertThatObject(Version.getStable(Version.of("2.0"), versions)).isNull();
assertThatObject(Version.getStable(Version.of("0.1"), versions)).isNull();
}
@Test
public void shouldGetNullForStableGivenMajorVersionOnly() {
// Given
List<Version> versions = List.of(Version.of("1.2.1"), Version.of("1.2.3"), Version.of("0.99.0"));
// When - Then
assertThatObject(Version.getStable(Version.of("2"), versions)).isNull();
public void shouldGetNullForStableVersionGivenNoCompatibleVersions() {
Version result = Version.getStable(Version.of("1.2.0"), List.of(Version.of("1.3.0"), Version.of("2.0.0"), Version.of("0.99.0")));
Assertions.assertNull(result);
}
}
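A short usage sketch mirroring the renamed assertions above, assuming Version is in scope (same package as the test); the accessor is incrementalVersion() rather than the patchVersion() name used by the old test titles.

// Usage sketch only; values follow the assertions in VersionTest.
class VersionUsageSketch {
    static void demo() {
        Version version = Version.of("1.2.3-SNAPSHOT");
        int major = version.majorVersion();                // 1
        int minor = version.minorVersion();                // 2
        int increment = version.incrementalVersion();      // 3
        String qualifier = version.qualifier().toString(); // "SNAPSHOT"

        Version latest = Version.getLatest(Version.of("1.0.9"), Version.of("1.0.10"), Version.of("1.0.11")); // 1.0.11
        Version oldest = Version.getOldest(Version.of("1.0.9"), Version.of("1.0.10"), Version.of("1.0.11")); // 1.0.9
    }
}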

View File

@@ -1,6 +1,6 @@
package io.kestra.plugin.core.execution;
import io.kestra.core.context.TestRunContextFactory;
import com.google.common.collect.ImmutableMap;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.models.executions.statistics.Flow;
import io.kestra.core.models.flows.State;
@@ -8,6 +8,7 @@ import io.kestra.core.models.property.Property;
import io.kestra.core.repositories.AbstractExecutionRepositoryTest;
import io.kestra.core.repositories.ExecutionRepositoryInterface;
import io.kestra.core.runners.RunContext;
import io.kestra.core.runners.RunContextFactory;
import io.kestra.core.utils.IdUtils;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
@@ -19,10 +20,8 @@ import static org.assertj.core.api.Assertions.assertThat;
@KestraTest
class CountTest {
public static final String NAMESPACE = "io.kestra.unittest";
@Inject
TestRunContextFactory runContextFactory;
RunContextFactory runContextFactory;
@Inject
ExecutionRepositoryInterface executionRepository;
@@ -30,10 +29,8 @@ class CountTest {
@Test
void run() throws Exception {
var tenant = TestsUtils.randomTenant(this.getClass().getSimpleName());
for (int i = 0; i < 28; i++) {
executionRepository.save(AbstractExecutionRepositoryTest.builder(
tenant,
i < 5 ? State.Type.RUNNING : (i < 8 ? State.Type.FAILED : State.Type.SUCCESS),
i < 4 ? "first" : (i < 10 ? "second" : "third")
).build());
@@ -52,8 +49,7 @@ class CountTest {
.endDate(new Property<>("{{ now() }}"))
.build();
RunContext runContext = runContextFactory.of("id", NAMESPACE, tenant);
RunContext runContext = TestsUtils.mockRunContext(runContextFactory, task, ImmutableMap.of("namespace", "io.kestra.unittest"));
Count.Output run = task.run(runContext);
assertThat(run.getResults().size()).isEqualTo(2);

View File

@@ -1,15 +1,38 @@
package io.kestra.plugin.core.execution;
import static org.assertj.core.api.Assertions.assertThat;
import io.kestra.core.junit.annotations.ExecuteFlow;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.junit.annotations.LoadFlows;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.Flow;
import io.kestra.core.models.flows.State;
import io.kestra.core.queues.QueueException;
import io.kestra.core.queues.QueueFactoryInterface;
import io.kestra.core.queues.QueueInterface;
import io.kestra.core.repositories.FlowRepositoryInterface;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import org.junit.jupiter.api.Test;
import reactor.core.publisher.Flux;
import java.util.Optional;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertTrue;
@KestraTest(startRunner = true)
class ExitTest {
@Inject
@Named(QueueFactoryInterface.EXECUTION_NAMED)
private QueueInterface<Execution> executionQueue;
@Inject
private FlowRepositoryInterface flowRepository;
@Test
@ExecuteFlow("flows/valids/exit.yaml")
@@ -20,13 +43,30 @@ class ExitTest {
}
@Test
@ExecuteFlow("flows/valids/exit-killed.yaml")
void shouldExitAndKillTheExecution(Execution execution) {
assertThat(execution.getState().getCurrent()).isEqualTo(State.Type.KILLED);
assertThat(execution.getTaskRunList()).hasSize(2);
assertThat(execution.getTaskRunList().getFirst().getState().getCurrent()).isEqualTo(State.Type.KILLED);
assertThat(execution.getTaskRunList().get(1).getState().getCurrent()).isEqualTo(State.Type.KILLED);
@LoadFlows("flows/valids/exit-killed.yaml")
void shouldExitAndKillTheExecution() throws QueueException, InterruptedException {
CountDownLatch countDownLatch = new CountDownLatch(2); // wait for two KILLED execution updates to be sure all tasks have reached KILLED
AtomicReference<Execution> killedExecution = new AtomicReference<>();
Flux<Execution> receive = TestsUtils.receive(executionQueue, either -> {
Execution execution = either.getLeft();
if (execution.getFlowId().equals("exit-killed") && execution.getState().getCurrent().isKilled()) {
killedExecution.set(execution);
countDownLatch.countDown();
}
});
// we cannot use the runnerUtils as it may not see the RUNNING state before the execution is killed
Flow flow = flowRepository.findById(MAIN_TENANT, "io.kestra.tests", "exit-killed", Optional.empty()).orElseThrow();
Execution execution = Execution.newExecution(flow, null, null, Optional.empty());
executionQueue.emit(execution);
assertTrue(countDownLatch.await(1, TimeUnit.MINUTES));
assertThat(killedExecution.get()).isNotNull();
assertThat(killedExecution.get().getState().getCurrent()).isEqualTo(State.Type.KILLED);
assertThat(killedExecution.get().getTaskRunList().size()).isEqualTo(2);
assertThat(killedExecution.get().getTaskRunList().getFirst().getState().getCurrent()).isEqualTo(State.Type.KILLED);
assertThat(killedExecution.get().getTaskRunList().get(1).getState().getCurrent()).isEqualTo(State.Type.KILLED);
receive.blockLast();
}
@Test

View File

@@ -8,7 +8,7 @@ import io.kestra.core.junit.annotations.LoadFlows;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.State;
import io.kestra.core.queues.QueueException;
import io.kestra.core.runners.TestRunnerUtils;
import io.kestra.core.runners.RunnerUtils;
import jakarta.inject.Inject;
import java.time.Duration;
import java.util.Map;
@@ -19,7 +19,7 @@ import org.junit.jupiter.api.Test;
public class FailTest {
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
@Test
@LoadFlows({"flows/valids/fail-on-switch.yaml"})
@@ -33,9 +33,9 @@ public class FailTest {
}
@Test
@LoadFlows(value = {"flows/valids/fail-on-condition.yaml"}, tenantId = "fail")
@LoadFlows({"flows/valids/fail-on-condition.yaml"})
void failOnCondition() throws TimeoutException, QueueException{
Execution execution = runnerUtils.runOne("fail", "io.kestra.tests", "fail-on-condition", null,
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "fail-on-condition", null,
(f, e) -> Map.of("param", "fail") , Duration.ofSeconds(20));
assertThat(execution.getTaskRunList()).hasSize(2);
@@ -44,9 +44,9 @@ public class FailTest {
}
@Test
@LoadFlows(value = {"flows/valids/fail-on-condition.yaml"}, tenantId = "success")
@LoadFlows({"flows/valids/fail-on-condition.yaml"})
void dontFailOnCondition() throws TimeoutException, QueueException{
Execution execution = runnerUtils.runOne("success", "io.kestra.tests", "fail-on-condition", null,
Execution execution = runnerUtils.runOne(MAIN_TENANT, "io.kestra.tests", "fail-on-condition", null,
(f, e) -> Map.of("param", "success") , Duration.ofSeconds(20));
assertThat(execution.getTaskRunList()).hasSize(3);

View File

@@ -5,7 +5,7 @@ import io.kestra.core.junit.annotations.LoadFlows;
import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.State;
import io.kestra.core.repositories.ExecutionRepositoryInterface;
import io.kestra.core.runners.TestRunnerUtils;
import io.kestra.core.runners.RunnerUtils;
import io.kestra.core.utils.Await;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;
@@ -20,7 +20,7 @@ import static org.assertj.core.api.Assertions.assertThat;
class ResumeTest {
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
@Inject
private ExecutionRepositoryInterface executionRepository;

View File

@@ -6,7 +6,7 @@ import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.junit.annotations.LoadFlows;
import io.kestra.core.queues.QueueException;
import io.kestra.core.runners.FlowInputOutput;
import io.kestra.core.runners.TestRunnerUtils;
import io.kestra.core.runners.RunnerUtils;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;
import io.kestra.core.models.executions.Execution;
@@ -14,6 +14,7 @@ import io.kestra.core.models.flows.State;
import java.util.concurrent.TimeoutException;
import static io.kestra.core.tenant.TenantService.MAIN_TENANT;
import static org.assertj.core.api.Assertions.assertThat;
@KestraTest(startRunner = true)
@@ -21,7 +22,7 @@ class AllowFailureTest {
@Inject
private FlowInputOutput flowIO;
@Inject
protected TestRunnerUtils runnerUtils;
protected RunnerUtils runnerUtils;
@Test
@ExecuteFlow("flows/valids/allow-failure.yaml")
@@ -34,10 +35,10 @@ class AllowFailureTest {
}
@Test
@LoadFlows(value = {"flows/valids/allow-failure.yaml"}, tenantId = "fail")
@LoadFlows({"flows/valids/allow-failure.yaml"})
void failed() throws TimeoutException, QueueException {
Execution execution = runnerUtils.runOne(
"fail",
MAIN_TENANT,
"io.kestra.tests",
"allow-failure",
null,

View File

@@ -7,7 +7,7 @@ import io.kestra.core.models.flows.State;
import io.kestra.core.queues.QueueException;
import io.kestra.core.queues.QueueFactoryInterface;
import io.kestra.core.queues.QueueInterface;
import io.kestra.core.runners.TestRunnerUtils;
import io.kestra.core.runners.RunnerUtils;
import io.kestra.core.utils.TestsUtils;
import io.kestra.core.junit.annotations.LoadFlows;
import jakarta.inject.Inject;
@@ -31,7 +31,7 @@ class CorrelationIdTest {
@Named(QueueFactoryInterface.EXECUTION_NAMED)
private QueueInterface<Execution> executionQueue;
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
@Test
@LoadFlows({"flows/valids/subflow-parent.yaml",

View File

@@ -12,7 +12,7 @@ import io.kestra.core.models.flows.State;
import io.kestra.core.models.validations.ModelValidator;
import io.kestra.core.queues.QueueException;
import io.kestra.core.runners.FlowInputOutput;
import io.kestra.core.runners.TestRunnerUtils;
import io.kestra.core.runners.RunnerUtils;
import io.kestra.core.serializers.YamlParser;
import io.kestra.core.utils.TestsUtils;
import jakarta.inject.Inject;
@@ -33,7 +33,7 @@ public class DagTest {
ModelValidator modelValidator;
@Inject
protected TestRunnerUtils runnerUtils;
protected RunnerUtils runnerUtils;
@Inject
private FlowInputOutput flowIO;

View File

@@ -4,7 +4,6 @@ import io.kestra.core.junit.annotations.ExecuteFlow;
import io.kestra.core.junit.annotations.KestraTest;
import io.kestra.core.junit.annotations.LoadFlows;
import io.kestra.core.queues.QueueException;
import io.kestra.core.runners.TestRunnerUtils;
import io.kestra.core.utils.TestsUtils;
import org.junit.jupiter.api.Test;
import io.kestra.core.exceptions.InternalException;
@@ -14,6 +13,7 @@ import io.kestra.core.models.executions.TaskRun;
import io.kestra.core.models.flows.State;
import io.kestra.core.queues.QueueFactoryInterface;
import io.kestra.core.queues.QueueInterface;
import io.kestra.core.runners.RunnerUtils;
import java.time.Duration;
import java.util.*;
@@ -34,7 +34,7 @@ public class EachSequentialTest {
QueueInterface<LogEntry> logQueue;
@Inject
private TestRunnerUtils runnerUtils;
private RunnerUtils runnerUtils;
@Test
@ExecuteFlow("flows/valids/each-sequential.yaml")
@@ -92,7 +92,7 @@ public class EachSequentialTest {
EachSequentialTest.eachNullTest(runnerUtils, logQueue);
}
public static void eachNullTest(TestRunnerUtils runnerUtils, QueueInterface<LogEntry> logQueue) throws TimeoutException, QueueException {
public static void eachNullTest(RunnerUtils runnerUtils, QueueInterface<LogEntry> logQueue) throws TimeoutException, QueueException {
List<LogEntry> logs = new CopyOnWriteArrayList<>();
Flux<LogEntry> receive = TestsUtils.receive(logQueue, either -> logs.add(either.getLeft()));

View File

@@ -6,8 +6,9 @@ import io.kestra.core.models.executions.Execution;
import io.kestra.core.models.flows.State;
import io.kestra.core.queues.QueueException;
import io.kestra.core.runners.FlowInputOutput;
import io.kestra.core.runners.TestRunnerUtils;
import io.kestra.core.runners.RunnerUtils;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
import java.time.Duration;
@@ -21,9 +22,8 @@ import static org.assertj.core.api.Assertions.assertThat;
class FinallyTest {
public static final String NAMESPACE = "io.kestra.tests";
private static final String TENANT_ID = "tenant1";
@Inject
protected TestRunnerUtils runnerUtils;
protected RunnerUtils runnerUtils;
@Inject
private FlowInputOutput flowIO;
@@ -46,10 +46,10 @@ class FinallyTest {
}
@Test
@LoadFlows(value = {"flows/valids/finally-sequential.yaml"}, tenantId = TENANT_ID)
@LoadFlows({"flows/valids/finally-sequential.yaml"})
void sequentialWithErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
TENANT_ID,
MAIN_TENANT,
NAMESPACE, "finally-sequential", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", true)),
Duration.ofSeconds(60)
@@ -92,10 +92,10 @@ class FinallyTest {
}
@Test
@LoadFlows(value = {"flows/valids/finally-sequential-error.yaml"}, tenantId = TENANT_ID)
@LoadFlows({"flows/valids/finally-sequential-error.yaml"})
void sequentialErrorBlockWithErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
TENANT_ID,
MAIN_TENANT,
NAMESPACE, "finally-sequential-error", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", true)),
Duration.ofSeconds(60)
@@ -128,10 +128,10 @@ class FinallyTest {
}
@Test
@LoadFlows(value = {"flows/valids/finally-allowfailure.yaml"}, tenantId = TENANT_ID)
@LoadFlows({"flows/valids/finally-allowfailure.yaml"})
void allowFailureWithErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
TENANT_ID,
MAIN_TENANT,
NAMESPACE, "finally-allowfailure", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", true)),
Duration.ofSeconds(60)
@@ -164,10 +164,10 @@ class FinallyTest {
}
@Test
@LoadFlows(value = {"flows/valids/finally-parallel.yaml"}, tenantId = TENANT_ID)
@LoadFlows({"flows/valids/finally-parallel.yaml"})
void parallelWithErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
TENANT_ID,
MAIN_TENANT,
NAMESPACE, "finally-parallel", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", true)),
Duration.ofSeconds(60)
@@ -183,10 +183,10 @@ class FinallyTest {
}
@Test
@LoadFlows(value = {"flows/valids/finally-foreach.yaml"}, tenantId = TENANT_ID)
@LoadFlows({"flows/valids/finally-foreach.yaml"})
void forEachWithoutErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
TENANT_ID,
MAIN_TENANT,
NAMESPACE, "finally-foreach", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", false)),
Duration.ofSeconds(60)
@@ -236,10 +236,10 @@ class FinallyTest {
}
@Test
@LoadFlows(value = {"flows/valids/finally-eachparallel.yaml"}, tenantId = TENANT_ID)
@LoadFlows({"flows/valids/finally-eachparallel.yaml"})
void eachParallelWithErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
TENANT_ID,
MAIN_TENANT,
NAMESPACE, "finally-eachparallel", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", true)),
Duration.ofSeconds(60)
@@ -255,10 +255,10 @@ class FinallyTest {
}
@Test
@LoadFlows(value = {"flows/valids/finally-dag.yaml"}, tenantId = TENANT_ID)
@LoadFlows({"flows/valids/finally-dag.yaml"})
void dagWithoutErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
TENANT_ID,
MAIN_TENANT,
NAMESPACE, "finally-dag", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", false)),
Duration.ofSeconds(60)
@@ -308,10 +308,10 @@ class FinallyTest {
}
@Test
@LoadFlows(value = {"flows/valids/finally-flow.yaml"}, tenantId = TENANT_ID)
@LoadFlows({"flows/valids/finally-flow.yaml"})
void flowWithErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
TENANT_ID,
MAIN_TENANT,
NAMESPACE, "finally-flow", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", true)),
Duration.ofSeconds(60)
@@ -342,13 +342,13 @@ class FinallyTest {
}
@Test
@LoadFlows(value = {"flows/valids/finally-flow-error.yaml"}, tenantId = TENANT_ID)
@LoadFlows({"flows/valids/finally-flow-error.yaml"})
void flowErrorBlockWithErrors() throws QueueException, TimeoutException {
Execution execution = runnerUtils.runOne(
TENANT_ID,
MAIN_TENANT,
NAMESPACE, "finally-flow-error", null,
(flow, execution1) -> flowIO.readExecutionInputs(flow, execution1, Map.of("failed", true)),
Duration.ofSeconds(20)
Duration.ofSeconds(60)
);
assertThat(execution.getTaskRunList()).hasSize(6);

Some files were not shown because too many files have changed in this diff.