Remove octavia-cli (#33950)
.github/labeler.yml (vendored, 4 changed lines)
@@ -11,10 +11,6 @@ area/documentation:
  - docs/*
  - docs/**/*

area/octavia-cli:
  - octavia-cli/*
  - octavia-cli/**/*

CDK:
  - airbyte-cdk/*
  - airbyte-cdk/**/*
.github/workflows/release-airbyte-os.yml (vendored, 23 changed lines)
@@ -198,29 +198,6 @@ jobs:
          PART_TO_BUMP: ${{ github.event.inputs.partToBump }}
        run: ./tools/bin/release_version_octavia.sh

      - name: Publish Python Package to test.pypi.org
        if: github.event.inputs.skip-publish-test != 'true'
        uses: mariamrf/py-package-publish-action@v1.1.0
        with:
          # specify the same version as in ~/.python-version
          python_version: "3.10"
          pip_version: "23.2"
          subdir: "octavia-cli/"
        env:
          TWINE_PASSWORD: ${{ secrets.TWINE_PASSWORD }}
          TWINE_USERNAME: ${{ secrets.TWINE_USERNAME }}
          TWINE_REPOSITORY_URL: "https://test.pypi.org/legacy/"
      - name: Publish Python Package
        uses: mariamrf/py-package-publish-action@v1.1.0
        with:
          # specify the same version as in ~/.python-version
          python_version: "3.10"
          pip_version: "23.2"
          subdir: "octavia-cli/"
        env:
          TWINE_PASSWORD: ${{ secrets.TWINE_PASSWORD }}
          TWINE_USERNAME: ${{ secrets.TWINE_USERNAME }}

  # In case of self-hosted EC2 errors, remove this block.
  stop-release-airbyte-runner:
    name: "Release Airbyte: Stop EC2 Runner"
@@ -1,715 +0,0 @@
---
products: oss-community
---

# CLI documentation

:::caution
The Octavia CLI is an alpha, unofficial CLI that won't be maintained.
:::

:::tip Recommendation
We recommend all users leverage the official [Airbyte Terraform Provider](https://reference.airbyte.com/reference/using-the-terraform-provider) instead of this CLI.
:::

## What is `octavia` CLI?

Octavia CLI is a tool to manage Airbyte configurations in YAML.
It has the following features:

- Scaffolding of a readable directory architecture that will host the YAML configs (`octavia init`).
- Auto-generation of YAML config files that match the resources' schemas (`octavia generate`).
- Management of Airbyte resources with YAML config files.
- Safe resource updates through diff display and validation (`octavia apply`).
- Simple secret management to avoid versioning credentials.

## Why should I use `octavia` CLI?

A CLI gives users the freedom to use the tool in whatever context and use case they have.
These are some non-exhaustive use cases `octavia` can be convenient for:

- Managing Airbyte configurations with a CLI instead of a web UI.
- Versioning Airbyte configurations in Git.
- Updating Airbyte configurations in an automated deployment pipeline.
- Integrating the Airbyte configuration deployment into a DevOps tooling stack: Helm, Ansible, etc.
- Streamlining the deployment of Airbyte configurations to multiple Airbyte instances.

Readers can refer to our [open GitHub issues](https://github.com/airbytehq/airbyte/issues?q=is%3Aopen+is%3Aissue+label%3Aarea%2Foctavia-cli) to check the ongoing work on this project.

## Table of contents

- [Workflow](#workflow)
- [Secret management](#secret-management)
- [Install](#install)
- [Commands reference](#commands-reference)
- [Contributing](#contributing)
- [Telemetry](#telemetry)
- [Changelog](#changelog)

## Workflow

### 1. Generate local YAML files for sources or destinations

1. Retrieve the _definition id_ of the connector you want to use with the `octavia list` command.
2. Generate a YAML configuration by running `octavia generate source <DEFINITION_ID> <SOURCE_NAME>` or `octavia generate destination <DEFINITION_ID> <DESTINATION_NAME>`.

### 2. Edit your local YAML configurations

1. Edit the generated YAML configurations according to your needs.
2. Use the [secret management feature](#secret-management) to avoid storing credentials in the YAML files.

### 3. Create the declared sources or destinations on your Airbyte instance

1. Run `octavia apply` to create the **sources** and **destinations**.

### 4. Generate connections

1. Run `octavia generate connection --source <PATH_TO_SOURCE_CONFIG> --destination <PATH_TO_DESTINATION_CONFIG> <CONNECTION_NAME>` to create a YAML configuration for a new connection.
2. Edit the created configuration file according to your needs: change the scheduling or the list of replicated streams.

### 5. Create the declared connections

1. Run `octavia apply` to create the newly declared connections on your Airbyte instance.

### 6. Update your configurations

Changes in your local configurations can be propagated to your Airbyte instance using `octavia apply`. You will be prompted to validate the changes. You can bypass the validation step using the `--force` flag.
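
Putting these six steps together, a minimal end-to-end session could look like the sketch below. Every command is documented in the [commands reference](#commands-reference); the definition ids and resource names are reused from the examples later on this page.

```bash
# 1. Find the definition id of the connector you want to use.
octavia list connectors sources

# 2. Generate local YAML configurations for a source and a destination.
octavia generate source d8540a80-6120-485d-b7d6-272bca477d9b weather
octavia generate destination 25c5221d-dce2-4163-ade9-739ef790f503 my_db

# 3. Edit the generated files, then create the resources on your Airbyte instance.
octavia apply

# 4. Generate a connection between the source and the destination.
octavia generate connection --source sources/weather/configuration.yaml --destination destinations/my_db/configuration.yaml weather_to_pg

# 5./6. Create the connection; re-run apply whenever your configurations change.
octavia apply
```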

## Secret management

Source and destination configurations have credential fields that you **do not want to store as plain text in your VCS**.
`octavia` offers secret management through environment variable expansion:

```yaml
configuration:
  password: ${MY_PASSWORD}
```

If you have set a `MY_PASSWORD` environment variable, `octavia apply` will load its value into the `password` field.
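
For example, a minimal sketch of that flow in a bash shell:

```bash
# Export the secret before running octavia; the ${MY_PASSWORD} placeholder
# in the YAML configuration is expanded from this environment variable.
export MY_PASSWORD='my-secret-password'
octavia apply
```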

## Install

### Requirements

We decided to package the CLI in a Docker image with portability in mind.
**[Please install and run Docker if you have not already](https://docs.docker.com/get-docker/)**.

### As a command available in your bash profile

```bash
curl -s -o- https://raw.githubusercontent.com/airbytehq/airbyte/master/octavia-cli/install.sh | bash
```

This script:

1. Pulls the [octavia-cli image](https://hub.docker.com/r/airbyte/octavia-cli/tags) from our Docker registry.
2. Creates an `octavia` alias in your profile.
3. Creates a `~/.octavia` file whose values are mapped to the octavia container's environment variables.
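
A quick way to check the installation, assuming your Airbyte instance is running at the default URL, is to run any read-only command, for example:

```bash
# Should print the source connectors available on your Airbyte instance.
octavia list connectors sources
```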

### Using `docker run`

```bash
touch ~/.octavia # Create a file to store env variables that will be mapped to the octavia-cli container
mkdir my_octavia_project_directory # Create your octavia project directory where YAML configurations will be stored.
docker run --name octavia-cli -i --rm -v "$(pwd)/my_octavia_project_directory":/home/octavia-project --network host --user $(id -u):$(id -g) --env-file ~/.octavia airbyte/octavia-cli:0.40.32
```
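
If you prefer not to use the install script, you can wrap this `docker run` invocation in a shell alias yourself. This is a hypothetical sketch of such an alias, not the exact alias the script creates:

```bash
# Mount the current directory as the octavia project directory.
alias octavia='docker run -i --rm -v "$(pwd)":/home/octavia-project --network host --user $(id -u):$(id -g) --env-file ~/.octavia airbyte/octavia-cli:0.40.32'
```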

### Using `docker-compose`

Using octavia in docker-compose could be convenient for an automatic `apply` on start-up.

Add another entry in the services key of your Airbyte `docker-compose.yml`:

```yaml
services:
  # . . .
  octavia-cli:
    image: airbyte/octavia-cli:latest
    command: apply --force
    env_file:
      - ~/.octavia # Use a local env file to store variables that will be mapped to the octavia-cli container
    volumes:
      - <path_to_your_local_octavia_project_directory>:/home/octavia-project
    depends_on:
      - webapp
```

Other commands besides `apply` can be run like so:

```bash
docker compose run octavia-cli <command>
```
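
For instance, substituting one of the documented subcommands for `<command>`:

```bash
docker compose run octavia-cli list connectors sources
```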

## Commands reference

### `octavia` command flags

| **Flag**                                  | **Description**                                                                    | **Env Variable**           | **Default**                                             |
| ----------------------------------------- | ---------------------------------------------------------------------------------- | -------------------------- | ------------------------------------------------------- |
| `--airbyte-url`                           | Airbyte instance URL.                                                              | `AIRBYTE_URL`              | `http://localhost:8000`                                 |
| `--airbyte-username`                      | Airbyte instance username (basic auth).                                            | `AIRBYTE_USERNAME`         | `airbyte`                                               |
| `--airbyte-password`                      | Airbyte instance password (basic auth).                                            | `AIRBYTE_PASSWORD`         | `password`                                              |
| `--workspace-id`                          | Airbyte workspace id.                                                              | `AIRBYTE_WORKSPACE_ID`     | The first workspace id found on your Airbyte instance.  |
| `--enable-telemetry/--disable-telemetry`  | Enable or disable the sending of telemetry data.                                   | `OCTAVIA_ENABLE_TELEMETRY` | True                                                    |
| `--api-http-header`                       | HTTP header name/value pairs passed while calling Airbyte's API.                   | None                       | None                                                    |
| `--api-http-headers-file-path`            | Path to the YAML file that contains custom HTTP headers to send to Airbyte's API.  | None                       | None                                                    |

#### Using custom HTTP headers

You can set custom HTTP headers to send to Airbyte's API with the `--api-http-header` option:

```bash
octavia --api-http-header Header-Name Header-Value --api-http-header Header-Name-2 Header-Value-2 list connectors sources
```

You can also use a custom YAML file (one is already created on init in `api_http_headers.yaml`) to declare the HTTP headers to send to the API:

```yaml
headers:
  Authorization: Bearer my-secret-token
  User-Agent: octavia-cli/0.0.0
```

Environment variable expansion is available in this YAML file:

```yaml
headers:
  Authorization: Bearer ${MY_API_TOKEN}
```

**Option-based headers override file-based headers if a header is declared in both.**
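
To use such a file, pass its path with the `--api-http-headers-file-path` flag from the table above, for example with the file created on init:

```bash
octavia --api-http-headers-file-path api_http_headers.yaml list connectors sources
```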

### `octavia` subcommands

| **Command**                                 | **Usage**                                                                                    |
| ------------------------------------------- | -------------------------------------------------------------------------------------------- |
| **`octavia init`**                          | Initialize required directories for the project.                                             |
| **`octavia list connectors sources`**       | List all source connectors available on the remote Airbyte instance.                         |
| **`octavia list connectors destinations`**  | List all destination connectors available on the remote Airbyte instance.                    |
| **`octavia list workspace sources`**        | List existing sources in the current Airbyte workspace.                                      |
| **`octavia list workspace destinations`**   | List existing destinations in the current Airbyte workspace.                                 |
| **`octavia list workspace connections`**    | List existing connections in the current Airbyte workspace.                                  |
| **`octavia get source`**                    | Get the JSON representation of an existing source in the current Airbyte workspace.          |
| **`octavia get destination`**               | Get the JSON representation of an existing destination in the current Airbyte workspace.     |
| **`octavia get connection`**                | Get the JSON representation of an existing connection in the current Airbyte workspace.      |
| **`octavia import all`**                    | Import all existing sources, destinations and connections to manage them with octavia-cli.   |
| **`octavia import source`**                 | Import an existing source to manage it with octavia-cli.                                     |
| **`octavia import destination`**            | Import an existing destination to manage it with octavia-cli.                                |
| **`octavia import connection`**             | Import an existing connection to manage it with octavia-cli.                                 |
| **`octavia generate source`**               | Generate a local YAML configuration for a new source.                                        |
| **`octavia generate destination`**          | Generate a local YAML configuration for a new destination.                                   |
| **`octavia generate connection`**           | Generate a local YAML configuration for a new connection.                                    |
| **`octavia apply`**                         | Create or update Airbyte remote resources according to local YAML configurations.            |

#### `octavia init`

The `octavia init` command scaffolds the required directory architecture for running the `octavia generate` and `octavia apply` commands.

**Example**:

```bash
$ mkdir my_octavia_project && cd my_octavia_project
$ octavia init
🐙 - Octavia is targetting your Airbyte instance running at http://localhost:8000 on workspace e1f46f7d-5354-4200-aed6-7816015ca54b.
🐙 - Project is not yet initialized.
🔨 - Initializing the project.
✅ - Created the following directories: sources, destinations, connections.
$ ls
connections destinations sources
```

#### `octavia list connectors sources`

List all the source connectors currently available on your Airbyte instance.

**Example**:

```bash
$ octavia list connectors sources
NAME                   DOCKER REPOSITORY                     DOCKER IMAGE TAG  SOURCE DEFINITION ID
Airtable               airbyte/source-airtable               0.1.1             14c6e7ea-97ed-4f5e-a7b5-25e9a80b8212
AWS CloudTrail         airbyte/source-aws-cloudtrail         0.1.4             6ff047c0-f5d5-4ce5-8c81-204a830fa7e1
Amazon Ads             airbyte/source-amazon-ads             0.1.3             c6b0a29e-1da9-4512-9002-7bfd0cba2246
Amazon Seller Partner  airbyte/source-amazon-seller-partner  0.2.16            e55879a8-0ef8-4557-abcf-ab34c53ec460
```

#### `octavia list connectors destinations`

List all the destination connectors currently available on your Airbyte instance.

**Example**:

```bash
$ octavia list connectors destinations
NAME                                  DOCKER REPOSITORY                          DOCKER IMAGE TAG  DESTINATION DEFINITION ID
Azure Blob Storage                    airbyte/destination-azure-blob-storage     0.1.3             b4c5d105-31fd-4817-96b6-cb923bfc04cb
Amazon SQS                            airbyte/destination-amazon-sqs             0.1.0             0eeee7fb-518f-4045-bacc-9619e31c43ea
BigQuery                              airbyte/destination-bigquery               0.6.11            22f6c74f-5699-40ff-833c-4a879ea40133
BigQuery (denormalized typed struct)  airbyte/destination-bigquery-denormalized  0.2.10            079d5540-f236-4294-ba7c-ade8fd918496
```

#### `octavia list workspace sources`

List all the sources existing on your targeted Airbyte instance.

**Example**:

```bash
$ octavia list workspace sources
NAME     SOURCE NAME  SOURCE ID
weather  OpenWeather  c4aa8550-2122-4a33-9a21-adbfaa638544
```

#### `octavia list workspace destinations`

List all the destinations existing on your targeted Airbyte instance.

**Example**:

```bash
$ octavia list workspace destinations
NAME   DESTINATION NAME  DESTINATION ID
my_db  Postgres          c0c977c2-48e7-46fe-9f57-576285c26d42
```

#### `octavia list workspace connections`

List all the connections existing on your targeted Airbyte instance.

**Example**:

```bash
$ octavia list workspace connections
NAME           CONNECTION ID                         STATUS  SOURCE ID                             DESTINATION ID
weather_to_pg  a4491317-153e-436f-b646-0b39338f9aab  active  c4aa8550-2122-4a33-9a21-adbfaa638544  c0c977c2-48e7-46fe-9f57-576285c26d42
```

#### `octavia get source <SOURCE_ID> or <SOURCE_NAME>`

Get an existing source in the current Airbyte workspace. You can use a source ID or name.

| **Argument**  | **Description**  |
| ------------- | ---------------- |
| `SOURCE_ID`   | The source id.   |
| `SOURCE_NAME` | The source name. |

**Examples**:

```bash
$ octavia get source c0c977c2-48e7-46fe-9f57-576285c26d42
{'connection_configuration': {'key': '**********',
                              'start_date': '2010-01-01T00:00:00.000Z',
                              'token': '**********'},
 'name': 'Pokemon',
 'source_definition_id': 'b08e4776-d1de-4e80-ab5c-1e51dad934a2',
 'source_id': 'c0c977c2-48e7-46fe-9f57-576285c26d42',
 'source_name': 'My Poke',
 'workspace_id': 'c4aa8550-2122-4a33-9a21-adbfaa638544'}
```

```bash
$ octavia get source "My Poke"
{'connection_configuration': {'key': '**********',
                              'start_date': '2010-01-01T00:00:00.000Z',
                              'token': '**********'},
 'name': 'Pokemon',
 'source_definition_id': 'b08e4776-d1de-4e80-ab5c-1e51dad934a2',
 'source_id': 'c0c977c2-48e7-46fe-9f57-576285c26d42',
 'source_name': 'My Poke',
 'workspace_id': 'c4aa8550-2122-4a33-9a21-adbfaa638544'}
```

#### `octavia get destination <DESTINATION_ID> or <DESTINATION_NAME>`

Get an existing destination in the current Airbyte workspace. You can use a destination ID or name.

| **Argument**       | **Description**       |
| ------------------ | --------------------- |
| `DESTINATION_ID`   | The destination id.   |
| `DESTINATION_NAME` | The destination name. |

**Examples**:

```bash
$ octavia get destination c0c977c2-48e7-46fe-9f57-576285c26d42
{
  "destinationDefinitionId": "c0c977c2-48e7-46fe-9f57-576285c26d42",
  "destinationId": "18102e7c-5160-4000-841b-15e8ec48c301",
  "workspaceId": "18102e7c-5160-4000-883a-30bc7cd65601",
  "connectionConfiguration": {
    "user": "charles"
  },
  "name": "pg",
  "destinationName": "Postgres"
}
```

```bash
$ octavia get destination pg
{
  "destinationDefinitionId": "18102e7c-5160-4000-821f-4d7cfdf87201",
  "destinationId": "18102e7c-5160-4000-841b-15e8ec48c301",
  "workspaceId": "18102e7c-5160-4000-883a-30bc7cd65601",
  "connectionConfiguration": {
    "user": "charles"
  },
  "name": "string",
  "destinationName": "string"
}
```

#### `octavia get connection <CONNECTION_ID> or <CONNECTION_NAME>`

Get an existing connection in the current Airbyte workspace. You can use a connection ID or name.

| **Argument**      | **Description**      |
| ----------------- | -------------------- |
| `CONNECTION_ID`   | The connection id.   |
| `CONNECTION_NAME` | The connection name. |

**Examples**:

```bash
$ octavia get connection c0c977c2-48e7-46fe-9f57-576285c26d42
{
  "connectionId": "c0c977c2-48e7-46fe-9f57-576285c26d42",
  "name": "Poke To PG",
  "namespaceDefinition": "source",
  "namespaceFormat": "${SOURCE_NAMESPACE}",
  "prefix": "string",
  "sourceId": "18102e7c-5340-4000-8eaa-4a86f844b101",
  "destinationId": "18102e7c-5340-4000-8e58-6bed49c24b01",
  "operationIds": [
    "18102e7c-5340-4000-8ef0-f35c05a49a01"
  ],
  "syncCatalog": {
    "streams": [
      {
        "stream": {
          "name": "string",
          "jsonSchema": {},
          "supportedSyncModes": [
            "full_refresh"
          ],
          "sourceDefinedCursor": false,
          "defaultCursorField": [
            "string"
          ],
          "sourceDefinedPrimaryKey": [
            [
              "string"
            ]
          ],
          "namespace": "string"
        },
        "config": {
          "syncMode": "full_refresh",
          "cursorField": [
            "string"
          ],
          "destinationSyncMode": "append",
          "primaryKey": [
            [
              "string"
            ]
          ],
          "aliasName": "string",
          "selected": false
        }
      }
    ]
  },
  "schedule": {
    "units": 0,
    "timeUnit": "minutes"
  },
  "status": "active",
  "resourceRequirements": {
    "cpu_request": "string",
    "cpu_limit": "string",
    "memory_request": "string",
    "memory_limit": "string"
  },
  "sourceCatalogId": "18102e7c-5340-4000-85f3-204ab7715801"
}
```

```bash
$ octavia get connection "Poke To PG"
{
  "connectionId": "c0c977c2-48e7-46fe-9f57-576285c26d42",
  "name": "Poke To PG",
  "namespaceDefinition": "source",
  "namespaceFormat": "${SOURCE_NAMESPACE}",
  "prefix": "string",
  "sourceId": "18102e7c-5340-4000-8eaa-4a86f844b101",
  "destinationId": "18102e7c-5340-4000-8e58-6bed49c24b01",
  "operationIds": [
    "18102e7c-5340-4000-8ef0-f35c05a49a01"
  ],
  "syncCatalog": {
    "streams": [
      {
        "stream": {
          "name": "string",
          "jsonSchema": {},
          "supportedSyncModes": [
            "full_refresh"
          ],
          "sourceDefinedCursor": false,
          "defaultCursorField": [
            "string"
          ],
          "sourceDefinedPrimaryKey": [
            [
              "string"
            ]
          ],
          "namespace": "string"
        },
        "config": {
          "syncMode": "full_refresh",
          "cursorField": [
            "string"
          ],
          "destinationSyncMode": "append",
          "primaryKey": [
            [
              "string"
            ]
          ],
          "aliasName": "string",
          "selected": false
        }
      }
    ]
  },
  "schedule": {
    "units": 0,
    "timeUnit": "minutes"
  },
  "status": "active",
  "resourceRequirements": {
    "cpu_request": "string",
    "cpu_limit": "string",
    "memory_request": "string",
    "memory_limit": "string"
  },
  "sourceCatalogId": "18102e7c-5340-4000-85f3-204ab7715801"
}
```

#### `octavia import all`

Import all existing resources (sources, destinations, connections) on your Airbyte instance to manage them with octavia-cli.

**Example**:

```bash
$ octavia import all
🐙 - Octavia is targetting your Airbyte instance running at http://localhost:8000 on workspace b06c6fbb-cadd-4c5c-bdbb-710add7dedb9.
✅ - Imported source poke in sources/poke/configuration.yaml. State stored in sources/poke/state_b06c6fbb-cadd-4c5c-bdbb-710add7dedb9.yaml
⚠️ - Please update any secrets stored in sources/poke/configuration.yaml
✅ - Imported destination Postgres in destinations/postgres/configuration.yaml. State stored in destinations/postgres/state_b06c6fbb-cadd-4c5c-bdbb-710add7dedb9.yaml
⚠️ - Please update any secrets stored in destinations/postgres/configuration.yaml
✅ - Imported connection poke-to-pg in connections/poke_to_pg/configuration.yaml. State stored in connections/poke_to_pg/state_b06c6fbb-cadd-4c5c-bdbb-710add7dedb9.yaml
```

You now have local configuration files for all the Airbyte resources that already existed.
You need to edit any secret values that exist in these configuration files, as secrets are not imported.
You can edit the configuration files and run `octavia apply` to continue managing them with octavia-cli.

#### `octavia import source <SOURCE_ID> or <SOURCE_NAME>`

Import an existing source to manage it with octavia-cli. You can use a source ID or name.

| **Argument**  | **Description**  |
| ------------- | ---------------- |
| `SOURCE_ID`   | The source id.   |
| `SOURCE_NAME` | The source name. |

**Example**:

```bash
$ octavia import source poke
🐙 - Octavia is targetting your Airbyte instance running at http://localhost:8000 on workspace 75658e4f-e5f0-4e35-be0c-bdad33226c94.
✅ - Imported source poke in sources/poke/configuration.yaml. State stored in sources/poke/state_75658e4f-e5f0-4e35-be0c-bdad33226c94.yaml
⚠️ - Please update any secrets stored in sources/poke/configuration.yaml
```

You now have a local configuration file for an Airbyte source that already existed.
You need to edit any secret values that exist in this configuration, as secrets are not imported.
You can edit the configuration and run `octavia apply` to continue managing it with octavia-cli.

#### `octavia import destination <DESTINATION_ID> or <DESTINATION_NAME>`

Import an existing destination to manage it with octavia-cli. You can use a destination ID or name.

| **Argument**       | **Description**       |
| ------------------ | --------------------- |
| `DESTINATION_ID`   | The destination id.   |
| `DESTINATION_NAME` | The destination name. |

**Example**:

```bash
$ octavia import destination pg
🐙 - Octavia is targetting your Airbyte instance running at http://localhost:8000 on workspace 75658e4f-e5f0-4e35-be0c-bdad33226c94.
✅ - Imported destination pg in destinations/pg/configuration.yaml. State stored in destinations/pg/state_75658e4f-e5f0-4e35-be0c-bdad33226c94.yaml
⚠️ - Please update any secrets stored in destinations/pg/configuration.yaml
```

You now have a local configuration file for an Airbyte destination that already existed.
You need to edit any secret values that exist in this configuration, as secrets are not imported.
You can edit the configuration and run `octavia apply` to continue managing it with octavia-cli.

#### `octavia import connection <CONNECTION_ID> or <CONNECTION_NAME>`

Import an existing connection to manage it with octavia-cli. You can use a connection ID or name.

| **Argument**      | **Description**      |
| ----------------- | -------------------- |
| `CONNECTION_ID`   | The connection id.   |
| `CONNECTION_NAME` | The connection name. |

**Example**:

```bash
$ octavia import connection poke-to-pg
🐙 - Octavia is targetting your Airbyte instance running at http://localhost:8000 on workspace 75658e4f-e5f0-4e35-be0c-bdad33226c94.
✅ - Imported connection poke-to-pg in connections/poke-to-pg/configuration.yaml. State stored in connections/poke-to-pg/state_75658e4f-e5f0-4e35-be0c-bdad33226c94.yaml
⚠️ - Please update any secrets stored in connections/poke-to-pg/configuration.yaml
```

You now have a local configuration file for an Airbyte connection that already existed.
**N.B.: You first need to import the source and destination used by the connection.**
You can edit the configuration and run `octavia apply` to continue managing it with octavia-cli.
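
Because of that ordering constraint, importing a connection end to end could look like this sketch, reusing the `poke`, `pg`, and `poke-to-pg` examples above:

```bash
# Import the source and the destination first, then the connection that links them.
octavia import source poke
octavia import destination pg
octavia import connection poke-to-pg
```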

#### `octavia generate source <DEFINITION_ID> <SOURCE_NAME>`

Generate a YAML configuration for a source.
The YAML file will be stored at `./sources/<source_name>/configuration.yaml`.

| **Argument**    | **Description**                                                                                |
| --------------- | ---------------------------------------------------------------------------------------------- |
| `DEFINITION_ID` | The source connector definition id. Can be retrieved using `octavia list connectors sources`.  |
| `SOURCE_NAME`   | The name you want to give to this source in Airbyte.                                           |

**Example**:

```bash
$ octavia generate source d8540a80-6120-485d-b7d6-272bca477d9b weather
✅ - Created the source template for weather in ./sources/weather/configuration.yaml.
```

#### `octavia generate destination <DEFINITION_ID> <DESTINATION_NAME>`

Generate a YAML configuration for a destination.
The YAML file will be stored at `./destinations/<destination_name>/configuration.yaml`.

| **Argument**       | **Description**                                                                                          |
| ------------------ | ---------------------------------------------------------------------------------------------------------- |
| `DEFINITION_ID`    | The destination connector definition id. Can be retrieved using `octavia list connectors destinations`.  |
| `DESTINATION_NAME` | The name you want to give to this destination in Airbyte.                                                |

**Example**:

```bash
$ octavia generate destination 25c5221d-dce2-4163-ade9-739ef790f503 my_db
✅ - Created the destination template for my_db in ./destinations/my_db/configuration.yaml.
```

#### `octavia generate connection --source <path-to-source-configuration.yaml> --destination <path-to-destination-configuration.yaml> <CONNECTION_NAME>`

Generate a YAML configuration for a connection.
The YAML file will be stored at `./connections/<connection_name>/configuration.yaml`.

| **Option**      | **Required** | **Description**                                                                             |
| --------------- | ------------ | --------------------------------------------------------------------------------------------- |
| `--source`      | Yes          | Path to the YAML configuration file of the source you want to create a connection from.     |
| `--destination` | Yes          | Path to the YAML configuration file of the destination you want to create a connection to.  |

| **Argument**      | **Description**                                           |
| ----------------- | ---------------------------------------------------------- |
| `CONNECTION_NAME` | The name you want to give to this connection in Airbyte.  |

**Example**:

```bash
$ octavia generate connection --source sources/weather/configuration.yaml --destination destinations/my_db/configuration.yaml weather_to_pg
✅ - Created the connection template for weather_to_pg in ./connections/weather_to_pg/configuration.yaml.
```

#### `octavia apply`

Create or update resources on your Airbyte instance according to the local configurations found in your octavia project directory.
If a resource was not found on your Airbyte instance, **apply** will **create** the remote resource.
If a resource was found on your Airbyte instance, **apply** will prompt you to validate the changes and will run an **update** of your resource.
Please note that if a secret field was updated in your configuration, **apply** will run this change without prompting.

| **Option** | **Required** | **Description**                                                     |
| ---------- | ------------ | -------------------------------------------------------------------- |
| `--file`   | No           | Path to the YAML configuration files you want to create or update.  |
| `--force`  | No           | Run the update without prompting for changes validation.            |
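
For example, to apply a single configuration without being prompted, the two options can be combined (a sketch reusing the `weather` source from the examples on this page):

```bash
# Apply only the weather source configuration, skipping the validation prompt.
octavia apply --file sources/weather/configuration.yaml --force
```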

**Example**:

```bash
$ octavia apply
🐙 - weather exists on your Airbyte instance, let's check if we need to update it!
👀 - Here's the computed diff (🚨 remind that diff on secret fields are not displayed):
E - Value of root['lat'] changed from "46.7603" to "45.7603".
❓ - Do you want to update weather? [y/N]: y
✍️ - Running update because a diff was detected between local and remote resource.
🎉 - Successfully updated weather on your Airbyte instance!
💾 - New state for weather stored at ./sources/weather/state_<workspace_id>.yaml.
🐙 - my_db exists on your Airbyte instance, let's check if we need to update it!
😴 - Did not update because no change detected.
🐙 - weather_to_pg exists on your Airbyte instance, let's check if we need to update it!
👀 - Here's the computed diff (🚨 remind that diff on secret fields are not displayed):
E - Value of root['schedule']['timeUnit'] changed from "days" to "hours".
❓ - Do you want to update weather_to_pg? [y/N]: y
✍️ - Running update because a diff was detected between local and remote resource.
🎉 - Successfully updated weather_to_pg on your Airbyte instance!
💾 - New state for weather_to_pg stored at ./connections/weather_to_pg/state_<workspace_id>.yaml.
```

## Contributing

1. Please sign up to [Airbyte's Slack workspace](https://slack.airbyte.io/) and join the `#octavia-cli` channel. We'll sync up community efforts there.
2. Pick an existing [GitHub issue](https://github.com/airbytehq/airbyte/issues?q=is%3Aopen+is%3Aissue+label%3Aarea%2Foctavia-cli) or **open** a new one to explain what you'd like to implement.
3. Assign the GitHub issue to yourself.
4. Fork Airbyte's repo, then code and test thoroughly.
5. Open a PR on our Airbyte repo from your fork.

### Developing locally

0. Build the project locally (from the root of Airbyte's repo): `SUB_BUILD=OCTAVIA_CLI ./gradlew build`.
1. Install Python 3.8.12. We suggest doing it through `pyenv`.
2. Create a virtualenv: `python -m venv .venv`.
3. Activate the virtualenv: `source .venv/bin/activate`.
4. Install dev dependencies: `pip install -e .\[tests\]`.
5. Install `pre-commit` hooks: `pre-commit install`.
6. Run the unittest suite: `pytest --cov=octavia_cli`. Note: a local version of Airbyte needs to be running (e.g. `docker compose up` from the root directory of the project).
7. Make sure the build passes (step 0) before opening a PR. These steps are consolidated in the sketch below.
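
A consolidated sketch of the setup, assuming `pyenv` is installed and the commands are run from the `octavia-cli` directory of the repo:

```bash
# Step 0: build the project from the root of the Airbyte repo.
(cd .. && SUB_BUILD=OCTAVIA_CLI ./gradlew build)

# Steps 1-3: set up a Python 3.8.12 virtualenv.
pyenv install --skip-existing 3.8.12 && pyenv local 3.8.12
python -m venv .venv
source .venv/bin/activate

# Steps 4-6: install dev dependencies and hooks, then run the tests.
pip install -e .\[tests\]
pre-commit install
pytest --cov=octavia_cli
```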

## Telemetry

This CLI has telemetry tooling that sends Airbyte some data about how the tool is used.
We will use this data to improve the CLI and measure its adoption.
The telemetry sends data about:

- Which command was run (not the arguments or options used).
- Success or failure of the command run and the error type (not the error payload).
- The current Airbyte workspace id if the user has not set the _anonymous data collection_ on their Airbyte instance.

You can disable telemetry by setting the `OCTAVIA_ENABLE_TELEMETRY` environment variable to `False` or using the `--disable-telemetry` flag.
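
Both approaches, sketched with a documented read-only command:

```bash
# Disable telemetry for the whole shell session via the environment variable...
export OCTAVIA_ENABLE_TELEMETRY=False

# ...or for a single invocation via the flag.
octavia --disable-telemetry list connectors sources
```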
@@ -564,10 +564,6 @@ module.exports = {
      type: "doc",
      id: "terraform-documentation",
    },
    {
      type: "doc",
      id: "cli-documentation",
    },
    understandingAirbyte,
    contributeToAirbyte,
    {
@@ -1,3 +0,0 @@
[report]
# show lines missing coverage
show_missing = true
@@ -1,4 +0,0 @@
build
!build/airbyte_api_client
.venv
octavia_cli.egg-info
octavia-cli/.gitignore (vendored, 4 changed lines)
@@ -1,4 +0,0 @@
.coverage
.venv
state_*.yaml
dist
@@ -1 +0,0 @@
3.9
@@ -1,18 +0,0 @@
FROM python:3.9-slim as base

RUN apt-get upgrade \
    && pip install --upgrade pip

WORKDIR /home/octavia-cli
COPY . ./

RUN pip install --no-cache-dir .

RUN useradd --create-home --shell /bin/bash octavia-cli
USER octavia-cli

WORKDIR /home/octavia-project
ENTRYPOINT ["octavia"]

LABEL io.airbyte.version=0.50.0
LABEL io.airbyte.name=airbyte/octavia-cli
@@ -1,21 +0,0 @@
MIT License

Copyright (c) 2020 Airbyte, Inc.

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
@@ -1,725 +0,0 @@
# 🐙 Octavia CLI

## Disclaimer

The project is in **alpha** version.
Readers can refer to our [opened GitHub issues](https://github.com/airbytehq/airbyte/issues?q=is%3Aopen+is%3Aissue+label%3Aarea%2Foctavia-cli) to check the ongoing work on this project.
## What is `octavia` CLI?
|
||||
|
||||
Octavia CLI is a tool to manage Airbyte configurations in YAML.
|
||||
It has the following features:
|
||||
|
||||
- Scaffolding of a readable directory architecture that will host the YAML configs (`octavia init`).
|
||||
- Auto-generation of YAML config file that matches the resources' schemas (`octavia generate`).
|
||||
- Manage Airbyte resources with YAML config files.
|
||||
- Safe resources update through diff display and validation (`octavia apply`).
|
||||
- Simple secret management to avoid versioning credentials.
|
||||
|
||||
## Why should I use `octavia` CLI?
|
||||
|
||||
A CLI provides freedom to users to use the tool in whatever context and use case they have.
|
||||
These are non-exhaustive use cases `octavia` can be convenient for:
|
||||
|
||||
- Managing Airbyte configurations with a CLI instead of a web UI.
|
||||
- Versioning Airbyte configurations in Git.
|
||||
- Updating of Airbyte configurations in an automated deployment pipeline.
|
||||
- Integrating the Airbyte configuration deployment in a dev ops tooling stack: Helm, Ansible etc.
|
||||
- Streamlining the deployment of Airbyte configurations to multiple Airbyte instance.
|
||||
|
||||
Feel free to share your use cases with the community in [#octavia-cli](https://airbytehq.slack.com/archives/C02RRUG9CP5) or on [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions).
|
||||
|
||||
## Table of content
|
||||
|
||||
- [Workflow](#workflow)
|
||||
- [Secret management](#secret-management)
|
||||
- [Install](#install)
|
||||
- [Commands reference](#commands-reference)
|
||||
- [Contributing](#contributing)
|
||||
- [Telemetry](#telemetry)
|
||||
- [Changelog](#changelog)
|
||||
|
||||
## Workflow
|
||||
|
||||
### 1. Generate local YAML files for sources or destinations
|
||||
|
||||
1. Retrieve the _definition id_ of the connector you want to use using `octavia list` command.
|
||||
2. Generate YAML configuration running `octavia generate source <DEFINITION_ID> <SOURCE_NAME>` or `octavia generate destination <DEFINITION_ID> <DESTINATION_NAME>`.
|
||||
|
||||
### 2. Edit your local YAML configurations
|
||||
|
||||
1. Edit the generated YAML configurations according to your need.
|
||||
2. Use the [secret management feature](#secret-management) feature to avoid storing credentials in the YAML files.
|
||||
|
||||
### 3. Create the declared sources or destinations on your Airbyte instance
|
||||
|
||||
1. Run `octavia apply` to create the **sources** and **destinations**
|
||||
|
||||
### 4. Generate connections
|
||||
|
||||
1. Run `octavia octavia generate connection --source <PATH_TO_SOURCE_CONFIG> --destination <PATH_TO_DESTINATION_CONFIG> <CONNECTION_NAME>` to create a YAML configuration for a new connection.
|
||||
2. Edit the created configuration file according to your need: change the scheduling or the replicated streams list.
|
||||
|
||||
### 5. Create the declared connections
|
||||
|
||||
1. Run `octavia apply` to create the newly declared connection on your Airbyte instance.
|
||||
|
||||
### 6. Update your configurations
|
||||
|
||||
Changes in your local configurations can be propagated to your Airbyte instance using `octavia apply`. You will be prompted for validation of changes. You can bypass the validation step using the `--force` flag.
|
||||
|
||||
## Secret management
|
||||
|
||||
Sources and destinations configurations have credential fields that you **do not want to store as plain text in your VCS**.
|
||||
`octavia` offers secret management through environment variables expansion:
|
||||
|
||||
```yaml
|
||||
configuration:
|
||||
password: ${MY_PASSWORD}
|
||||
```
|
||||
|
||||
If you have set a `MY_PASSWORD` environment variable, `octavia apply` will load its value into the `password` field.
|
||||
|
||||
## Install
|
||||
|
||||
### Requirements
|
||||
|
||||
We decided to package the CLI in a docker image with portability in mind.
|
||||
**[Please install and run Docker if you are not](https://docs.docker.com/get-docker/)**.
|
||||
|
||||
### As a command available in your bash profile
|
||||
|
||||
```bash
|
||||
curl -s -o- https://raw.githubusercontent.com/airbytehq/airbyte/master/octavia-cli/install.sh | bash
|
||||
```
|
||||
|
||||
This script:
|
||||
|
||||
1. Pulls the [octavia-cli image](https://hub.docker.com/r/airbyte/octavia-cli/tags) from our Docker registry.
|
||||
2. Creates an `octavia` alias in your profile.
|
||||
3. Creates a `~/.octavia` file whose values are mapped to the octavia container's environment variables.
|
||||
|
||||
### Using `docker run`
|
||||
|
||||
```bash
|
||||
touch ~/.octavia # Create a file to store env variables that will be mapped the octavia-cli container
|
||||
mkdir my_octavia_project_directory # Create your octavia project directory where YAML configurations will be stored.
|
||||
docker run --name octavia-cli -i --rm -v my_octavia_project_directory:/home/octavia-project --network host --user $(id -u):$(id -g) --env-file ~/.octavia airbyte/octavia-cli:0.50.0
|
||||
```
|
||||
|
||||
### Using `docker-compose`
|
||||
|
||||
Using octavia in docker-compose could be convenient for automatic `apply` on start-up.
|
||||
|
||||
Add another entry in the services key of your Airbyte `docker-compose.yml`
|
||||
|
||||
```yaml
|
||||
services:
|
||||
# . . .
|
||||
octavia-cli:
|
||||
image: airbyte/octavia-cli:latest
|
||||
command: apply --force
|
||||
env_file:
|
||||
- ~/.octavia # Use a local env file to store variables that will be mapped the octavia-cli container
|
||||
volumes:
|
||||
- <path_to_your_local_octavia_project_directory>:/home/octavia-project
|
||||
depends_on:
|
||||
- webapp
|
||||
```
|
||||
|
||||
Other commands besides `apply` can be run like so:
|
||||
|
||||
```bash
|
||||
docker compose run octavia-cli <command>`
|
||||
```
|
||||
|
||||
## Commands reference
|
||||
|
||||
### `octavia` command flags
|
||||
|
||||
| **Flag** | **Description** | **Env Variable** | **Default** |
|
||||
| ---------------------------------------- | --------------------------------------------------------------------------------- |----------------------------| ------------------------------------------------------ |
|
||||
| `--airbyte-url` | Airbyte instance URL. | `AIRBYTE_URL` | `http://localhost:8000` |
|
||||
| `--airbyte-username` | Airbyte instance username (basic auth). | `AIRBYTE_USERNAME` | `airbyte` |
|
||||
| `--airbyte-password` | Airbyte instance password (basic auth). | `AIRBYTE_PASSWORD` | `password` |
|
||||
| `--workspace-id` | Airbyte workspace id. | `AIRBYTE_WORKSPACE_ID` | The first workspace id found on your Airbyte instance. |
|
||||
| `--enable-telemetry/--disable-telemetry` | Enable or disable the sending of telemetry data. | `OCTAVIA_ENABLE_TELEMETRY` | True |
|
||||
| `--api-http-header` | HTTP Header value pairs passed while calling Airbyte's API | None | None |
|
||||
| `--api-http-headers-file-path` | Path to the YAML file that contains custom HTTP Headers to send to Airbyte's API. | None | None |
|
||||
|
||||
#### Using custom HTTP headers
|
||||
|
||||
You can set custom HTTP headers to send to Airbyte's API with options:
|
||||
|
||||
```bash
|
||||
octavia --api-http-header Header-Name Header-Value --api-http-header Header-Name-2 Header-Value-2 list connectors sources
|
||||
```
|
||||
|
||||
You can also use a custom YAML file (one is already created on init in `api_http_headers.yaml`) to declare the HTTP headers to send to the API:
|
||||
|
||||
```yaml
|
||||
headers:
|
||||
Authorization: Bearer my-secret-token
|
||||
User-Agent: octavia-cli/0.0.0
|
||||
```
|
||||
|
||||
Environment variable expansion is available in this Yaml file
|
||||
|
||||
```yaml
|
||||
headers:
|
||||
Authorization: Bearer ${MY_API_TOKEN}
|
||||
```
|
||||
|
||||
**Options based headers are overriding file based headers if an header is declared in both.**
|
||||
|
||||
### `octavia` subcommands
|
||||
|
||||
| **Command** | **Usage** |
|
||||
| ----------------------------------------- | ------------------------------------------------------------------------------------------ |
|
||||
| **`octavia init`** | Initialize required directories for the project. |
|
||||
| **`octavia list connectors sources`** | List all sources connectors available on the remote Airbyte instance. |
|
||||
| **`octavia list connectors destinations`** | List all destinations connectors available on the remote Airbyte instance. |
|
||||
| **`octavia list workspace sources`** | List existing sources in current the Airbyte workspace. |
|
||||
| **`octavia list workspace destinations`** | List existing destinations in the current Airbyte workspace. |
|
||||
| **`octavia list workspace connections`** | List existing connections in the current Airbyte workspace. |
|
||||
| **`octavia get source`** | Get the JSON representation of an existing source in current the Airbyte workspace. |
|
||||
| **`octavia get destination`** | Get the JSON representation of an existing destination in the current Airbyte workspace. |
|
||||
| **`octavia get connection`** | Get the JSON representation of an existing connection in the current Airbyte workspace. |
|
||||
| **`octavia import all`** | Import all existing sources, destinations and connections to manage them with octavia-cli. |
|
||||
| **`octavia import source`** | Import an existing source to manage it with octavia-cli. |
|
||||
| **`octavia import destination`** | Import an existing destination to manage it with octavia-cli. |
|
||||
| **`octavia import connection`** | Import an existing connection to manage it with octavia-cli. |
|
||||
| **`octavia generate source`** | Generate a local YAML configuration for a new source. |
|
||||
| **`octavia generate destination`** | Generate a local YAML configuration for a new destination. |
|
||||
| **`octavia generate connection`** | Generate a local YAML configuration for a new connection. |
|
||||
| **`octavia apply`** | Create or update Airbyte remote resources according to local YAML configurations. |
|
||||
|
||||
#### `octavia init`
|
||||
|
||||
The `octavia init` commands scaffolds the required directory architecture for running `octavia generate` and `octavia apply` commands.
|
||||
|
||||
**Example**:
|
||||
|
||||
```bash
|
||||
$ mkdir my_octavia_project && cd my_octavia_project
|
||||
$ octavia init
|
||||
🐙 - Octavia is targetting your Airbyte instance running at http://localhost:8000 on workspace e1f46f7d-5354-4200-aed6-7816015ca54b.
|
||||
🐙 - Project is not yet initialized.
|
||||
🔨 - Initializing the project.
|
||||
✅ - Created the following directories: sources, destinations, connections.
|
||||
$ ls
|
||||
connections destinations sources
|
||||
```
|
||||
|
||||
#### `octavia list connectors sources`
|
||||
|
||||
List all the source connectors currently available on your Airbyte instance.
|
||||
|
||||
**Example**:
|
||||
|
||||
```bash
|
||||
$ octavia list connectors sources
|
||||
NAME DOCKER REPOSITORY DOCKER IMAGE TAG SOURCE DEFINITION ID
|
||||
Airtable airbyte/source-airtable 0.1.1 14c6e7ea-97ed-4f5e-a7b5-25e9a80b8212
|
||||
AWS CloudTrail airbyte/source-aws-cloudtrail 0.1.4 6ff047c0-f5d5-4ce5-8c81-204a830fa7e1
|
||||
Amazon Ads airbyte/source-amazon-ads 0.1.3 c6b0a29e-1da9-4512-9002-7bfd0cba2246
|
||||
Amazon Seller Partner airbyte/source-amazon-seller-partner 0.2.16 e55879a8-0ef8-4557-abcf-ab34c53ec460
|
||||
```
|
||||
|
||||
#### `octavia list connectors destinations`
|
||||
|
||||
List all the destinations connectors currently available on your Airbyte instance.
|
||||
|
||||
**Example**:
|
||||
|
||||
```bash
|
||||
$ octavia list connectors destinations
|
||||
NAME DOCKER REPOSITORY DOCKER IMAGE TAG DESTINATION DEFINITION ID
|
||||
Azure Blob Storage airbyte/destination-azure-blob-storage 0.1.3 b4c5d105-31fd-4817-96b6-cb923bfc04cb
|
||||
Amazon SQS airbyte/destination-amazon-sqs 0.1.0 0eeee7fb-518f-4045-bacc-9619e31c43ea
|
||||
BigQuery airbyte/destination-bigquery 0.6.11 22f6c74f-5699-40ff-833c-4a879ea40133
|
||||
BigQuery (denormalized typed struct) airbyte/destination-bigquery-denormalized 0.2.10 079d5540-f236-4294-ba7c-ade8fd918496
|
||||
```
|
||||
|
||||
#### `octavia list workspace sources`
|
||||
|
||||
List all the sources existing on your targeted Airbyte instance.
|
||||
|
||||
**Example**:
|
||||
|
||||
```bash
|
||||
$ octavia list workspace sources
|
||||
NAME SOURCE NAME SOURCE ID
|
||||
weather OpenWeather c4aa8550-2122-4a33-9a21-adbfaa638544
|
||||
```
|
||||
|
||||
#### `octavia list workspace destinations`
|
||||
|
||||
List all the destinations existing on your targeted Airbyte instance.
|
||||
|
||||
**Example**:
|
||||
|
||||
```bash
|
||||
$ octavia list workspace destinations
|
||||
NAME DESTINATION NAME DESTINATION ID
|
||||
my_db Postgres c0c977c2-48e7-46fe-9f57-576285c26d42
|
||||
```
|
||||
|
||||
#### `octavia list workspace connections`
|
||||
|
||||
List all the connections existing on your targeted Airbyte instance.
|
||||
|
||||
**Example**:
|
||||
|
||||
```bash
|
||||
$ octavia list workspace connections
|
||||
NAME CONNECTION ID STATUS SOURCE ID DESTINATION ID
|
||||
weather_to_pg a4491317-153e-436f-b646-0b39338f9aab active c4aa8550-2122-4a33-9a21-adbfaa638544 c0c977c2-48e7-46fe-9f57-576285c26d42
|
||||
```
|
||||
|
||||
#### `octavia get source <SOURCE_ID> or <SOURCE_NAME>`
|
||||
|
||||
Get an existing source in current the Airbyte workspace. You can use a source ID or name.
|
||||
|
||||
| **Argument** | **Description** |
|
||||
| ------------- | ---------------- |
|
||||
| `SOURCE_ID` | The source id. |
|
||||
| `SOURCE_NAME` | The source name. |
|
||||
|
||||
**Examples**:
|
||||
|
||||
```bash
|
||||
$ octavia get source c0c977c2-48e7-46fe-9f57-576285c26d42
|
||||
{'connection_configuration': {'key': '**********',
|
||||
'start_date': '2010-01-01T00:00:00.000Z',
|
||||
'token': '**********'},
|
||||
'name': 'Pokemon',
|
||||
'source_definition_id': 'b08e4776-d1de-4e80-ab5c-1e51dad934a2',
|
||||
'source_id': 'c0c977c2-48e7-46fe-9f57-576285c26d42',
|
||||
'source_name': 'My Poke',
|
||||
'workspace_id': 'c4aa8550-2122-4a33-9a21-adbfaa638544'}
|
||||
```
|
||||
|
||||
```bash
|
||||
$ octavia get source "My Poke"
|
||||
{'connection_configuration': {'key': '**********',
|
||||
'start_date': '2010-01-01T00:00:00.000Z',
|
||||
'token': '**********'},
|
||||
'name': 'Pokemon',
|
||||
'source_definition_id': 'b08e4776-d1de-4e80-ab5c-1e51dad934a2',
|
||||
'source_id': 'c0c977c2-48e7-46fe-9f57-576285c26d42',
|
||||
'source_name': 'My Poke',
|
||||
'workspace_id': 'c4aa8550-2122-4a33-9a21-adbfaa638544'}
|
||||
```
|
||||
|
||||
#### `octavia get destination <DESTINATION_ID> or <DESTINATION_NAME>`
|
||||
|
||||
Get an existing destination in current the Airbyte workspace. You can use a destination ID or name.
|
||||
|
||||
| **Argument** | **Description** |
|
||||
| ------------------ | --------------------- |
|
||||
| `DESTINATION_ID` | The destination id. |
|
||||
| `DESTINATION_NAME` | The destination name. |
|
||||
|
||||
**Examples**:
|
||||
|
||||
```bash
|
||||
$ octavia get destination c0c977c2-48e7-46fe-9f57-576285c26d42
|
||||
{
|
||||
"destinationDefinitionId": "c0c977c2-48e7-46fe-9f57-576285c26d42",
|
||||
"destinationId": "18102e7c-5160-4000-841b-15e8ec48c301",
|
||||
"workspaceId": "18102e7c-5160-4000-883a-30bc7cd65601",
|
||||
"connectionConfiguration": {
|
||||
"user": "charles"
|
||||
},
|
||||
"name": "pg",
|
||||
"destinationName": "Postgres"
|
||||
}
|
||||
```
|
||||
|
||||
```bash
|
||||
$ octavia get destination pg
|
||||
{
|
||||
"destinationDefinitionId": "18102e7c-5160-4000-821f-4d7cfdf87201",
|
||||
"destinationId": "18102e7c-5160-4000-841b-15e8ec48c301",
|
||||
"workspaceId": "18102e7c-5160-4000-883a-30bc7cd65601",
|
||||
"connectionConfiguration": {
|
||||
"user": "charles"
|
||||
},
|
||||
"name": "string",
|
||||
"destinationName": "string"
|
||||
}
|
||||
```
|
||||
|
||||
#### `octavia get connection <CONNECTION_ID> or <CONNECTION_NAME>`
|
||||
|
||||
Get an existing connection in current the Airbyte workspace. You can use a connection ID or name.
|
||||
|
||||
| **Argument** | **Description** |
|
||||
| ----------------- | -------------------- |
|
||||
| `CONNECTION_ID` | The connection id. |
|
||||
| `CONNECTION_NAME` | The connection name. |
|
||||
|
||||
**Example**:
|
||||
|
||||
```bash
|
||||
$ octavia get connection c0c977c2-48e7-46fe-9f57-576285c26d42
|
||||
{
|
||||
"connectionId": "c0c977c2-48e7-46fe-9f57-576285c26d42",
|
||||
"name": "Poke To PG",
|
||||
"namespaceDefinition": "source",
|
||||
"namespaceFormat": "${SOURCE_NAMESPACE}",
|
||||
"prefix": "string",
|
||||
"sourceId": "18102e7c-5340-4000-8eaa-4a86f844b101",
|
||||
"destinationId": "18102e7c-5340-4000-8e58-6bed49c24b01",
|
||||
"operationIds": [
|
||||
"18102e7c-5340-4000-8ef0-f35c05a49a01"
|
||||
],
|
||||
"syncCatalog": {
|
||||
"streams": [
|
||||
{
|
||||
"stream": {
|
||||
"name": "string",
|
||||
"jsonSchema": {},
|
||||
"supportedSyncModes": [
|
||||
"full_refresh"
|
||||
],
|
||||
"sourceDefinedCursor": false,
|
||||
"defaultCursorField": [
|
||||
"string"
|
||||
],
|
||||
"sourceDefinedPrimaryKey": [
|
||||
[
|
||||
"string"
|
||||
]
|
||||
],
|
||||
"namespace": "string"
|
||||
},
|
||||
"config": {
|
||||
"syncMode": "full_refresh",
|
||||
"cursorField": [
|
||||
"string"
|
||||
],
|
||||
"destinationSyncMode": "append",
|
||||
"primaryKey": [
|
||||
[
|
||||
"string"
|
||||
]
|
||||
],
|
||||
"aliasName": "string",
|
||||
"selected": false
|
||||
}
|
||||
}
|
||||
]
|
||||
},
|
||||
"schedule": {
|
||||
"units": 0,
|
||||
"timeUnit": "minutes"
|
||||
},
|
||||
"status": "active",
|
||||
"resourceRequirements": {
|
||||
"cpu_request": "string",
|
||||
"cpu_limit": "string",
|
||||
"memory_request": "string",
|
||||
"memory_limit": "string"
|
||||
},
|
||||
"sourceCatalogId": "18102e7c-5340-4000-85f3-204ab7715801"
|
||||
}
|
||||
```
|
||||
|
||||
```bash
|
||||
$ octavia get connection "Poke To PG"
|
||||
{
|
||||
"connectionId": "c0c977c2-48e7-46fe-9f57-576285c26d42",
|
||||
"name": "Poke To PG",
|
||||
"namespaceDefinition": "source",
|
||||
"namespaceFormat": "${SOURCE_NAMESPACE}",
|
||||
"prefix": "string",
|
||||
"sourceId": "18102e7c-5340-4000-8eaa-4a86f844b101",
|
||||
"destinationId": "18102e7c-5340-4000-8e58-6bed49c24b01",
|
||||
"operationIds": [
|
||||
"18102e7c-5340-4000-8ef0-f35c05a49a01"
|
||||
],
|
||||
"syncCatalog": {
|
||||
"streams": [
|
||||
{
|
||||
"stream": {
|
||||
"name": "string",
|
||||
"jsonSchema": {},
|
||||
"supportedSyncModes": [
|
||||
"full_refresh"
|
||||
],
|
||||
"sourceDefinedCursor": false,
|
||||
"defaultCursorField": [
|
||||
"string"
|
||||
],
|
||||
"sourceDefinedPrimaryKey": [
|
||||
[
|
||||
"string"
|
||||
]
|
||||
],
|
||||
"namespace": "string"
|
||||
},
|
||||
"config": {
|
||||
"syncMode": "full_refresh",
|
||||
"cursorField": [
|
||||
"string"
|
||||
],
|
||||
"destinationSyncMode": "append",
|
||||
"primaryKey": [
|
||||
[
|
||||
"string"
|
||||
]
|
||||
],
|
||||
"aliasName": "string",
|
||||
"selected": false
|
||||
}
|
||||
}
|
||||
]
|
||||
},
|
||||
"schedule": {
|
||||
"units": 0,
|
||||
"timeUnit": "minutes"
|
||||
},
|
||||
"status": "active",
|
||||
"resourceRequirements": {
|
||||
"cpu_request": "string",
|
||||
"cpu_limit": "string",
|
||||
"memory_request": "string",
|
||||
"memory_limit": "string"
|
||||
},
|
||||
"sourceCatalogId": "18102e7c-5340-4000-85f3-204ab7715801"
|
||||
}
|
||||
```
|
||||
|
||||
#### `octavia import all`

Import all the existing resources (sources, destinations, connections) from your Airbyte instance so you can manage them with octavia-cli.

**Examples**:

```bash
$ octavia import all
🐙 - Octavia is targetting your Airbyte instance running at http://localhost:8000 on workspace b06c6fbb-cadd-4c5c-bdbb-710add7dedb9.
✅ - Imported source poke in sources/poke/configuration.yaml. State stored in sources/poke/state_b06c6fbb-cadd-4c5c-bdbb-710add7dedb9.yaml
⚠️ - Please update any secrets stored in sources/poke/configuration.yaml
✅ - Imported destination Postgres in destinations/postgres/configuration.yaml. State stored in destinations/postgres/state_b06c6fbb-cadd-4c5c-bdbb-710add7dedb9.yaml
⚠️ - Please update any secrets stored in destinations/postgres/configuration.yaml
✅ - Imported connection poke-to-pg in connections/poke_to_pg/configuration.yaml. State stored in connections/poke_to_pg/state_b06c6fbb-cadd-4c5c-bdbb-710add7dedb9.yaml
```

You now have local configuration files for all the Airbyte resources that already existed on your instance.
Secrets are not imported, so you must edit any secret values in these configuration files; a sketch of this workflow follows below.
You can then edit the configuration files and run `octavia apply` to continue managing them with octavia-cli.

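A minimal sketch of that secret-update workflow, assuming the imported destination configuration references a `${POSTGRES_PASSWORD}` environment variable for its secret field (the convention used by the configuration templates shown later in this document):

```bash
# Hypothetical post-import workflow: provide the secret through an environment
# variable referenced as ${POSTGRES_PASSWORD} in the imported configuration...
export POSTGRES_PASSWORD="<your-database-password>"

# ...then re-apply only the edited configuration file.
octavia apply --file destinations/postgres/configuration.yaml
```
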
#### `octavia import source <SOURCE_ID> or <SOURCE_NAME>`

Import an existing source to manage it with octavia-cli. You can use the source ID or name.

| **Argument**  | **Description**  |
| ------------- | ---------------- |
| `SOURCE_ID`   | The source id.   |
| `SOURCE_NAME` | The source name. |

**Examples**:

```bash
$ octavia import source poke
🐙 - Octavia is targetting your Airbyte instance running at http://localhost:8000 on workspace 75658e4f-e5f0-4e35-be0c-bdad33226c94.
✅ - Imported source poke in sources/poke/configuration.yaml. State stored in sources/poke/state_75658e4f-e5f0-4e35-be0c-bdad33226c94.yaml
⚠️ - Please update any secrets stored in sources/poke/configuration.yaml
```

You now have a local configuration file for an Airbyte source that already existed on your instance.
Secrets are not imported, so you must edit any secret values in this configuration file.
You can then edit the configuration and run `octavia apply` to continue managing it with octavia-cli.

#### `octavia import destination <DESTINATION_ID> or <DESTINATION_NAME>`

Import an existing destination to manage it with octavia-cli. You can use the destination ID or name.

| **Argument**       | **Description**       |
| ------------------ | --------------------- |
| `DESTINATION_ID`   | The destination id.   |
| `DESTINATION_NAME` | The destination name. |

**Examples**:

```bash
$ octavia import destination pg
🐙 - Octavia is targetting your Airbyte instance running at http://localhost:8000 on workspace 75658e4f-e5f0-4e35-be0c-bdad33226c94.
✅ - Imported destination pg in destinations/pg/configuration.yaml. State stored in destinations/pg/state_75658e4f-e5f0-4e35-be0c-bdad33226c94.yaml
⚠️ - Please update any secrets stored in destinations/pg/configuration.yaml
```

You now have a local configuration file for an Airbyte destination that already existed on your instance.
Secrets are not imported, so you must edit any secret values in this configuration file.
You can then edit the configuration and run `octavia apply` to continue managing it with octavia-cli.

#### `octavia import connection <CONNECTION_ID> or <CONNECTION_NAME>`

Import an existing connection to manage it with octavia-cli. You can use the connection ID or name.

| **Argument**      | **Description**      |
| ----------------- | -------------------- |
| `CONNECTION_ID`   | The connection id.   |
| `CONNECTION_NAME` | The connection name. |

**Examples**:

```bash
$ octavia import connection poke-to-pg
🐙 - Octavia is targetting your Airbyte instance running at http://localhost:8000 on workspace 75658e4f-e5f0-4e35-be0c-bdad33226c94.
✅ - Imported connection poke-to-pg in connections/poke-to-pg/configuration.yaml. State stored in connections/poke-to-pg/state_75658e4f-e5f0-4e35-be0c-bdad33226c94.yaml
⚠️ - Please update any secrets stored in connections/poke-to-pg/configuration.yaml
```

You now have a local configuration file for an Airbyte connection that already existed on your instance.
**N.B.: You first need to import the source and destination used by the connection; see the sketch below.**
You can then edit the configuration and run `octavia apply` to continue managing it with octavia-cli.

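For instance, reusing the resource names from the examples above, a full import in dependency order looks like this:

```bash
# A connection references a source and a destination, so import those first.
octavia import source poke
octavia import destination pg

# Only then can the connection linking them be imported.
octavia import connection poke-to-pg
```
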
#### `octavia generate source <DEFINITION_ID> <SOURCE_NAME>`

Generate a YAML configuration for a source.
The YAML file will be stored at `./sources/<source_name>/configuration.yaml`.

| **Argument**    | **Description**                                                                               |
| --------------- | --------------------------------------------------------------------------------------------- |
| `DEFINITION_ID` | The source connector definition id. Can be retrieved using `octavia list connectors sources`. |
| `SOURCE_NAME`   | The name you want to give to this source in Airbyte.                                          |

**Example**:

```bash
$ octavia generate source d8540a80-6120-485d-b7d6-272bca477d9b weather
✅ - Created the source template for weather in ./sources/weather/configuration.yaml.
```

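If you don't have the definition id at hand, you can look it up with the listing command referenced in the table above; a small sketch, assuming the listing is plain text you can filter with standard shell tools:

```bash
# List the available source connector definitions and filter for the one you want.
octavia list connectors sources | grep -i weather
```
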
#### `octavia generate destination <DEFINITION_ID> <DESTINATION_NAME>`

Generate a YAML configuration for a destination.
The YAML file will be stored at `./destinations/<destination_name>/configuration.yaml`.

| **Argument**       | **Description**                                                                                         |
| ------------------ | -------------------------------------------------------------------------------------------------------- |
| `DEFINITION_ID`    | The destination connector definition id. Can be retrieved using `octavia list connectors destinations`.   |
| `DESTINATION_NAME` | The name you want to give to this destination in Airbyte.                                                 |

**Example**:

```bash
$ octavia generate destination 25c5221d-dce2-4163-ade9-739ef790f503 my_db
✅ - Created the destination template for my_db in ./destinations/my_db/configuration.yaml.
```

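The destination definition id can be looked up the same way; again assuming plain-text output:

```bash
# List the available destination connector definitions and filter for Postgres.
octavia list connectors destinations | grep -i postgres
```
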
#### `octavia generate connection --source <path-to-source-configuration.yaml> --destination <path-to-destination-configuration.yaml> <CONNECTION_NAME>`

Generate a YAML configuration for a connection.
The YAML file will be stored at `./connections/<connection_name>/configuration.yaml`.

| **Option**      | **Required** | **Description**                                                                            |
| --------------- | ------------ | ------------------------------------------------------------------------------------------- |
| `--source`      | Yes          | Path to the YAML configuration file of the source you want to create a connection from.     |
| `--destination` | Yes          | Path to the YAML configuration file of the destination you want to create a connection to.  |

| **Argument**      | **Description**                                           |
| ----------------- | --------------------------------------------------------- |
| `CONNECTION_NAME` | The name you want to give to this connection in Airbyte.  |

**Example**:

```bash
$ octavia generate connection --source sources/weather/configuration.yaml --destination destinations/my_db/configuration.yaml weather_to_pg
✅ - Created the connection template for weather_to_pg in ./connections/weather_to_pg/configuration.yaml.
```

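Once the template is generated and edited, creating the connection on your instance is a single `octavia apply` call; a short sketch using the `--file` option documented in the next section:

```bash
# Create the generated connection on the Airbyte instance (the --file option
# is documented in the `octavia apply` section below).
octavia apply --file connections/weather_to_pg/configuration.yaml
```
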
#### `octavia apply`

Create or update resources on your Airbyte instance according to the local configurations found in your octavia project directory.
If a resource is not found on your Airbyte instance, **apply** will **create** the remote resource.
If a resource is found on your Airbyte instance, **apply** will prompt you to validate the changes and will then run an **update** of your resource.
Please note that if a secret field was updated in your configuration, **apply** will run this change without prompting.

| **Option** | **Required** | **Description**                                                     |
| ---------- | ------------ | ------------------------------------------------------------------- |
| `--file`   | No           | Path to the YAML configuration files you want to create or update.  |
| `--force`  | No           | Run the update without prompting for changes validation.            |

**Example**:

```bash
$ octavia apply
🐙 - weather exists on your Airbyte instance, let's check if we need to update it!
👀 - Here's the computed diff (🚨 remind that diff on secret fields are not displayed):
E - Value of root['lat'] changed from "46.7603" to "45.7603".
❓ - Do you want to update weather? [y/N]: y
✍️ - Running update because a diff was detected between local and remote resource.
🎉 - Successfully updated weather on your Airbyte instance!
💾 - New state for weather stored at ./sources/weather/state_<workspace_id>.yaml.
🐙 - my_db exists on your Airbyte instance, let's check if we need to update it!
😴 - Did not update because no change detected.
🐙 - weather_to_pg exists on your Airbyte instance, let's check if we need to update it!
👀 - Here's the computed diff (🚨 remind that diff on secret fields are not displayed):
E - Value of root['schedule']['timeUnit'] changed from "days" to "hours".
❓ - Do you want to update weather_to_pg? [y/N]: y
✍️ - Running update because a diff was detected between local and remote resource.
🎉 - Successfully updated weather_to_pg on your Airbyte instance!
💾 - New state for weather_to_pg stored at ./connections/weather_to_pg/state_<workspace_id>.yaml.
```

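Both options from the table above can be combined; a short sketch, reusing the weather source generated earlier:

```bash
# Apply a single configuration file and skip the interactive diff validation.
octavia apply --file sources/weather/configuration.yaml --force
```
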
## Contributing

1. Please sign up to [Airbyte's Slack workspace](https://slack.airbyte.io/) and join the `#octavia-cli` channel. We'll sync up community efforts in this channel.
2. Pick an existing [GitHub issue](https://github.com/airbytehq/airbyte/issues?q=is%3Aopen+is%3Aissue+label%3Aarea%2Foctavia-cli) or **open** a new one to explain what you'd like to implement.
3. Assign the GitHub issue to yourself.
4. Fork Airbyte's repo, code, and test thoroughly.
5. Open a PR on our Airbyte repo from your fork.

### Developing locally

0. Build the project locally (from the root of Airbyte's repo): `./gradlew :octavia-cli:build`.
1. Install Python 3.8.12. We suggest doing it through `pyenv`.
2. Create a virtualenv: `python -m venv .venv`.
3. Activate the virtualenv: `source .venv/bin/activate`.
4. Install dev dependencies: `pip install -e .\[tests\]`.
5. Install `pre-commit` hooks: `pre-commit install`.
6. Run the unit test suite: `pytest --cov=octavia_cli`. Note: a local version of Airbyte needs to be running (e.g. `docker compose up` from the root directory of the project); see the consolidated sketch after this list.
7. Make sure the build passes (step 0) before opening a PR.

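A consolidated sketch of the setup above, assuming Python 3.8.12 is already available via `pyenv` and that the `pip` and `pytest` steps run from the `octavia-cli` directory (the working directory is not stated explicitly above):

```bash
# Step 0: build the project from the root of Airbyte's repo.
./gradlew :octavia-cli:build

# Steps 2-5: virtualenv, dev dependencies and pre-commit hooks.
python -m venv .venv
source .venv/bin/activate
pip install -e .\[tests\]
pre-commit install

# Step 6: run the test suite (requires a local Airbyte to be running).
pytest --cov=octavia_cli
```
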
## Telemetry

This CLI has some telemetry tooling to send Airbyte some data about how the tool is used.
We will use this data to improve the CLI and measure its adoption.
The telemetry sends data about:

- Which command was run (not the arguments or options used).
- Success or failure of the command run and the error type (not the error payload).
- The current Airbyte workspace id, if the user has not enabled _anonymous data collection_ on their Airbyte instance.

You can disable telemetry by setting the `OCTAVIA_ENABLE_TELEMETRY` environment variable to `False` or by using the `--disable-telemetry` flag, as sketched below.

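A small sketch of both options (treating `--disable-telemetry` as a global flag passed before the subcommand, which is an assumption; the text above only states that the flag exists):

```bash
# Disable telemetry for every octavia invocation in this shell session...
export OCTAVIA_ENABLE_TELEMETRY=False

# ...or for a single run.
octavia --disable-telemetry list connectors sources
```
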
## Changelog

| Version | Date       | Description                                                                             | PR                                                          |
| ------- | ---------- | --------------------------------------------------------------------------------------- | ----------------------------------------------------------- |
| 0.41.0  | 2022-10-13 | Use Basic Authentication for making API requests                                         | [#17982](https://github.com/airbytehq/airbyte/pull/17982)   |
| 0.40.0  | 2022-08-10 | Enable cron and basic scheduling                                                         | [#15253](https://github.com/airbytehq/airbyte/pull/15253)   |
| 0.39.33 | 2022-07-05 | Add `octavia import all` command                                                         | [#14374](https://github.com/airbytehq/airbyte/pull/14374)   |
| 0.39.32 | 2022-06-30 | Create import command to import and manage existing Airbyte resources from octavia-cli  | [#14137](https://github.com/airbytehq/airbyte/pull/14137)   |
| 0.39.27 | 2022-06-24 | Create get command to retrieve resources' JSON representation                            | [#13254](https://github.com/airbytehq/airbyte/pull/13254)   |
| 0.39.19 | 2022-06-16 | Allow connection management on multiple workspaces                                       | [#13070](https://github.com/airbytehq/airbyte/pull/12727)   |
| 0.39.19 | 2022-06-15 | Allow users to set custom HTTP headers                                                   | [#12893](https://github.com/airbytehq/airbyte/pull/12893)   |
| 0.39.14 | 2022-05-12 | Enable normalization on connection                                                       | [#12727](https://github.com/airbytehq/airbyte/pull/12727)   |
| 0.37.0  | 2022-05-05 | Use snake case in connection fields                                                      | [#12133](https://github.com/airbytehq/airbyte/pull/12133)   |
| 0.35.68 | 2022-04-15 | Improve telemetry                                                                        | [#12072](https://github.com/airbytehq/airbyte/issues/11896) |
| 0.35.68 | 2022-04-12 | Add telemetry                                                                            | [#11896](https://github.com/airbytehq/airbyte/issues/11896) |
| 0.35.61 | 2022-04-07 | Alpha release                                                                            | [EPIC](https://github.com/airbytehq/airbyte/issues/10704)   |

@@ -1,121 +0,0 @@
#!/usr/bin/env bash

# This install script currently only works for Bash, ZSH and fish profiles.
# It creates an octavia alias in your profile bound to a docker run command and your current user.

VERSION=0.44.4
OCTAVIA_ENV_FILE=${HOME}/.octavia

detect_profile() {
  if [ "${SHELL#*bash}" != "$SHELL" ]; then
    if [ -f "$HOME/.bashrc" ]; then
      DETECTED_PROFILE="$HOME/.bashrc"
    elif [ -f "$HOME/.bash_profile" ]; then
      DETECTED_PROFILE="$HOME/.bash_profile"
    fi
  elif [ "${SHELL#*zsh}" != "$SHELL" ]; then
    if [ -f "$HOME/.zshrc" ]; then
      DETECTED_PROFILE="$HOME/.zshrc"
    fi
  elif [ "${SHELL#*fish}" != "$SHELL" ]; then
    if [ -f "$HOME/.config/fish/config.fish" ]; then
      DETECTED_PROFILE="$HOME/.config/fish/config.fish"
    fi
  fi

  if [ -z "${DETECTED_PROFILE}" ]; then
    echo "🚨 - Cannot install! This script only works if you are using one of these profiles: ~/.bashrc, ~/.bash_profile, ~/.zshrc or ~/.config/fish/config.fish"
    exit 1
  else
    echo "octavia alias will be added to ${DETECTED_PROFILE}"
  fi
}

check_docker_is_running() {
  if ! docker info > /dev/null 2>&1; then
    echo "🚨 - This script uses docker, and it isn't running - please start docker and try again!"
    exit 1
  fi
}

delete_previous_alias() {
  sed -i'' -e '/^alias octavia=/d' ${DETECTED_PROFILE}
}

pull_image() {
  echo "🐙 - Pulling image for octavia ${VERSION}"
  docker pull airbyte/octavia-cli:${VERSION} > /dev/null
  echo "🐙 - 🎉 octavia ${VERSION} image was pulled"
}

add_octavia_comment_to_profile() {
  printf "\n# OCTAVIA CLI ${VERSION}\n" >> ${DETECTED_PROFILE}
}

create_octavia_env_file() {
  # fish uses `set` instead of POSIX variable assignment.
  if [ "${SHELL#*fish}" != "$SHELL" ]; then
    echo "set OCTAVIA_ENV_FILE ${OCTAVIA_ENV_FILE}" >> ${DETECTED_PROFILE}
  else
    echo "OCTAVIA_ENV_FILE=${OCTAVIA_ENV_FILE}" >> ${DETECTED_PROFILE}
  fi
  touch ${OCTAVIA_ENV_FILE}
  echo "🐙 - 💾 The octavia env file was created at ${OCTAVIA_ENV_FILE}"
}

enable_telemetry() {
  if [ "${SHELL#*fish}" != "$SHELL" ]; then
    echo "set -x OCTAVIA_ENABLE_TELEMETRY $1" >> ${DETECTED_PROFILE}
  else
    echo "export OCTAVIA_ENABLE_TELEMETRY=$1" >> ${DETECTED_PROFILE}
  fi
  echo "OCTAVIA_ENABLE_TELEMETRY=$1" >> ${OCTAVIA_ENV_FILE}
}

add_alias() {
  if [ "${SHELL#*fish}" != "$SHELL" ]; then
    echo 'alias octavia="docker run -i --rm -v $(pwd):/home/octavia-project --network host --env-file $OCTAVIA_ENV_FILE --user $(id -u):$(id -g) airbyte/octavia-cli:'${VERSION}'"' >> ${DETECTED_PROFILE}
  else
    echo 'alias octavia="docker run -i --rm -v \$(pwd):/home/octavia-project --network host --env-file \${OCTAVIA_ENV_FILE} --user \$(id -u):\$(id -g) airbyte/octavia-cli:'${VERSION}'"' >> ${DETECTED_PROFILE}
  fi
  echo "🐙 - 🎉 octavia alias was added to ${DETECTED_PROFILE}!"
  echo "🐙 - Please open a new terminal window or run source ${DETECTED_PROFILE}"
}

install() {
  pull_image
  add_alias
}

telemetry_consent() {
  read -p "❓ - Allow Airbyte to collect telemetry to improve the CLI? (Y/n)" -n 1 -r </dev/tty
  echo
  if [[ $REPLY =~ ^[Yy]$ ]]; then
    enable_telemetry "True"
  else
    enable_telemetry "False"
  fi
}

update_or_install() {
  if grep -q "^alias octavia=*" ${DETECTED_PROFILE}; then
    read -p "❓ - You already have an octavia alias in your profile. Do you want to update? (Y/n)" -n 1 -r </dev/tty
    echo
    if [[ $REPLY =~ ^[Yy]$ ]]; then
      delete_previous_alias
      install
    fi
  else
    add_octavia_comment_to_profile
    create_octavia_env_file
    telemetry_consent
    install
  fi
}

set -e
check_docker_is_running
detect_profile
set -u
update_or_install

File diff suppressed because it is too large
@@ -1,2 +0,0 @@
**/state_*.yaml
**/updated_*.yaml

@@ -1,362 +0,0 @@
# Configuration for connection poke_to_pg
definition_type: connection
resource_name: poke_to_pg
source_configuration_path: TO_UPDATE_FROM_TEST
destination_configuration_path: TO_UPDATE_FROM_TEST

# EDIT THE CONFIGURATION BELOW!
configuration:
  status: active # REQUIRED | string | Allowed values: active, inactive, deprecated
  skip_reset: false # OPTIONAL | boolean | Flag to check if the connection should be reset after a connection update
  namespace_definition: source # OPTIONAL | string | Allowed values: source, destination, customformat
  namespace_format: "${SOURCE_NAMESPACE}" # OPTIONAL | string | Used when namespaceDefinition is 'customformat'. If blank then behaves like namespaceDefinition = 'destination'. If "${SOURCE_NAMESPACE}" then behaves like namespaceDefinition = 'source'.
  prefix: "" # REQUIRED | Prefix that will be prepended to the name of each stream when it is written to the destination
  resource_requirements: # OPTIONAL | object | Resource requirements to run workers (blank for unbounded allocations)
    cpu_limit: "" # OPTIONAL
    cpu_request: "" # OPTIONAL
    memory_limit: "" # OPTIONAL
    memory_request: "" # OPTIONAL
  schedule_type: basic
  schedule_data:
    basic_schedule:
      time_unit: hours # REQUIRED | string | Allowed values: minutes, hours, days, weeks, months
      units: 1 # REQUIRED | integer
  sync_catalog: # OPTIONAL | object | 🚨 ONLY edit streams.config, streams.stream should not be edited as schema cannot be changed.
    streams:
      - config:
          alias_name: pokemon
          cursor_field: []
          destination_sync_mode: append
          primary_key: []
          selected: true
          sync_mode: full_refresh
        stream:
          default_cursor_field: []
          json_schema:
            $schema: http://json-schema.org/draft-07/schema#
            properties:
              abilities:
                items:
                  properties:
                    ability:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                    is_hidden:
                      type:
                        - "null"
                        - boolean
                    slot:
                      type:
                        - "null"
                        - integer
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              base_experience:
                type:
                  - "null"
                  - integer
              forms:
                items:
                  properties:
                    name:
                      type:
                        - "null"
                        - string
                    url:
                      type:
                        - "null"
                        - string
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              game_indices:
                items:
                  properties:
                    game_index:
                      type:
                        - "null"
                        - integer
                    version:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              height:
                type:
                  - "null"
                  - integer
              held_items:
                items:
                  properties:
                    item:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                    version_details:
                      items:
                        properties:
                          rarity:
                            type:
                              - "null"
                              - integer
                          version:
                            properties:
                              name:
                                type:
                                  - "null"
                                  - string
                              url:
                                type:
                                  - "null"
                                  - string
                            type:
                              - "null"
                              - object
                        type:
                          - "null"
                          - object
                      type:
                        - "null"
                        - array
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              id:
                type:
                  - "null"
                  - integer
              "is_default ":
                type:
                  - "null"
                  - boolean
              location_area_encounters:
                type:
                  - "null"
                  - string
              moves:
                items:
                  properties:
                    move:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                    version_group_details:
                      items:
                        properties:
                          level_learned_at:
                            type:
                              - "null"
                              - integer
                          move_learn_method:
                            properties:
                              name:
                                type:
                                  - "null"
                                  - string
                              url:
                                type:
                                  - "null"
                                  - string
                            type:
                              - "null"
                              - object
                          version_group:
                            properties:
                              name:
                                type:
                                  - "null"
                                  - string
                              url:
                                type:
                                  - "null"
                                  - string
                            type:
                              - "null"
                              - object
                        type:
                          - "null"
                          - object
                      type:
                        - "null"
                        - array
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              name:
                type:
                  - "null"
                  - string
              order:
                type:
                  - "null"
                  - integer
              species:
                properties:
                  name:
                    type:
                      - "null"
                      - string
                  url:
                    type:
                      - "null"
                      - string
                type:
                  - "null"
                  - object
              sprites:
                properties:
                  back_default:
                    type:
                      - "null"
                      - string
                  back_female:
                    type:
                      - "null"
                      - string
                  back_shiny:
                    type:
                      - "null"
                      - string
                  back_shiny_female:
                    type:
                      - "null"
                      - string
                  front_default:
                    type:
                      - "null"
                      - string
                  front_female:
                    type:
                      - "null"
                      - string
                  front_shiny:
                    type:
                      - "null"
                      - string
                  front_shiny_female:
                    type:
                      - "null"
                      - string
                type:
                  - "null"
                  - object
              stats:
                items:
                  properties:
                    base_stat:
                      type:
                        - "null"
                        - integer
                    effort:
                      type:
                        - "null"
                        - integer
                    stat:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              types:
                items:
                  properties:
                    slot:
                      type:
                        - "null"
                        - integer
                    type:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              weight:
                type:
                  - "null"
                  - integer
            type: object
          name: pokemon
          source_defined_primary_key: []
          supported_sync_modes:
            - full_refresh

@@ -1,367 +0,0 @@
# Configuration for connection poke_to_pg
definition_type: connection
resource_name: poke_to_pg
source_configuration_path: TO_UPDATE_FROM_TEST
destination_configuration_path: TO_UPDATE_FROM_TEST

# EDIT THE CONFIGURATION BELOW!
configuration:
  status: active # REQUIRED | string | Allowed values: active, inactive, deprecated
  namespace_definition: source # OPTIONAL | string | Allowed values: source, destination, customformat
  namespace_format: "${SOURCE_NAMESPACE}" # OPTIONAL | string | Used when namespaceDefinition is 'customformat'. If blank then behaves like namespaceDefinition = 'destination'. If "${SOURCE_NAMESPACE}" then behaves like namespaceDefinition = 'source'.
  prefix: "" # REQUIRED | Prefix that will be prepended to the name of each stream when it is written to the destination
  resource_requirements: # OPTIONAL | object | Resource requirements to run workers (blank for unbounded allocations)
    cpu_limit: "" # OPTIONAL
    cpu_request: "" # OPTIONAL
    memory_limit: "" # OPTIONAL
    memory_request: "" # OPTIONAL
  schedule_type: basic
  schedule_data:
    basic_schedule:
      time_unit: hours # REQUIRED | string | Allowed values: minutes, hours, days, weeks, months
      units: 1 # REQUIRED | integer
  operations:
    - name: "Normalization"
      operator_configuration:
        normalization:
          option: "basic"
      operator_type: "normalization"
  sync_catalog: # OPTIONAL | object | 🚨 ONLY edit streams.config, streams.stream should not be edited as schema cannot be changed.
    streams:
      - config:
          alias_name: pokemon
          cursor_field: []
          destination_sync_mode: append
          primary_key: []
          selected: true
          sync_mode: full_refresh
        stream:
          default_cursor_field: []
          json_schema:
            $schema: http://json-schema.org/draft-07/schema#
            properties:
              abilities:
                items:
                  properties:
                    ability:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                    is_hidden:
                      type:
                        - "null"
                        - boolean
                    slot:
                      type:
                        - "null"
                        - integer
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              base_experience:
                type:
                  - "null"
                  - integer
              forms:
                items:
                  properties:
                    name:
                      type:
                        - "null"
                        - string
                    url:
                      type:
                        - "null"
                        - string
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              game_indices:
                items:
                  properties:
                    game_index:
                      type:
                        - "null"
                        - integer
                    version:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              height:
                type:
                  - "null"
                  - integer
              held_items:
                items:
                  properties:
                    item:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                    version_details:
                      items:
                        properties:
                          rarity:
                            type:
                              - "null"
                              - integer
                          version:
                            properties:
                              name:
                                type:
                                  - "null"
                                  - string
                              url:
                                type:
                                  - "null"
                                  - string
                            type:
                              - "null"
                              - object
                        type:
                          - "null"
                          - object
                      type:
                        - "null"
                        - array
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              id:
                type:
                  - "null"
                  - integer
              "is_default ":
                type:
                  - "null"
                  - boolean
              location_area_encounters:
                type:
                  - "null"
                  - string
              moves:
                items:
                  properties:
                    move:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                    version_group_details:
                      items:
                        properties:
                          level_learned_at:
                            type:
                              - "null"
                              - integer
                          move_learn_method:
                            properties:
                              name:
                                type:
                                  - "null"
                                  - string
                              url:
                                type:
                                  - "null"
                                  - string
                            type:
                              - "null"
                              - object
                          version_group:
                            properties:
                              name:
                                type:
                                  - "null"
                                  - string
                              url:
                                type:
                                  - "null"
                                  - string
                            type:
                              - "null"
                              - object
                        type:
                          - "null"
                          - object
                      type:
                        - "null"
                        - array
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              name:
                type:
                  - "null"
                  - string
              order:
                type:
                  - "null"
                  - integer
              species:
                properties:
                  name:
                    type:
                      - "null"
                      - string
                  url:
                    type:
                      - "null"
                      - string
                type:
                  - "null"
                  - object
              sprites:
                properties:
                  back_default:
                    type:
                      - "null"
                      - string
                  back_female:
                    type:
                      - "null"
                      - string
                  back_shiny:
                    type:
                      - "null"
                      - string
                  back_shiny_female:
                    type:
                      - "null"
                      - string
                  front_default:
                    type:
                      - "null"
                      - string
                  front_female:
                    type:
                      - "null"
                      - string
                  front_shiny:
                    type:
                      - "null"
                      - string
                  front_shiny_female:
                    type:
                      - "null"
                      - string
                type:
                  - "null"
                  - object
              stats:
                items:
                  properties:
                    base_stat:
                      type:
                        - "null"
                        - integer
                    effort:
                      type:
                        - "null"
                        - integer
                    stat:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              types:
                items:
                  properties:
                    slot:
                      type:
                        - "null"
                        - integer
                    type:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              weight:
                type:
                  - "null"
                  - integer
            type: object
          name: pokemon
          source_defined_primary_key: []
          supported_sync_modes:
            - full_refresh

@@ -1,32 +0,0 @@
# Configuration for airbyte/destination-postgres
# Documentation about this connector can be found at https://docs.airbyte.io/integrations/destinations/postgres
resource_name: postgres
definition_type: destination
definition_id: 25c5221d-dce2-4163-ade9-739ef790f503
definition_image: airbyte/destination-postgres
definition_version: 0.3.15

# EDIT THE CONFIGURATION BELOW!
configuration:
  ssl: False # OPTIONAL | boolean | Encrypt data using SSL.
  host: localhost # REQUIRED | string | Hostname of the database.
  port: 5433 # REQUIRED | integer | Port of the database. | Example: 5432
  schema: "public" # REQUIRED | string | The default schema tables are written to if the source does not specify a namespace. The usual value for this field is "public". | Example: public
  database: postgres # REQUIRED | string | Name of the database.
  password: ${POSTGRES_PASSWORD} # SECRET (please store in environment variables) | OPTIONAL | string | Password associated with the username.
  username: postgres # REQUIRED | string | Username to use to access the database.
  tunnel_method:
    ## -------- Pick one valid structure among the examples below: --------
    tunnel_method: "NO_TUNNEL" # REQUIRED | string | No ssh tunnel needed to connect to database
    ## -------- Another valid structure for tunnel_method: --------
    # ssh_key: ${SSH_KEY} # SECRET (please store in environment variables) | REQUIRED | string | OS-level user account ssh key credentials in RSA PEM format ( created with ssh-keygen -t rsa -m PEM -f myuser_rsa )
    # tunnel_host: # REQUIRED | string | Hostname of the jump server host that allows inbound ssh tunnel.
    # tunnel_port: 22 # REQUIRED | integer | Port on the proxy/jump server that accepts inbound ssh connections. | Example: 22
    # tunnel_user: # REQUIRED | string | OS-level username for logging into the jump server host.
    # tunnel_method: "SSH_KEY_AUTH" # REQUIRED | string | Connect through a jump server tunnel host using username and ssh key
    ## -------- Another valid structure for tunnel_method: --------
    # tunnel_host: # REQUIRED | string | Hostname of the jump server host that allows inbound ssh tunnel.
    # tunnel_port: 22 # REQUIRED | integer | Port on the proxy/jump server that accepts inbound ssh connections. | Example: 22
    # tunnel_user: # REQUIRED | string | OS-level username for logging into the jump server host
    # tunnel_method: "SSH_PASSWORD_AUTH" # REQUIRED | string | Connect through a jump server tunnel host using username and password authentication
    # tunnel_user_password: ${TUNNEL_USER_PASSWORD} # SECRET (please store in environment variables) | REQUIRED | string | OS-level password for logging into the jump server host

@@ -1,9 +0,0 @@
resource_name: poke
definition_type: source
definition_id: 6371b14b-bc68-4236-bfbd-468e8df8e968
definition_image: airbyte/source-pokeapi
definition_version: 0.1.4

# EDIT THE CONFIGURATION BELOW!
configuration:
  pokemon_name: ditto

@@ -1,155 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import os

import pytest
import yaml
from airbyte_api_client.api import connection_api
from airbyte_api_client.model.connection_id_request_body import ConnectionIdRequestBody
from octavia_cli.apply.resources import Connection, Destination, Source
from octavia_cli.entrypoint import get_api_client, get_workspace_id
from octavia_cli.init.commands import DIRECTORIES_TO_CREATE as OCTAVIA_PROJECT_DIRECTORIES


def silent_remove(path):
    # Remove a file if it exists, returning whether anything was deleted.
    try:
        os.remove(path)
        return True
    except FileNotFoundError:
        return False


@pytest.fixture
def octavia_tmp_project_directory(tmpdir):
    for directory in OCTAVIA_PROJECT_DIRECTORIES:
        tmpdir.mkdir(directory)
    return tmpdir


@pytest.fixture(scope="session")
def octavia_test_project_directory():
    return f"{os.path.dirname(__file__)}/configurations"


@pytest.fixture(scope="session")
def api_client():
    return get_api_client("http://localhost:8000", "airbyte", "password", "octavia-cli/integration-tests", None)


@pytest.fixture(scope="session")
def workspace_id(api_client):
    return get_workspace_id(api_client, None)


def open_yaml_configuration(path: str):
    with open(path, "r") as f:
        local_configuration = yaml.safe_load(f)
    return local_configuration, path


@pytest.fixture(scope="session")
def source_configuration_and_path(octavia_test_project_directory):
    path = f"{octavia_test_project_directory}/sources/poke/configuration.yaml"
    return open_yaml_configuration(path)


@pytest.fixture(scope="session")
def source_state_path(octavia_test_project_directory, workspace_id):
    state_path = f"{octavia_test_project_directory}/sources/poke/state_{workspace_id}.yaml"
    silent_remove(state_path)
    yield state_path
    silent_remove(state_path)


@pytest.fixture(scope="session")
def source(api_client, workspace_id, source_configuration_and_path, source_state_path):
    configuration, path = source_configuration_and_path
    source = Source(api_client, workspace_id, configuration, path)
    yield source
    source.api_instance.delete_source(source.get_payload)


@pytest.fixture(scope="session")
def destination_configuration_and_path(octavia_test_project_directory):
    path = f"{octavia_test_project_directory}/destinations/postgres/configuration.yaml"
    return open_yaml_configuration(path)


@pytest.fixture(scope="session")
def destination_state_path(octavia_test_project_directory, workspace_id):
    state_path = f"{octavia_test_project_directory}/destinations/postgres/state_{workspace_id}.yaml"
    silent_remove(state_path)
    yield state_path
    silent_remove(state_path)


@pytest.fixture(scope="session")
def destination(api_client, workspace_id, destination_configuration_and_path, destination_state_path):
    configuration, path = destination_configuration_and_path
    destination = Destination(api_client, workspace_id, configuration, path)
    yield destination
    destination.api_instance.delete_destination(destination.get_payload)


@pytest.fixture(scope="session")
def connection_configuration_and_path(octavia_test_project_directory):
    path = f"{octavia_test_project_directory}/connections/poke_to_pg/configuration.yaml"
    with open(path, "r") as f:
        local_configuration = yaml.safe_load(f)
    return local_configuration, path


@pytest.fixture(scope="session")
def connection_state_path(octavia_test_project_directory, workspace_id):
    state_path = f"{octavia_test_project_directory}/connections/poke_to_pg/state_{workspace_id}.yaml"
    silent_remove(state_path)
    yield state_path
    silent_remove(state_path)


@pytest.fixture(scope="session")
def connection_with_normalization_state_path(octavia_test_project_directory, workspace_id):
    state_path = f"{octavia_test_project_directory}/connections/poke_to_pg_normalization/state_{workspace_id}.yaml"
    silent_remove(state_path)
    yield state_path
    silent_remove(state_path)


def updated_connection_configuration_and_path(octavia_test_project_directory, source, destination, with_normalization=False):
    # Rewrite the template configuration so it points at the source and destination
    # created for this test session, then persist it to an "updated_" copy.
    if with_normalization:
        path = f"{octavia_test_project_directory}/connections/poke_to_pg_normalization/configuration.yaml"
        edited_path = f"{octavia_test_project_directory}/connections/poke_to_pg_normalization/updated_configuration.yaml"
    else:
        path = f"{octavia_test_project_directory}/connections/poke_to_pg/configuration.yaml"
        edited_path = f"{octavia_test_project_directory}/connections/poke_to_pg/updated_configuration.yaml"
    with open(path, "r") as dumb_local_configuration_file:
        local_configuration = yaml.safe_load(dumb_local_configuration_file)
    local_configuration["source_configuration_path"] = source.configuration_path
    local_configuration["destination_configuration_path"] = destination.configuration_path
    with open(edited_path, "w") as updated_configuration_file:
        yaml.dump(local_configuration, updated_configuration_file)
    return local_configuration, edited_path


@pytest.fixture(scope="session")
def connection(api_client, workspace_id, octavia_test_project_directory, source, destination, connection_state_path):
    configuration, configuration_path = updated_connection_configuration_and_path(octavia_test_project_directory, source, destination)
    connection = Connection(api_client, workspace_id, configuration, configuration_path)
    yield connection
    connection_api.ConnectionApi(api_client).delete_connection(ConnectionIdRequestBody(connection.resource_id))
    silent_remove(configuration_path)


@pytest.fixture(scope="session")
def connection_with_normalization(
    api_client, workspace_id, octavia_test_project_directory, source, destination, connection_with_normalization_state_path
):
    configuration, configuration_path = updated_connection_configuration_and_path(
        octavia_test_project_directory, source, destination, with_normalization=True
    )
    connection = Connection(api_client, workspace_id, configuration, configuration_path)
    yield connection
    connection_api.ConnectionApi(api_client).delete_connection(ConnectionIdRequestBody(connection.resource_id))
    silent_remove(configuration_path)

@@ -1,19 +0,0 @@
version: "3.7"
services:
  nginx-proxy:
    build:
      context: ./octavia-cli/integration_tests
      dockerfile: nginx_proxy/Dockerfile
    ports:
      - "8010:80"
    depends_on:
      - init
      - bootloader
      - db
      - scheduler
      - worker
      - server
      - webapp
      - airbyte-temporal
    volumes:
      - "./octavia-cli/integration_tests/nginx_proxy/nginx.conf:/etc/nginx/nginx.conf"

@@ -1,77 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import base64
import logging

import pytest
from click.testing import CliRunner
from octavia_cli import api_http_headers, entrypoint

logging.basicConfig()  # Initialize logging, otherwise nothing from vcrpy is visible.
vcr_log = logging.getLogger("vcr")
vcr_log.setLevel(logging.WARN)

AIRBYTE_URL = "http://localhost:8000"
AIRBYTE_USERNAME = "airbyte"
AIRBYTE_PASSWORD = "password"


@pytest.fixture(scope="module")
def vcr_config():
    return {
        "record_mode": "rewrite",
        "match_on": ["method", "scheme", "host", "port", "path", "query", "headers"],
    }


@pytest.fixture
def file_based_headers(tmp_path):
    yaml_document = """
headers:
  Custom-Header: Foo
"""
    custom_api_http_headers_yaml_file_path = tmp_path / "custom_api_http_headers.yaml"
    custom_api_http_headers_yaml_file_path.write_text(yaml_document)
    expected_headers = [api_http_headers.ApiHttpHeader("Custom-Header", "Foo")]
    return custom_api_http_headers_yaml_file_path, expected_headers


@pytest.fixture
def option_based_headers():
    return ["Another-Custom-Header", "Bar"], [api_http_headers.ApiHttpHeader("Another-Custom-Header", "Bar")]


@pytest.mark.vcr
def test_api_http_headers(vcr, file_based_headers, option_based_headers):
    raw_option_based_headers, expected_option_based_headers = option_based_headers
    custom_api_http_headers_yaml_file_path, expected_file_based_headers = file_based_headers
    basic_auth_header_value = f"Basic {base64.b64encode(f'{AIRBYTE_USERNAME}:{AIRBYTE_PASSWORD}'.encode()).decode()}"
    expected_headers = (
        expected_option_based_headers
        + expected_file_based_headers
        + [api_http_headers.ApiHttpHeader("Authorization", basic_auth_header_value)]
    )
    runner = CliRunner()
    command_options = (
        [
            "--airbyte-url",
            AIRBYTE_URL,
            "--airbyte-username",
            AIRBYTE_USERNAME,
            "--airbyte-password",
            AIRBYTE_PASSWORD,
            "--api-http-headers-file-path",
            custom_api_http_headers_yaml_file_path,
            "--api-http-header",
        ]
        + raw_option_based_headers
        + ["list", "connectors", "sources"]
    )

    result = runner.invoke(entrypoint.octavia, command_options, obj={})
    # Every recorded request must carry the option-based, file-based and auth headers.
    for request in vcr.requests:
        for expected_header in expected_headers:
            assert request.headers[expected_header.name] == expected_header.value
    assert result.exit_code == 0

@@ -1,65 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#


import pytest

pytestmark = pytest.mark.integration


def test_source_lifecycle(source, workspace_id):
    assert not source.was_created
    source.create()
    source.state = source._get_state_from_file(source.configuration_path, workspace_id)
    assert source.was_created
    assert not source.get_diff_with_remote_resource()
    source.raw_configuration["configuration"]["pokemon_name"] = "snorlax"
    source.configuration = source._deserialize_raw_configuration()
    assert 'changed from "ditto" to "snorlax"' in source.get_diff_with_remote_resource()
    source.update()
    assert not source.get_diff_with_remote_resource()
    assert source.catalog["streams"][0]["config"]["alias_name"] == "pokemon"


def test_destination_lifecycle(destination, workspace_id):
    assert not destination.was_created
    destination.create()
    destination.state = destination._get_state_from_file(destination.configuration_path, workspace_id)
    assert destination.was_created
    assert not destination.get_diff_with_remote_resource()
    destination.raw_configuration["configuration"]["host"] = "foo"
    destination.configuration = destination._deserialize_raw_configuration()
    assert 'changed from "localhost" to "foo"' in destination.get_diff_with_remote_resource()
    destination.update()
    assert not destination.get_diff_with_remote_resource()


def test_connection_lifecycle(source, destination, connection, workspace_id):
    assert source.was_created
    assert destination.was_created
    assert not connection.was_created
    connection.create()
    connection.state = connection._get_state_from_file(connection.configuration_path, workspace_id)
    assert connection.was_created
    connection.raw_configuration["configuration"]["status"] = "inactive"
    connection.configuration = connection._deserialize_raw_configuration()
    assert 'changed from "active" to "inactive"' in connection.get_diff_with_remote_resource()
    connection.update()


def test_connection_lifecycle_with_normalization(source, destination, connection_with_normalization, workspace_id):
    assert source.was_created
    assert destination.was_created
    assert not connection_with_normalization.was_created
    connection_with_normalization.create()
    connection_with_normalization.state = connection_with_normalization._get_state_from_file(
        connection_with_normalization.configuration_path, workspace_id
    )
    assert connection_with_normalization.was_created
    assert connection_with_normalization.remote_resource["operations"][0]["operation_id"] is not None
    assert connection_with_normalization.remote_resource["operations"][0]["operator_configuration"]["normalization"]["option"] == "basic"
    connection_with_normalization.raw_configuration["configuration"]["status"] = "inactive"
    connection_with_normalization.configuration = connection_with_normalization._deserialize_raw_configuration()
    assert 'changed from "active" to "inactive"' in connection_with_normalization.get_diff_with_remote_resource()
    connection_with_normalization.update()

@@ -1,3 +0,0 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#

@@ -1,56 +0,0 @@
# Configuration for connection my_new_connection
definition_type: connection
resource_name: "my_new_connection"
source_configuration_path: source_configuration_path
destination_configuration_path: destination_configuration_path

# EDIT THE CONFIGURATION BELOW!
configuration:
  status: active # REQUIRED | string | Allowed values: active, inactive, deprecated
  skip_reset: false # OPTIONAL | boolean | Flag to check if the connection should be reset after a connection update
  namespace_definition: source # OPTIONAL | string | Allowed values: source, destination, customformat
  namespace_format: "${SOURCE_NAMESPACE}" # OPTIONAL | string | Used when namespaceDefinition is 'customformat'. If blank then behaves like namespaceDefinition = 'destination'. If "${SOURCE_NAMESPACE}" then behaves like namespaceDefinition = 'source'.
  prefix: "" # REQUIRED | Prefix that will be prepended to the name of each stream when it is written to the destination
  resource_requirements: # OPTIONAL | object | Resource requirements to run workers (blank for unbounded allocations)
    cpu_limit: "" # OPTIONAL
    cpu_request: "" # OPTIONAL
    memory_limit: "" # OPTIONAL
    memory_request: "" # OPTIONAL
  schedule_type: basic # OPTIONAL | string | Allowed values: basic, cron, manual
  schedule_data: # OPTIONAL | object
    basic_schedule:
      time_unit: hours # REQUIRED | string | Allowed values: minutes, hours, days, weeks, months
      units: 1 # REQUIRED | integer
    # cron:
    #   cron_time_zone: "UTC" # REQUIRED | string
    #   cron_expression: "* */2 * * * ?" # REQUIRED | string
  # operations:
  ## -------- Uncomment and edit the block below if you want to enable Airbyte normalization --------
  # - name: "Normalization"
  #   operator_configuration:
  #     normalization:
  #       option: "basic"
  #   operator_type: "normalization"
  ## -------- Uncomment and edit the block below if you want to declare a custom transformation --------
  # - name: "My dbt transformations" # REQUIRED | string
  #   operator_configuration:
  #     dbt:
  #       dbt_arguments: "run" # REQUIRED | string | Entrypoint arguments for dbt cli to run the project
  #       docker_image: "fishtownanalytics/dbt:0.19.1" # REQUIRED | string | Docker image URL with dbt installed
  #       git_repo_branch: "your-repo-branch-name" # OPTIONAL | string | Git branch name
  #       git_repo_url: "https://github.com/<your git repo>" # REQUIRED | string | Git repository URL of the custom transformation project
  #   operator_type: dbt # REQUIRED | string | Allowed values: dbt, normalization
  sync_catalog: # OPTIONAL | object | 🚨 ONLY edit streams.config, streams.stream should not be edited as schema cannot be changed.
    streams:
      - config:
          alias_name: pokemon
          destination_sync_mode: append
          selected: true
          sync_mode: full_refresh
        stream:
          default_cursor_field:
            - foo
          json_schema: {}
          name: my_stream
          supported_sync_modes:
            - full_refresh

@@ -1,40 +0,0 @@
# Configuration for connection my_new_connection
definition_type: connection
resource_name: "my_new_connection"
source_configuration_path: source_configuration_path
destination_configuration_path: destination_configuration_path

# EDIT THE CONFIGURATION BELOW!
configuration:
  status: active # REQUIRED | string | Allowed values: active, inactive, deprecated
  skip_reset: false # OPTIONAL | boolean | Flag to check if the connection should be reset after a connection update
  namespace_definition: source # OPTIONAL | string | Allowed values: source, destination, customformat
  namespace_format: "${SOURCE_NAMESPACE}" # OPTIONAL | string | Used when namespaceDefinition is 'customformat'. If blank then behaves like namespaceDefinition = 'destination'. If "${SOURCE_NAMESPACE}" then behaves like namespaceDefinition = 'source'.
  prefix: "" # REQUIRED | Prefix that will be prepended to the name of each stream when it is written to the destination
  resource_requirements: # OPTIONAL | object | Resource requirements to run workers (blank for unbounded allocations)
    cpu_limit: "" # OPTIONAL
    cpu_request: "" # OPTIONAL
    memory_limit: "" # OPTIONAL
    memory_request: "" # OPTIONAL
  schedule_type: basic # OPTIONAL | string | Allowed values: basic, cron, manual
  schedule_data: # OPTIONAL | object
    basic_schedule:
      time_unit: hours # REQUIRED | string | Allowed values: minutes, hours, days, weeks, months
      units: 1 # REQUIRED | integer
    # cron:
    #   cron_time_zone: "UTC" # REQUIRED | string
    #   cron_expression: "* */2 * * * ?" # REQUIRED | string
  sync_catalog: # OPTIONAL | object | 🚨 ONLY edit streams.config, streams.stream should not be edited as schema cannot be changed.
    streams:
      - config:
          alias_name: pokemon
          destination_sync_mode: append
          selected: true
          sync_mode: full_refresh
        stream:
          default_cursor_field:
            - foo
          json_schema: {}
          name: my_stream
          supported_sync_modes:
            - full_refresh

@@ -1,32 +0,0 @@
# Configuration for airbyte/destination-postgres
# Documentation about this connector can be found at https://docs.airbyte.io/integrations/destinations/postgres
resource_name: "my_postgres_destination"
definition_type: destination
definition_id: foobar
definition_image: airbyte/destination-postgres
definition_version: 0.3.13

# EDIT THE CONFIGURATION BELOW!
configuration:
  host: # REQUIRED | string | Hostname of the database.
  port: 5432 # REQUIRED | integer | Port of the database. | Example: 5432
  database: # REQUIRED | string | Name of the database.
  schema: "public" # REQUIRED | string | The default schema tables are written to if the source does not specify a namespace. The usual value for this field is "public". | Example: public
  username: # REQUIRED | string | Username to use to access the database.
  password: ${PASSWORD} # SECRET (please store in environment variables) | OPTIONAL | string | Password associated with the username.
  ssl: # OPTIONAL | boolean | Encrypt data using SSL.
  tunnel_method:
    ## -------- Pick one valid structure among the examples below: --------
    tunnel_method: "NO_TUNNEL" # REQUIRED | string | No ssh tunnel needed to connect to database
    ## -------- Another valid structure for tunnel_method: --------
    # tunnel_method: "SSH_KEY_AUTH" # REQUIRED | string | Connect through a jump server tunnel host using username and ssh key
    # tunnel_host: # REQUIRED | string | Hostname of the jump server host that allows inbound ssh tunnel.
    # tunnel_port: 22 # REQUIRED | integer | Port on the proxy/jump server that accepts inbound ssh connections. | Example: 22
    # tunnel_user: # REQUIRED | string | OS-level username for logging into the jump server host.
    # ssh_key: ${SSH_KEY} # SECRET (please store in environment variables) | REQUIRED | string | OS-level user account ssh key credentials in RSA PEM format ( created with ssh-keygen -t rsa -m PEM -f myuser_rsa )
    ## -------- Another valid structure for tunnel_method: --------
    # tunnel_method: "SSH_PASSWORD_AUTH" # REQUIRED | string | Connect through a jump server tunnel host using username and password authentication
    # tunnel_host: # REQUIRED | string | Hostname of the jump server host that allows inbound ssh tunnel.
    # tunnel_port: 22 # REQUIRED | integer | Port on the proxy/jump server that accepts inbound ssh connections. | Example: 22
    # tunnel_user: # REQUIRED | string | OS-level username for logging into the jump server host
    # tunnel_user_password: ${TUNNEL_USER_PASSWORD} # SECRET (please store in environment variables) | REQUIRED | string | OS-level password for logging into the jump server host
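Fields tagged `SECRET` reference environment variables such as `${PASSWORD}` so credentials never land in version control. A minimal sketch of that substitution pattern, assuming plain `${VAR}` placeholders; the helper and its use of `string.Template` are illustrative, not octavia's actual implementation:

```python
import os
from string import Template

import yaml


def load_with_secrets(path: str) -> dict:
    """Read a configuration file and expand ${VAR} placeholders from the environment."""
    with open(path) as f:
        raw = f.read()
    # substitute() raises KeyError when a referenced variable is missing,
    # which beats silently applying an empty password.
    return yaml.safe_load(Template(raw).substitute(os.environ))


# Usage: export PASSWORD=... in the shell, then load the file shown above.
# config = load_with_secrets("destinations/my_postgres_destination/configuration.yaml")
```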
@@ -1,177 +0,0 @@
dockerImage: "airbyte/destination-postgres:0.3.13"
spec:
  documentationUrl: "https://docs.airbyte.io/integrations/destinations/postgres"
  connectionSpecification:
    $schema: "http://json-schema.org/draft-07/schema#"
    title: "Postgres Destination Spec"
    type: "object"
    required:
      - "host"
      - "port"
      - "username"
      - "database"
      - "schema"
    additionalProperties: true
    properties:
      host:
        title: "Host"
        description: "Hostname of the database."
        type: "string"
        order: 0
      port:
        title: "Port"
        description: "Port of the database."
        type: "integer"
        minimum: 0
        maximum: 65536
        default: 5432
        examples:
          - "5432"
        order: 1
      database:
        title: "DB Name"
        description: "Name of the database."
        type: "string"
        order: 2
      schema:
        title: "Default Schema"
        description: "The default schema tables are written to if the source does not specify a namespace. The usual value for this field is \"public\"."
        type: "string"
        examples:
          - "public"
        default: "public"
        order: 3
      username:
        title: "User"
        description: "Username to use to access the database."
        type: "string"
        order: 4
      password:
        title: "Password"
        description: "Password associated with the username."
        type: "string"
        airbyte_secret: true
        order: 5
      ssl:
        title: "SSL Connection"
        description: "Encrypt data using SSL."
        type: "boolean"
        default: false
        order: 6
      tunnel_method:
        type: "object"
        title: "SSH Tunnel Method"
        description: "Whether to initiate an SSH tunnel before connecting to the database, and if so, which kind of authentication to use."
        oneOf:
          - title: "No Tunnel"
            required:
              - "tunnel_method"
            properties:
              tunnel_method:
                description: "No ssh tunnel needed to connect to database"
                type: "string"
                const: "NO_TUNNEL"
                order: 0
          - title: "SSH Key Authentication"
            required:
              - "tunnel_method"
              - "tunnel_host"
              - "tunnel_port"
              - "tunnel_user"
              - "ssh_key"
            properties:
              tunnel_method:
                description: "Connect through a jump server tunnel host using username and ssh key"
                type: "string"
                const: "SSH_KEY_AUTH"
                order: 0
              tunnel_host:
                title: "SSH Tunnel Jump Server Host"
                description: "Hostname of the jump server host that allows inbound ssh tunnel."
                type: "string"
                order: 1
              tunnel_port:
                title: "SSH Connection Port"
                description: "Port on the proxy/jump server that accepts inbound ssh connections."
                type: "integer"
                minimum: 0
                maximum: 65536
                default: 22
                examples:
                  - "22"
                order: 2
              tunnel_user:
                title: "SSH Login Username"
                description: "OS-level username for logging into the jump server host."
                type: "string"
                order: 3
              ssh_key:
                title: "SSH Private Key"
                description: "OS-level user account ssh key credentials in RSA PEM format ( created with ssh-keygen -t rsa -m PEM -f myuser_rsa )"
                type: "string"
                airbyte_secret: true
                multiline: true
                order: 4
          - title: "Password Authentication"
            required:
              - "tunnel_method"
              - "tunnel_host"
              - "tunnel_port"
              - "tunnel_user"
              - "tunnel_user_password"
            properties:
              tunnel_method:
                description: "Connect through a jump server tunnel host using username and password authentication"
                type: "string"
                const: "SSH_PASSWORD_AUTH"
                order: 0
              tunnel_host:
                title: "SSH Tunnel Jump Server Host"
                description: "Hostname of the jump server host that allows inbound ssh tunnel."
                type: "string"
                order: 1
              tunnel_port:
                title: "SSH Connection Port"
                description: "Port on the proxy/jump server that accepts inbound ssh connections."
                type: "integer"
                minimum: 0
                maximum: 65536
                default: 22
                examples:
                  - "22"
                order: 2
              tunnel_user:
                title: "SSH Login Username"
                description: "OS-level username for logging into the jump server host"
                type: "string"
                order: 3
              tunnel_user_password:
                title: "Password"
                description: "OS-level password for logging into the jump server host"
                type: "string"
                airbyte_secret: true
                order: 4
  supportsIncremental: true
  supported_destination_sync_modes:
    - "overwrite"
    - "append"
    - "append_dedup"
@@ -1,52 +0,0 @@
# Configuration for airbyte/destination-s3
# Documentation about this connector can be found at https://docs.airbyte.io/integrations/destinations/s3
resource_name: "my_s3_destination"
definition_type: destination
definition_id: foobar
definition_image: airbyte/destination-s3
definition_version: 0.2.5

# EDIT THE CONFIGURATION BELOW!
configuration:
  s3_endpoint: # OPTIONAL | string | This is your S3 endpoint url (if you are working with AWS S3, just leave empty). | Example: http://localhost:9000
  s3_bucket_name: # REQUIRED | string | The name of the S3 bucket. | Example: airbyte_sync
  s3_bucket_path: # REQUIRED | string | Directory under the S3 bucket where data will be written. | Example: data_sync/test
  s3_bucket_region: # REQUIRED | string | The region of the S3 bucket.
  access_key_id: ${ACCESS_KEY_ID} # SECRET (please store in environment variables) | OPTIONAL | string | The access key id to access the S3 bucket. Airbyte requires Read and Write permissions to the given bucket; if not set, Airbyte will rely on Instance Profile. | Example: A012345678910EXAMPLE
  secret_access_key: ${SECRET_ACCESS_KEY} # SECRET (please store in environment variables) | OPTIONAL | string | The corresponding secret to the access key id; if S3 Key Id is set, then S3 Access Key must also be provided. | Example: a012345678910ABCDEFGH/AbCdEfGhEXAMPLEKEY
  format:
    ## -------- Pick one valid structure among the examples below: --------
    format_type: "Avro" # REQUIRED | string
    compression_codec:
      ## -------- Pick one valid structure among the examples below: --------
      codec: "no compression" # REQUIRED | string
      ## -------- Another valid structure for compression_codec: --------
      # codec: "Deflate" # REQUIRED | string
      # compression_level: # REQUIRED | integer | 0: no compression & fastest, 9: best compression & slowest.
      ## -------- Another valid structure for compression_codec: --------
      # codec: "bzip2" # REQUIRED | string
      ## -------- Another valid structure for compression_codec: --------
      # codec: "xz" # REQUIRED | string
      # compression_level: 6 # REQUIRED | integer | See <a href="https://commons.apache.org/proper/commons-compress/apidocs/org/apache/commons/compress/compressors/xz/XZCompressorOutputStream.html#XZCompressorOutputStream-java.io.OutputStream-int-">here</a> for details.
      ## -------- Another valid structure for compression_codec: --------
      # codec: "zstandard" # REQUIRED | string
      # compression_level: 3 # REQUIRED | integer | Negative levels are 'fast' modes akin to lz4 or snappy, levels above 9 are generally for archival purposes, and levels above 18 use a lot of memory.
      # include_checksum: # OPTIONAL | boolean | If true, include a checksum with each data block.
      ## -------- Another valid structure for compression_codec: --------
      # codec: "snappy" # REQUIRED | string
    part_size_mb: 5 # OPTIONAL | integer | This is the size of a "Part" being buffered in memory. It limits the memory usage when writing. Larger values allow uploading bigger files and improve speed, but consume more memory. Allowed values: min=5MB, max=525MB. Default: 5MB. | Example: 5
    ## -------- Another valid structure for format: --------
    # format_type: "CSV" # REQUIRED | string
    # flattening: "No flattening" # REQUIRED | string | Whether the input json data should be normalized (flattened) in the output CSV. Please refer to docs for details.
    # part_size_mb: 5 # OPTIONAL | integer | This is the size of a "Part" being buffered in memory. It limits the memory usage when writing. Larger values allow uploading bigger files and improve speed, but consume more memory. Allowed values: min=5MB, max=525MB. Default: 5MB. | Example: 5
    ## -------- Another valid structure for format: --------
    # format_type: "JSONL" # REQUIRED | string
    # part_size_mb: 5 # OPTIONAL | integer | This is the size of a "Part" being buffered in memory. It limits the memory usage when writing. Larger values allow uploading bigger files and improve speed, but consume more memory. Allowed values: min=5MB, max=525MB. Default: 5MB. | Example: 5
    ## -------- Another valid structure for format: --------
    # format_type: "Parquet" # REQUIRED | string
    # compression_codec: "UNCOMPRESSED" # OPTIONAL | string | The compression algorithm used to compress data pages.
    # block_size_mb: 128 # OPTIONAL | integer | This is the size of a row group being buffered in memory. It limits the memory usage when writing. Larger values will improve the IO when reading, but consume more memory when writing. Default: 128 MB. | Example: 128
    # max_padding_size_mb: 8 # OPTIONAL | integer | Maximum size allowed as padding to align row groups. This is also the minimum size of a row group. Default: 8 MB. | Example: 8
    # page_size_kb: 1024 # OPTIONAL | integer | The page size is for compression. A block is composed of pages. A page is the smallest unit that must be read fully to access a single record. If this value is too small, the compression will deteriorate. Default: 1024 KB. | Example: 1024
    # dictionary_page_size_kb: 1024 # OPTIONAL | integer | There is one dictionary page per column per row group when dictionary encoding is used. The dictionary page size works like the page size but for the dictionary. Default: 1024 KB. | Example: 1024
    # dictionary_encoding: true # OPTIONAL | boolean | Default: true.
@@ -1,330 +0,0 @@
dockerImage: "airbyte/destination-s3:0.2.5"
spec:
  documentationUrl: "https://docs.airbyte.io/integrations/destinations/s3"
  connectionSpecification:
    $schema: "http://json-schema.org/draft-07/schema#"
    title: "S3 Destination Spec"
    type: "object"
    required:
      - "s3_bucket_name"
      - "s3_bucket_path"
      - "s3_bucket_region"
      - "format"
    additionalProperties: false
    properties:
      s3_endpoint:
        title: "Endpoint"
        type: "string"
        default: ""
        description: "This is your S3 endpoint url (if you are working with AWS S3, just leave empty)."
        examples:
          - "http://localhost:9000"
      s3_bucket_name:
        title: "S3 Bucket Name"
        type: "string"
        description: "The name of the S3 bucket."
        examples:
          - "airbyte_sync"
      s3_bucket_path:
        description: "Directory under the S3 bucket where data will be written."
        type: "string"
        examples:
          - "data_sync/test"
      s3_bucket_region:
        title: "S3 Bucket Region"
        type: "string"
        default: ""
        description: "The region of the S3 bucket."
        enum:
          - ""
          - "us-east-1"
          - "us-east-2"
          - "us-west-1"
          - "us-west-2"
          - "af-south-1"
          - "ap-east-1"
          - "ap-south-1"
          - "ap-northeast-1"
          - "ap-northeast-2"
          - "ap-northeast-3"
          - "ap-southeast-1"
          - "ap-southeast-2"
          - "ca-central-1"
          - "cn-north-1"
          - "cn-northwest-1"
          - "eu-central-1"
          - "eu-north-1"
          - "eu-south-1"
          - "eu-west-1"
          - "eu-west-2"
          - "eu-west-3"
          - "sa-east-1"
          - "me-south-1"
          - "us-gov-east-1"
          - "us-gov-west-1"
      access_key_id:
        type: "string"
        description: "The access key id to access the S3 bucket. Airbyte requires Read and Write permissions to the given bucket; if not set, Airbyte will rely on Instance Profile."
        title: "S3 Key Id"
        airbyte_secret: true
        examples:
          - "A012345678910EXAMPLE"
      secret_access_key:
        type: "string"
        description: "The corresponding secret to the access key id; if S3 Key Id is set, then S3 Access Key must also be provided."
        title: "S3 Access Key"
        airbyte_secret: true
        examples:
          - "a012345678910ABCDEFGH/AbCdEfGhEXAMPLEKEY"
      format:
        title: "Output Format"
        type: "object"
        description: "Output data format"
        oneOf:
          - title: "Avro: Apache Avro"
            required:
              - "format_type"
              - "compression_codec"
            properties:
              format_type:
                type: "string"
                enum:
                  - "Avro"
                default: "Avro"
              compression_codec:
                title: "Compression Codec"
                description: "The compression algorithm used to compress data. Defaults to no compression."
                type: "object"
                oneOf:
                  - title: "no compression"
                    required:
                      - "codec"
                    properties:
                      codec:
                        type: "string"
                        enum:
                          - "no compression"
                        default: "no compression"
                  - title: "Deflate"
                    required:
                      - "codec"
                      - "compression_level"
                    properties:
                      codec:
                        type: "string"
                        enum:
                          - "Deflate"
                        default: "Deflate"
                      compression_level:
                        title: "Deflate level"
                        description: "0: no compression & fastest, 9: best compression & slowest."
                        type: "integer"
                        default: 0
                        minimum: 0
                        maximum: 9
                  - title: "bzip2"
                    required:
                      - "codec"
                    properties:
                      codec:
                        type: "string"
                        enum:
                          - "bzip2"
                        default: "bzip2"
                  - title: "xz"
                    required:
                      - "codec"
                      - "compression_level"
                    properties:
                      codec:
                        type: "string"
                        enum:
                          - "xz"
                        default: "xz"
                      compression_level:
                        title: "Compression level"
                        description: "See <a href=\"https://commons.apache.org/proper/commons-compress/apidocs/org/apache/commons/compress/compressors/xz/XZCompressorOutputStream.html#XZCompressorOutputStream-java.io.OutputStream-int-\">here</a> for details."
                        type: "integer"
                        default: 6
                        minimum: 0
                        maximum: 9
                  - title: "zstandard"
                    required:
                      - "codec"
                      - "compression_level"
                    properties:
                      codec:
                        type: "string"
                        enum:
                          - "zstandard"
                        default: "zstandard"
                      compression_level:
                        title: "Compression level"
                        description: "Negative levels are 'fast' modes akin to lz4 or snappy, levels above 9 are generally for archival purposes, and levels above 18 use a lot of memory."
                        type: "integer"
                        default: 3
                        minimum: -5
                        maximum: 22
                      include_checksum:
                        title: "Include checksum"
                        description: "If true, include a checksum with each data block."
                        type: "boolean"
                        default: false
                  - title: "snappy"
                    required:
                      - "codec"
                    properties:
                      codec:
                        type: "string"
                        enum:
                          - "snappy"
                        default: "snappy"
              part_size_mb:
                title: "Block Size (MB) for Amazon S3 multipart upload"
                description: "This is the size of a \"Part\" being buffered in memory. It limits the memory usage when writing. Larger values allow uploading bigger files and improve speed, but consume more memory. Allowed values: min=5MB, max=525MB. Default: 5MB."
                type: "integer"
                default: 5
                examples:
                  - 5
          - title: "CSV: Comma-Separated Values"
            required:
              - "format_type"
              - "flattening"
            properties:
              format_type:
                type: "string"
                enum:
                  - "CSV"
                default: "CSV"
              flattening:
                type: "string"
                title: "Normalization (Flattening)"
                description: "Whether the input json data should be normalized (flattened) in the output CSV. Please refer to docs for details."
                default: "No flattening"
                enum:
                  - "No flattening"
                  - "Root level flattening"
              part_size_mb:
                title: "Block Size (MB) for Amazon S3 multipart upload"
                description: "This is the size of a \"Part\" being buffered in memory. It limits the memory usage when writing. Larger values allow uploading bigger files and improve speed, but consume more memory. Allowed values: min=5MB, max=525MB. Default: 5MB."
                type: "integer"
                default: 5
                examples:
                  - 5
          - title: "JSON Lines: newline-delimited JSON"
            required:
              - "format_type"
            properties:
              format_type:
                type: "string"
                enum:
                  - "JSONL"
                default: "JSONL"
              part_size_mb:
                title: "Block Size (MB) for Amazon S3 multipart upload"
                description: "This is the size of a \"Part\" being buffered in memory. It limits the memory usage when writing. Larger values allow uploading bigger files and improve speed, but consume more memory. Allowed values: min=5MB, max=525MB. Default: 5MB."
                type: "integer"
                default: 5
                examples:
                  - 5
          - title: "Parquet: Columnar Storage"
            required:
              - "format_type"
            properties:
              format_type:
                type: "string"
                enum:
                  - "Parquet"
                default: "Parquet"
              compression_codec:
                title: "Compression Codec"
                description: "The compression algorithm used to compress data pages."
                type: "string"
                enum:
                  - "UNCOMPRESSED"
                  - "SNAPPY"
                  - "GZIP"
                  - "LZO"
                  - "BROTLI"
                  - "LZ4"
                  - "ZSTD"
                default: "UNCOMPRESSED"
              block_size_mb:
                title: "Block Size (Row Group Size) (MB)"
                description: "This is the size of a row group being buffered in memory. It limits the memory usage when writing. Larger values will improve the IO when reading, but consume more memory when writing. Default: 128 MB."
                type: "integer"
                default: 128
                examples:
                  - 128
              max_padding_size_mb:
                title: "Max Padding Size (MB)"
                description: "Maximum size allowed as padding to align row groups. This is also the minimum size of a row group. Default: 8 MB."
                type: "integer"
                default: 8
                examples:
                  - 8
              page_size_kb:
                title: "Page Size (KB)"
                description: "The page size is for compression. A block is composed of pages. A page is the smallest unit that must be read fully to access a single record. If this value is too small, the compression will deteriorate. Default: 1024 KB."
                type: "integer"
                default: 1024
                examples:
                  - 1024
              dictionary_page_size_kb:
                title: "Dictionary Page Size (KB)"
                description: "There is one dictionary page per column per row group when dictionary encoding is used. The dictionary page size works like the page size but for the dictionary. Default: 1024 KB."
                type: "integer"
                default: 1024
                examples:
                  - 1024
              dictionary_encoding:
                title: "Dictionary Encoding"
                description: "Default: true."
                type: "boolean"
                default: true
  supportsIncremental: true
  supported_destination_sync_modes:
    - "overwrite"
    - "append"
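The `oneOf` blocks above are what enforce the "pick one valid structure" comments in the generated configuration. A minimal sketch of checking a candidate configuration against this spec with the `jsonschema` package; the spec path and loading are assumptions:

```python
import yaml
from jsonschema import Draft7Validator

# Load the connector spec shown above (path is an assumption).
with open("destination-s3-spec.yaml") as f:
    spec = yaml.safe_load(f)

schema = spec["spec"]["connectionSpecification"]

candidate = {
    "s3_bucket_name": "airbyte_sync",
    "s3_bucket_path": "data_sync/test",
    "s3_bucket_region": "eu-west-1",
    # Exactly one of the oneOf structures: Avro with a zstandard codec.
    "format": {
        "format_type": "Avro",
        "compression_codec": {"codec": "zstandard", "compression_level": 3},
    },
}

# Draft7Validator matches the $schema declared in the spec.
errors = sorted(Draft7Validator(schema).iter_errors(candidate), key=str)
for error in errors:
    print(error.message)
print(f"{len(errors)} validation errors")
```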
@@ -1,40 +0,0 @@
# Configuration for airbyte/source-postgres
# Documentation about this connector can be found at https://docs.airbyte.com/integrations/sources/postgres
resource_name: "my_postgres_source"
definition_type: source
definition_id: foobar
definition_image: airbyte/source-postgres
definition_version: 0.4.4

# EDIT THE CONFIGURATION BELOW!
configuration:
  host: # REQUIRED | string | Hostname of the database.
  port: 5432 # REQUIRED | integer | Port of the database. | Example: 5432
  database: # REQUIRED | string | Name of the database.
  schemas: ["public"] # OPTIONAL | array | The list of schemas to sync from. Defaults to user. Case sensitive.
  username: # REQUIRED | string | Username to use to access the database.
  password: ${PASSWORD} # SECRET (please store in environment variables) | OPTIONAL | string | Password associated with the username.
  ssl: # OPTIONAL | boolean | Encrypt client/server communications for increased security.
  replication_method:
    ## -------- Pick one valid structure among the examples below: --------
    method: "Standard" # REQUIRED | string
    ## -------- Another valid structure for replication_method: --------
    # method: "CDC" # REQUIRED | string
    # plugin: "pgoutput" # OPTIONAL | string | A logical decoding plug-in installed on the PostgreSQL server. The `pgoutput` plug-in is used by default. If the replicated table contains a lot of big jsonb values it is recommended to use the `wal2json` plug-in. For more information about the `wal2json` plug-in read the <a href="https://docs.airbyte.com/integrations/sources/postgres">Postgres Source</a> docs.
    # replication_slot: # REQUIRED | string | A plug-in logical replication slot.
    # publication: # REQUIRED | string | A Postgres publication used for consuming changes.
  tunnel_method:
    ## -------- Pick one valid structure among the examples below: --------
    tunnel_method: "NO_TUNNEL" # REQUIRED | string | No ssh tunnel needed to connect to database
    ## -------- Another valid structure for tunnel_method: --------
    # tunnel_method: "SSH_KEY_AUTH" # REQUIRED | string | Connect through a jump server tunnel host using username and ssh key
    # tunnel_host: # REQUIRED | string | Hostname of the jump server host that allows inbound ssh tunnel.
    # tunnel_port: 22 # REQUIRED | integer | Port on the proxy/jump server that accepts inbound ssh connections. | Example: 22
    # tunnel_user: # REQUIRED | string | OS-level username for logging into the jump server host.
    # ssh_key: ${SSH_KEY} # SECRET (please store in environment variables) | REQUIRED | string | OS-level user account ssh key credentials in RSA PEM format ( created with ssh-keygen -t rsa -m PEM -f myuser_rsa )
    ## -------- Another valid structure for tunnel_method: --------
    # tunnel_method: "SSH_PASSWORD_AUTH" # REQUIRED | string | Connect through a jump server tunnel host using username and password authentication
    # tunnel_host: # REQUIRED | string | Hostname of the jump server host that allows inbound ssh tunnel.
    # tunnel_port: 22 # REQUIRED | integer | Port on the proxy/jump server that accepts inbound ssh connections. | Example: 22
    # tunnel_user: # REQUIRED | string | OS-level username for logging into the jump server host
    # tunnel_user_password: ${TUNNEL_USER_PASSWORD} # SECRET (please store in environment variables) | REQUIRED | string | OS-level password for logging into the jump server host
@@ -1,238 +0,0 @@
dockerImage: "airbyte/source-postgres:0.4.4"
spec:
  documentationUrl: "https://docs.airbyte.com/integrations/sources/postgres"
  connectionSpecification:
    $schema: "http://json-schema.org/draft-07/schema#"
    title: "Postgres Source Spec"
    type: "object"
    required:
      - "host"
      - "port"
      - "database"
      - "username"
    additionalProperties: false
    properties:
      host:
        title: "Host"
        description: "Hostname of the database."
        type: "string"
        order: 0
      port:
        title: "Port"
        description: "Port of the database."
        type: "integer"
        minimum: 0
        maximum: 65536
        default: 5432
        examples:
          - "5432"
        order: 1
      database:
        title: "DB Name"
        description: "Name of the database."
        type: "string"
        order: 2
      schemas:
        title: "Schemas"
        description: "The list of schemas to sync from. Defaults to user. Case sensitive."
        type: "array"
        items:
          type: "string"
        minItems: 0
        uniqueItems: true
        default:
          - "public"
        order: 3
      username:
        title: "User"
        description: "Username to use to access the database."
        type: "string"
        order: 4
      password:
        title: "Password"
        description: "Password associated with the username."
        type: "string"
        airbyte_secret: true
        order: 5
      ssl:
        title: "Connect using SSL"
        description: "Encrypt client/server communications for increased security."
        type: "boolean"
        default: false
        order: 6
      replication_method:
        type: "object"
        title: "Replication Method"
        description: "Replication method to use for extracting data from the database."
        order: 7
        oneOf:
          - title: "Standard"
            additionalProperties: false
            description: "Standard replication requires no setup on the DB side but will not be able to represent deletions incrementally."
            required:
              - "method"
            properties:
              method:
                type: "string"
                const: "Standard"
                enum:
                  - "Standard"
                default: "Standard"
                order: 0
          - title: "Logical Replication (CDC)"
            additionalProperties: false
            description: "Logical replication uses the Postgres write-ahead log (WAL) to detect inserts, updates, and deletes. This needs to be configured on the source database itself. Only available on Postgres 10 and above. Read the <a href=\"https://docs.airbyte.com/integrations/sources/postgres\">Postgres Source</a> docs for more information."
            required:
              - "method"
              - "replication_slot"
              - "publication"
            properties:
              method:
                type: "string"
                const: "CDC"
                enum:
                  - "CDC"
                default: "CDC"
                order: 0
              plugin:
                type: "string"
                title: "Plugin"
                description: "A logical decoding plug-in installed on the PostgreSQL server. The `pgoutput` plug-in is used by default. If the replicated table contains a lot of big jsonb values it is recommended to use the `wal2json` plug-in. For more information about the `wal2json` plug-in read the <a href=\"https://docs.airbyte.com/integrations/sources/postgres\">Postgres Source</a> docs."
                enum:
                  - "pgoutput"
                  - "wal2json"
                default: "pgoutput"
                order: 1
              replication_slot:
                type: "string"
                title: "Replication Slot"
                description: "A plug-in logical replication slot."
                order: 2
              publication:
                type: "string"
                title: "Publication"
                description: "A Postgres publication used for consuming changes."
                order: 3
      tunnel_method:
        type: "object"
        title: "SSH Tunnel Method"
        description: "Whether to initiate an SSH tunnel before connecting to the database, and if so, which kind of authentication to use."
        oneOf:
          - title: "No Tunnel"
            required:
              - "tunnel_method"
            properties:
              tunnel_method:
                description: "No ssh tunnel needed to connect to database"
                type: "string"
                const: "NO_TUNNEL"
                order: 0
          - title: "SSH Key Authentication"
            required:
              - "tunnel_method"
              - "tunnel_host"
              - "tunnel_port"
              - "tunnel_user"
              - "ssh_key"
            properties:
              tunnel_method:
                description: "Connect through a jump server tunnel host using username and ssh key"
                type: "string"
                const: "SSH_KEY_AUTH"
                order: 0
              tunnel_host:
                title: "SSH Tunnel Jump Server Host"
                description: "Hostname of the jump server host that allows inbound ssh tunnel."
                type: "string"
                order: 1
              tunnel_port:
                title: "SSH Connection Port"
                description: "Port on the proxy/jump server that accepts inbound ssh connections."
                type: "integer"
                minimum: 0
                maximum: 65536
                default: 22
                examples:
                  - "22"
                order: 2
              tunnel_user:
                title: "SSH Login Username"
                description: "OS-level username for logging into the jump server host."
                type: "string"
                order: 3
              ssh_key:
                title: "SSH Private Key"
                description: "OS-level user account ssh key credentials in RSA PEM format ( created with ssh-keygen -t rsa -m PEM -f myuser_rsa )"
                type: "string"
                airbyte_secret: true
                multiline: true
                order: 4
          - title: "Password Authentication"
            required:
              - "tunnel_method"
              - "tunnel_host"
              - "tunnel_port"
              - "tunnel_user"
              - "tunnel_user_password"
            properties:
              tunnel_method:
                description: "Connect through a jump server tunnel host using username and password authentication"
                type: "string"
                const: "SSH_PASSWORD_AUTH"
                order: 0
              tunnel_host:
                title: "SSH Tunnel Jump Server Host"
                description: "Hostname of the jump server host that allows inbound ssh tunnel."
                type: "string"
                order: 1
              tunnel_port:
                title: "SSH Connection Port"
                description: "Port on the proxy/jump server that accepts inbound ssh connections."
                type: "integer"
                minimum: 0
                maximum: 65536
                default: 22
                examples:
                  - "22"
                order: 2
              tunnel_user:
                title: "SSH Login Username"
                description: "OS-level username for logging into the jump server host"
                type: "string"
                order: 3
              tunnel_user_password:
                title: "Password"
                description: "OS-level password for logging into the jump server host"
                type: "string"
                airbyte_secret: true
                order: 4
  supported_destination_sync_modes: []
@@ -1,33 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import os

import pytest
import yaml
from octavia_cli.generate.commands import generate_source_or_destination

pytestmark = pytest.mark.integration


@pytest.mark.parametrize(
    ("definition_type, definition_id, resource_name"),
    [
        ("source", "6371b14b-bc68-4236-bfbd-468e8df8e968", "test_generate_source"),
        ("destination", "22f6c74f-5699-40ff-833c-4a879ea40133", "test_generate_destination"),
    ],
)
def test_generate_source_or_destination(
    octavia_tmp_project_directory, api_client, workspace_id, definition_type, definition_id, resource_name
):
    current_path = os.getcwd()
    os.chdir(octavia_tmp_project_directory)
    generate_source_or_destination(definition_type, api_client, workspace_id, definition_id, resource_name)
    expected_output_path = f"{definition_type}s/{resource_name}/configuration.yaml"
    with open(expected_output_path, "r") as f:
        parsed_yaml = yaml.safe_load(f)
    assert parsed_yaml["resource_name"] == resource_name
    assert parsed_yaml["definition_type"] == definition_type
    assert parsed_yaml["definition_id"] == definition_id
    os.chdir(current_path)
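For reference, the same helper can be driven outside pytest. A minimal sketch, assuming a reachable Airbyte instance and the usual OpenAPI-generated client setup; the host URL, workspace id, project directory, and resource name below are placeholders, not values from this commit:

```python
import os

import airbyte_api_client
from octavia_cli.generate.commands import generate_source_or_destination

# Hypothetical local instance; adjust host and workspace_id for your deployment.
configuration = airbyte_api_client.Configuration(host="http://localhost:8000/api")
api_client = airbyte_api_client.ApiClient(configuration)
workspace_id = "00000000-0000-0000-0000-000000000000"  # placeholder

os.chdir("my_octavia_project")  # an initialized octavia project directory
generate_source_or_destination(
    "source",
    api_client,
    workspace_id,
    "6371b14b-bc68-4236-bfbd-468e8df8e968",  # source-pokeapi definition id from the test above
    "my_pokeapi_source",
)
```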
@@ -1,2 +0,0 @@
**/state_*.yaml
**/updated_*.yaml
@@ -1,367 +0,0 @@
# Configuration for connection poke_to_pg
definition_type: connection
resource_name: poke_to_pg_to_import
source_configuration_path: sources/poke_to_import/configuration.yaml
destination_configuration_path: destinations/postgres_to_import/configuration.yaml

# EDIT THE CONFIGURATION BELOW!
configuration:
  status: active # REQUIRED | string | Allowed values: active, inactive, deprecated
  namespace_definition: source # OPTIONAL | string | Allowed values: source, destination, customformat
  namespace_format: "${SOURCE_NAMESPACE}" # OPTIONAL | string | Used when namespaceDefinition is 'customformat'. If blank then behaves like namespaceDefinition = 'destination'. If "${SOURCE_NAMESPACE}" then behaves like namespaceDefinition = 'source'.
  prefix: "" # REQUIRED | Prefix that will be prepended to the name of each stream when it is written to the destination
  resource_requirements: # OPTIONAL | object | Resource requirements to run workers (blank for unbounded allocations)
    cpu_limit: "" # OPTIONAL
    cpu_request: "" # OPTIONAL
    memory_limit: "" # OPTIONAL
    memory_request: "" # OPTIONAL
  schedule_type: basic
  schedule_data:
    basic_schedule:
      time_unit: hours # REQUIRED | string | Allowed values: minutes, hours, days, weeks, months
      units: 1 # REQUIRED | integer
  operations:
    - name: "Normalization"
      operator_configuration:
        normalization:
          option: "basic"
        operator_type: "normalization"
  sync_catalog: # OPTIONAL | object | 🚨 ONLY edit streams.config, streams.stream should not be edited as the schema cannot be changed.
    streams:
      - config:
          alias_name: pokemon
          cursor_field: []
          destination_sync_mode: append
          primary_key: []
          selected: true
          sync_mode: full_refresh
        stream:
          default_cursor_field: []
          json_schema:
            $schema: http://json-schema.org/draft-07/schema#
            properties:
              abilities:
                items:
                  properties:
                    ability:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                    is_hidden:
                      type:
                        - "null"
                        - boolean
                    slot:
                      type:
                        - "null"
                        - integer
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              base_experience:
                type:
                  - "null"
                  - integer
              forms:
                items:
                  properties:
                    name:
                      type:
                        - "null"
                        - string
                    url:
                      type:
                        - "null"
                        - string
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              game_indices:
                items:
                  properties:
                    game_index:
                      type:
                        - "null"
                        - integer
                    version:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              height:
                type:
                  - "null"
                  - integer
              held_items:
                items:
                  properties:
                    item:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                    version_details:
                      items:
                        properties:
                          rarity:
                            type:
                              - "null"
                              - integer
                          version:
                            properties:
                              name:
                                type:
                                  - "null"
                                  - string
                              url:
                                type:
                                  - "null"
                                  - string
                            type:
                              - "null"
                              - object
                        type:
                          - "null"
                          - object
                      type:
                        - "null"
                        - array
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              id:
                type:
                  - "null"
                  - integer
              "is_default ":
                type:
                  - "null"
                  - boolean
              location_area_encounters:
                type:
                  - "null"
                  - string
              moves:
                items:
                  properties:
                    move:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                    version_group_details:
                      items:
                        properties:
                          level_learned_at:
                            type:
                              - "null"
                              - integer
                          move_learn_method:
                            properties:
                              name:
                                type:
                                  - "null"
                                  - string
                              url:
                                type:
                                  - "null"
                                  - string
                            type:
                              - "null"
                              - object
                          version_group:
                            properties:
                              name:
                                type:
                                  - "null"
                                  - string
                              url:
                                type:
                                  - "null"
                                  - string
                            type:
                              - "null"
                              - object
                        type:
                          - "null"
                          - object
                      type:
                        - "null"
                        - array
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              name:
                type:
                  - "null"
                  - string
              order:
                type:
                  - "null"
                  - integer
              species:
                properties:
                  name:
                    type:
                      - "null"
                      - string
                  url:
                    type:
                      - "null"
                      - string
                type:
                  - "null"
                  - object
              sprites:
                properties:
                  back_default:
                    type:
                      - "null"
                      - string
                  back_female:
                    type:
                      - "null"
                      - string
                  back_shiny:
                    type:
                      - "null"
                      - string
                  back_shiny_female:
                    type:
                      - "null"
                      - string
                  front_default:
                    type:
                      - "null"
                      - string
                  front_female:
                    type:
                      - "null"
                      - string
                  front_shiny:
                    type:
                      - "null"
                      - string
                  front_shiny_female:
                    type:
                      - "null"
                      - string
                type:
                  - "null"
                  - object
              stats:
                items:
                  properties:
                    base_stat:
                      type:
                        - "null"
                        - integer
                    effort:
                      type:
                        - "null"
                        - integer
                    stat:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              types:
                items:
                  properties:
                    slot:
                      type:
                        - "null"
                        - integer
                    type:
                      properties:
                        name:
                          type:
                            - "null"
                            - string
                        url:
                          type:
                            - "null"
                            - string
                      type:
                        - "null"
                        - object
                  type:
                    - "null"
                    - object
                type:
                  - "null"
                  - array
              weight:
                type:
                  - "null"
                  - integer
            type: object
          name: pokemon
          source_defined_primary_key: []
          supported_sync_modes:
            - full_refresh
@@ -1,32 +0,0 @@
# Configuration for airbyte/destination-postgres
# Documentation about this connector can be found at https://docs.airbyte.io/integrations/destinations/postgres
resource_name: postgres_to_import
definition_type: destination
definition_id: 25c5221d-dce2-4163-ade9-739ef790f503
definition_image: airbyte/destination-postgres
definition_version: 0.3.15

# EDIT THE CONFIGURATION BELOW!
configuration:
  ssl: False # OPTIONAL | boolean | Encrypt data using SSL.
  host: localhost # REQUIRED | string | Hostname of the database.
  port: 5433 # REQUIRED | integer | Port of the database. | Example: 5432
  schema: "public" # REQUIRED | string | The default schema tables are written to if the source does not specify a namespace. The usual value for this field is "public". | Example: public
  database: postgres # REQUIRED | string | Name of the database.
  password: my_secret_password # SECRET (please store in environment variables) | OPTIONAL | string | Password associated with the username.
  username: postgres # REQUIRED | string | Username to use to access the database.
  tunnel_method:
    ## -------- Pick one valid structure among the examples below: --------
    tunnel_method: "NO_TUNNEL" # REQUIRED | string | No ssh tunnel needed to connect to database
    ## -------- Another valid structure for tunnel_method: --------
    # ssh_key: ${SSH_KEY} # SECRET (please store in environment variables) | REQUIRED | string | OS-level user account ssh key credentials in RSA PEM format ( created with ssh-keygen -t rsa -m PEM -f myuser_rsa )
    # tunnel_host: # REQUIRED | string | Hostname of the jump server host that allows inbound ssh tunnel.
    # tunnel_port: 22 # REQUIRED | integer | Port on the proxy/jump server that accepts inbound ssh connections. | Example: 22
    # tunnel_user: # REQUIRED | string | OS-level username for logging into the jump server host.
    # tunnel_method: "SSH_KEY_AUTH" # REQUIRED | string | Connect through a jump server tunnel host using username and ssh key
    ## -------- Another valid structure for tunnel_method: --------
    # tunnel_host: # REQUIRED | string | Hostname of the jump server host that allows inbound ssh tunnel.
    # tunnel_port: 22 # REQUIRED | integer | Port on the proxy/jump server that accepts inbound ssh connections. | Example: 22
    # tunnel_user: # REQUIRED | string | OS-level username for logging into the jump server host
    # tunnel_method: "SSH_PASSWORD_AUTH" # REQUIRED | string | Connect through a jump server tunnel host using username and password authentication
    # tunnel_user_password: ${TUNNEL_USER_PASSWORD} # SECRET (please store in environment variables) | REQUIRED | string | OS-level password for logging into the jump server host
@@ -1,9 +0,0 @@
resource_name: poke_to_import
definition_type: source
definition_id: 6371b14b-bc68-4236-bfbd-468e8df8e968
definition_image: airbyte/source-pokeapi
definition_version: 0.1.4

# EDIT THE CONFIGURATION BELOW!
configuration:
  pokemon_name: ditto
@@ -1,159 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#


import glob
import os
import shutil
from distutils.dir_util import copy_tree
from pathlib import Path
from unittest import mock

import pytest
from airbyte_api_client.api import connection_api
from airbyte_api_client.model.connection_id_request_body import ConnectionIdRequestBody
from click.testing import CliRunner
from octavia_cli._import.commands import all as octavia_import_all
from octavia_cli._import.commands import connection as octavia_import_connection
from octavia_cli._import.commands import destination as octavia_import_destination
from octavia_cli._import.commands import source as octavia_import_source
from octavia_cli.apply.commands import apply as octavia_apply
from octavia_cli.apply.resources import ResourceState
from octavia_cli.apply.resources import factory as resource_factory

pytestmark = pytest.mark.integration
click_runner = CliRunner()


@pytest.fixture(scope="module")
def context_object(api_client, workspace_id):
    return {"TELEMETRY_CLIENT": mock.MagicMock(), "PROJECT_IS_INITIALIZED": True, "API_CLIENT": api_client, "WORKSPACE_ID": workspace_id}


@pytest.fixture(scope="module")
def initialized_project_directory(context_object):
    """This fixture initializes a temporary local directory with configuration.yaml files copied from ./octavia_project_to_migrate.
    It runs octavia apply on these configurations and then removes the local yaml files.
    At the end of this function we have remote resources on our Airbyte instance, but they are not managed by octavia because of the file deletion.
    The fixture returns the previously instantiated source, destination and connection resources so the tests below can check that the import command writes configuration to the right location.
    """
    cwd = os.getcwd()
    dir_path = f"{os.path.dirname(__file__)}/octavia_test_project"
    copy_tree(f"{os.path.dirname(__file__)}/octavia_project_to_migrate", dir_path)
    os.chdir(dir_path)

    result = click_runner.invoke(octavia_apply, obj=context_object)
    assert result.exit_code == 0
    for configuration_file in glob.glob("./**/configuration.yaml", recursive=True):
        resource = resource_factory(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], configuration_file)
        if resource.resource_type == "source":
            source_id, source_expected_configuration_path, source_expected_state_path = (
                resource.resource_id,
                resource.configuration_path,
                resource.state.path,
            )
        if resource.resource_type == "destination":
            destination_id, destination_expected_configuration_path, destination_expected_state_path = (
                resource.resource_id,
                resource.configuration_path,
                ResourceState._get_path_from_configuration_and_workspace_id(resource.configuration_path, context_object["WORKSPACE_ID"]),
            )
        if resource.resource_type == "connection":
            connection_id, connection_configuration_path, connection_expected_state_path = (
                resource.resource_id,
                resource.configuration_path,
                ResourceState._get_path_from_configuration_and_workspace_id(resource.configuration_path, context_object["WORKSPACE_ID"]),
            )
        os.remove(configuration_file)
        os.remove(resource.state.path)
    yield (source_id, source_expected_configuration_path, source_expected_state_path), (
        destination_id,
        destination_expected_configuration_path,
        destination_expected_state_path,
    ), (connection_id, connection_configuration_path, connection_expected_state_path)
    os.chdir(cwd)
    shutil.rmtree(dir_path)


@pytest.fixture(scope="module")
def expected_source(initialized_project_directory):
    yield initialized_project_directory[0]


@pytest.fixture(scope="module")
def expected_destination(initialized_project_directory):
    yield initialized_project_directory[1]


@pytest.fixture(scope="module")
def expected_connection(initialized_project_directory, context_object, expected_source, expected_destination):
    connection_id, connection_configuration_path, connection_expected_state_path = initialized_project_directory[2]
    yield connection_id, connection_configuration_path, connection_expected_state_path
    # To delete the connection we have to create a ConnectionApi instance, because the WebBackendApi instance does not have a delete endpoint
    connection = resource_factory(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], connection_configuration_path)
    connection_api_instance = connection_api.ConnectionApi(context_object["API_CLIENT"])
    connection_api_instance.delete_connection(
        ConnectionIdRequestBody(
            connection_id=connection.resource_id,
        )
    )
    # Delete the source and destination after the connection so the connection does not become deprecated
    source = resource_factory(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], expected_source[1])
    source.api_instance.delete_source(source.get_payload)
    destination = resource_factory(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], expected_destination[1])
    destination.api_instance.delete_destination(destination.get_payload)


def test_import_source(expected_source, context_object):
    source_id, expected_configuration_path, expected_state_path = expected_source
    result = click_runner.invoke(octavia_import_source, source_id, obj=context_object)
    assert result.exit_code == 0
    assert Path(expected_configuration_path).is_file() and Path(expected_state_path).is_file()
    source = resource_factory(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], expected_configuration_path)
    assert source.was_created  # Check that the remote resource is managed by octavia and exists remotely
    assert source.get_diff_with_remote_resource() == ""
    assert source.state.path in str(expected_state_path)


def test_import_destination(expected_destination, context_object):
    destination_id, expected_configuration_path, expected_state_path = expected_destination
    result = click_runner.invoke(octavia_import_destination, destination_id, obj=context_object)
    assert result.exit_code == 0
    assert Path(expected_configuration_path).is_file() and Path(expected_state_path).is_file()
    destination = resource_factory(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], expected_configuration_path)
    assert destination.was_created  # Check that the remote resource is managed by octavia and exists remotely
    assert destination.get_diff_with_remote_resource() == ""
    assert destination.state.path in str(expected_state_path)
    assert destination.configuration["password"] == "**********"


def test_import_connection(expected_connection, context_object):
    connection_id, expected_configuration_path, expected_state_path = expected_connection
    result = click_runner.invoke(octavia_import_connection, connection_id, obj=context_object)
    assert result.exit_code == 0
    assert Path(expected_configuration_path).is_file() and Path(expected_state_path).is_file()
    connection = resource_factory(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], expected_configuration_path)
    assert connection.was_created  # Check that the remote resource is managed by octavia and exists remotely
    assert connection.get_diff_with_remote_resource() == ""
    assert connection.state.path in str(expected_state_path)


def test_import_all(expected_source, expected_destination, expected_connection, context_object):
    _, source_expected_configuration_path, source_expected_state_path = expected_source
    _, destination_expected_configuration_path, destination_expected_state_path = expected_destination
    _, connection_expected_configuration_path, connection_expected_state_path = expected_connection
    paths_to_first_delete_and_then_check_existence = [
        source_expected_configuration_path,
        source_expected_state_path,
        destination_expected_configuration_path,
        destination_expected_state_path,
        connection_expected_configuration_path,
        connection_expected_state_path,
    ]
    for path in paths_to_first_delete_and_then_check_existence:
        os.remove(path)
    result = click_runner.invoke(octavia_import_all, obj=context_object)
    assert result.exit_code == 0
    for path in paths_to_first_delete_and_then_check_existence:
        assert os.path.exists(path)
@@ -1,30 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#


import pytest
from octavia_cli.list.listings import Connections, DestinationConnectorsDefinitions, Destinations, SourceConnectorsDefinitions, Sources

pytestmark = pytest.mark.integration


@pytest.mark.parametrize("ConnectorsDefinitionListing", [SourceConnectorsDefinitions, DestinationConnectorsDefinitions])
def test_list_connectors(api_client, ConnectorsDefinitionListing):
    connector_definitions = ConnectorsDefinitionListing(api_client)
    listing = connector_definitions.get_listing()
    assert len(listing) > 0
    assert len(listing[0]) == len(ConnectorsDefinitionListing.fields_to_display)
    assert str(listing)


@pytest.mark.parametrize("WorkspaceListing", [Sources, Destinations, Connections])
def test_list_workspace_resource(api_client, source, destination, connection, workspace_id, WorkspaceListing):
    assert source.was_created
    assert destination.was_created
    assert connection.was_created
    connector_definitions = WorkspaceListing(api_client, workspace_id)
    listing = connector_definitions.get_listing()
    assert len(listing) >= 1
    assert len(listing[0]) == len(WorkspaceListing.fields_to_display)
    assert str(listing)
@@ -1,3 +0,0 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#
@@ -1,3 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#
@@ -1,180 +0,0 @@
|
||||
#
|
||||
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
|
||||
#
|
||||
|
||||
import json
|
||||
from typing import List, Type, Union
|
||||
|
||||
import airbyte_api_client
|
||||
import click
|
||||
from octavia_cli.apply import resources
|
||||
from octavia_cli.base_commands import OctaviaCommand
|
||||
from octavia_cli.check_context import requires_init
|
||||
from octavia_cli.generate import definitions, renderers
|
||||
from octavia_cli.get.commands import get_json_representation
|
||||
from octavia_cli.get.resources import Connection as UnmanagedConnection
|
||||
from octavia_cli.get.resources import Destination as UnmanagedDestination
|
||||
from octavia_cli.get.resources import Source as UnmanagedSource
|
||||
from octavia_cli.list.listings import Connections as UnmanagedConnections
|
||||
from octavia_cli.list.listings import Destinations as UnmanagedDestinations
|
||||
from octavia_cli.list.listings import Sources as UnmanagedSources
|
||||
|
||||
|
||||
class MissingResourceDependencyError(click.UsageError):
|
||||
pass
|
||||
|
||||
|
||||
def build_help_message(resource_type: str) -> str:
|
||||
"""Helper function to build help message consistently for all the commands in this module.
|
||||
Args:
|
||||
resource_type (str): source, destination or connection
|
||||
Returns:
|
||||
str: The generated help message.
|
||||
"""
|
||||
return f"Import an existing {resource_type} to manage it with octavia-cli."
|
||||


def import_source_or_destination(
    api_client: airbyte_api_client.ApiClient,
    workspace_id: str,
    ResourceClass: Type[Union[UnmanagedSource, UnmanagedDestination]],
    resource_to_get: str,
) -> str:
    """Helper function to import sources & destinations.

    Args:
        api_client (airbyte_api_client.ApiClient): the Airbyte API client.
        workspace_id (str): current Airbyte workspace id.
        ResourceClass (Union[UnmanagedSource, UnmanagedDestination]): the Airbyte Resource Class.
        resource_to_get (str): the name or ID of the resource in the current Airbyte workspace.

    Returns:
        str: The generated import message.
    """
    remote_configuration = json.loads(get_json_representation(api_client, workspace_id, ResourceClass, resource_to_get))

    resource_type = ResourceClass.__name__.lower()

    definition = definitions.factory(resource_type, api_client, workspace_id, remote_configuration[f"{resource_type}_definition_id"])

    renderer = renderers.ConnectorSpecificationRenderer(remote_configuration["name"], definition)

    new_configuration_path = renderer.import_configuration(project_path=".", configuration=remote_configuration["connection_configuration"])
    managed_resource, state = resources.factory(api_client, workspace_id, new_configuration_path).manage(
        remote_configuration[f"{resource_type}_id"]
    )
    message = f"✅ - Imported {resource_type} {managed_resource.name} in {new_configuration_path}. State stored in {state.path}"
    click.echo(click.style(message, fg="green"))
    message = f"⚠️ - Please update any secrets stored in {new_configuration_path}"
    click.echo(click.style(message, fg="yellow"))


def import_connection(
    api_client: airbyte_api_client.ApiClient,
    workspace_id: str,
    resource_to_get: str,
) -> str:
    """Helper function to import a connection.

    Args:
        api_client (airbyte_api_client.ApiClient): the Airbyte API client.
        workspace_id (str): current Airbyte workspace id.
        resource_to_get (str): the name or ID of the resource in the current Airbyte workspace.

    Returns:
        str: The generated import message.
    """
    remote_configuration = json.loads(get_json_representation(api_client, workspace_id, UnmanagedConnection, resource_to_get))
    # Since #15253 "schedule" is deprecated
    remote_configuration.pop("schedule", None)
    source_name, destination_name = remote_configuration["source"]["name"], remote_configuration["destination"]["name"]
    source_configuration_path = renderers.ConnectorSpecificationRenderer.get_output_path(
        project_path=".", definition_type="source", resource_name=source_name
    )

    destination_configuration_path = renderers.ConnectorSpecificationRenderer.get_output_path(
        project_path=".", definition_type="destination", resource_name=destination_name
    )
    if not source_configuration_path.is_file():
        raise MissingResourceDependencyError(
            f"The source {source_name} is not managed by octavia-cli, please import and apply it before importing your connection."
        )
    elif not destination_configuration_path.is_file():
        raise MissingResourceDependencyError(
            f"The destination {destination_name} is not managed by octavia-cli, please import and apply it before importing your connection."
        )
    else:
        source = resources.factory(api_client, workspace_id, source_configuration_path)
        destination = resources.factory(api_client, workspace_id, destination_configuration_path)
        if not source.was_created:
            raise resources.NonExistingResourceError(
                f"The source defined at {source_configuration_path} does not exist. Please run octavia apply before creating this connection."
            )
        if not destination.was_created:
            raise resources.NonExistingResourceError(
                f"The destination defined at {destination_configuration_path} does not exist. Please run octavia apply before creating this connection."
            )

        connection_name, connection_id = remote_configuration["name"], remote_configuration["connection_id"]
        connection_renderer = renderers.ConnectionRenderer(connection_name, source, destination)
        new_configuration_path = connection_renderer.import_configuration(".", remote_configuration)
        managed_resource, state = resources.factory(api_client, workspace_id, new_configuration_path).manage(connection_id)
        message = f"✅ - Imported connection {managed_resource.name} in {new_configuration_path}. State stored in {state.path}"
        click.echo(click.style(message, fg="green"))


@click.group(
    "import",
    help=f'{build_help_message("source, destination or connection")}. ID or name can be used as argument. Example: \'octavia import source "My Pokemon source"\' or \'octavia import source cb5413b2-4159-46a2-910a-dc282a439d2d\'',
)
@click.pass_context
def _import(ctx: click.Context):  # pragma: no cover
    pass


@_import.command(cls=OctaviaCommand, name="source", help=build_help_message("source"))
@click.argument("resource", type=click.STRING)
@click.pass_context
@requires_init
def source(ctx: click.Context, resource: str):
    click.echo(import_source_or_destination(ctx.obj["API_CLIENT"], ctx.obj["WORKSPACE_ID"], UnmanagedSource, resource))


@_import.command(cls=OctaviaCommand, name="destination", help=build_help_message("destination"))
@click.argument("resource", type=click.STRING)
@click.pass_context
@requires_init
def destination(ctx: click.Context, resource: str):
    click.echo(import_source_or_destination(ctx.obj["API_CLIENT"], ctx.obj["WORKSPACE_ID"], UnmanagedDestination, resource))


@_import.command(cls=OctaviaCommand, name="connection", help=build_help_message("connection"))
@click.argument("resource", type=click.STRING)
@click.pass_context
@requires_init
def connection(ctx: click.Context, resource: str):
    click.echo(import_connection(ctx.obj["API_CLIENT"], ctx.obj["WORKSPACE_ID"], resource))


@_import.command(cls=OctaviaCommand, name="all", help=build_help_message("all"))
@click.pass_context
@requires_init
def all(ctx: click.Context):
    api_client, workspace_id = ctx.obj["API_CLIENT"], ctx.obj["WORKSPACE_ID"]
    for _, _, resource_id in UnmanagedSources(api_client, workspace_id).get_listing():
        import_source_or_destination(api_client, workspace_id, UnmanagedSource, resource_id)
    for _, _, resource_id in UnmanagedDestinations(api_client, workspace_id).get_listing():
        import_source_or_destination(api_client, workspace_id, UnmanagedDestination, resource_id)
    for _, resource_id, _, _, _ in UnmanagedConnections(api_client, workspace_id).get_listing():
        import_connection(api_client, workspace_id, resource_id)


AVAILABLE_COMMANDS: List[click.Command] = [source, destination, connection]


def add_commands_to_list():
    for command in AVAILABLE_COMMANDS:
        _import.add_command(command)


add_commands_to_list()
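
A minimal sketch (not part of the removed file) of driving these import helpers programmatically, assuming an `airbyte_api_client.ApiClient` pointed at a local Airbyte instance and an initialized octavia project as the working directory; the host and workspace id below are placeholders:

```python
import airbyte_api_client

# Hypothetical client setup: the generated airbyte_api_client exposes
# Configuration and ApiClient like other OpenAPI-generated Python clients.
configuration = airbyte_api_client.Configuration(host="http://localhost:8000/api")
api_client = airbyte_api_client.ApiClient(configuration)
workspace_id = "00000000-0000-0000-0000-000000000000"  # placeholder

# Roughly equivalent to `octavia import source "My Pokemon source"`: renders a
# local YAML configuration for the existing remote source and stores a state file.
import_source_or_destination(api_client, workspace_id, UnmanagedSource, "My Pokemon source")
```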
@@ -1,106 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

from dataclasses import dataclass
from typing import List, Optional, Tuple

import airbyte_api_client
import click
import yaml

from .apply.yaml_loaders import EnvVarLoader
from .init.commands import API_HTTP_HEADERS_TARGET_PATH


class InvalidApiHttpHeadersFileError(click.exceptions.ClickException):
    pass


@dataclass
class ApiHttpHeader:
    name: str
    value: str

    def __post_init__(self):
        try:
            assert isinstance(self.name, str) and self.name
            assert isinstance(self.value, str) and self.value
        except AssertionError:
            raise AttributeError("Header name and value must be non-empty strings.")
        self.name = self.name.strip()
        self.value = self.value.strip()


def deserialize_file_based_headers(header_configuration_path: str) -> List[ApiHttpHeader]:
    """Parse API HTTP headers declared in a YAML file to a list of ApiHttpHeaders.

    Args:
        header_configuration_path (str): Path to the YAML file where API HTTP headers are declared.

    Raises:
        InvalidApiHttpHeadersFileError: Raised if the YAML structure is not valid.

    Returns:
        List[ApiHttpHeader]: List of HTTP headers parsed from the YAML file.
    """
    with open(header_configuration_path) as file:
        try:
            content = yaml.load(file, EnvVarLoader)
            headers = content["headers"]
        except (TypeError, KeyError, yaml.scanner.ScannerError):
            raise InvalidApiHttpHeadersFileError(
                f"Please provide a valid YAML file to declare API HTTP headers. Please check the {API_HTTP_HEADERS_TARGET_PATH} file."
            )

    return [ApiHttpHeader(name, value) for name, value in headers.items()]
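
As a quick illustration (not from the original repo), the YAML shape this parser expects is a top-level `headers` mapping; a hedged sketch using a temporary file, assuming the function above is in scope (the `EnvVarLoader` presumably also expands environment variables referenced in values):

```python
# Sketch: exercising deserialize_file_based_headers with the expected YAML
# shape. The header values are placeholders.
import tempfile

with tempfile.NamedTemporaryFile("w", suffix=".yaml", delete=False) as f:
    f.write("headers:\n  Authorization: Bearer my-token\n  User-Agent: octavia\n")

headers = deserialize_file_based_headers(f.name)
# -> [ApiHttpHeader(name="Authorization", value="Bearer my-token"),
#     ApiHttpHeader(name="User-Agent", value="octavia")]
```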


def deserialize_option_based_headers(api_http_headers: List[Tuple[str, str]]) -> List[ApiHttpHeader]:
    """Parse API HTTP headers declared in CLI options to a list of ApiHttpHeaders.

    Args:
        api_http_headers (List[Tuple[str, str]]): Raw list of API header tuples retrieved from CLI options.

    Returns:
        List[ApiHttpHeader]: List of HTTP headers parsed from the CLI options.
    """
    return list({header_name: ApiHttpHeader(header_name, header_value) for header_name, header_value in api_http_headers}.values())


def merge_api_headers(
    option_based_api_http_headers: Optional[List[Tuple[str, str]]], api_http_headers_file_path: Optional[str]
) -> List[ApiHttpHeader]:
    """Deserialize headers from options and files into ApiHttpHeader and merge option-based headers with file-based headers.

    Args:
        option_based_api_http_headers (Optional[List[Tuple[str, str]]]): Option-based headers.
        api_http_headers_file_path (Optional[str]): Path to the YAML file with HTTP headers.

    Returns:
        List[ApiHttpHeader]: List of unique ApiHttpHeaders.
    """
    if option_based_api_http_headers and api_http_headers_file_path:
        click.echo(
            "ℹ️ - You passed API HTTP headers in a file and in options at the same time. Option-based headers will override file-based headers."
        )
    option_based_headers = (
        deserialize_option_based_headers(option_based_api_http_headers) if option_based_api_http_headers is not None else []
    )
    file_based_headers = deserialize_file_based_headers(api_http_headers_file_path) if api_http_headers_file_path else []

    merged_headers = {header.name: header for header in file_based_headers}
    for header in option_based_headers:
        merged_headers[header.name] = header
    return list(merged_headers.values())
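
A small sketch (assuming the functions above are in scope) of the precedence rule: option-based headers win over file-based ones, and duplicate option names collapse to the last occurrence because of the dict comprehension in `deserialize_option_based_headers`:

```python
# Sketch: duplicate option names collapse to the last occurrence, and the
# merged result would override any file-based header with the same name.
option_headers = [("Authorization", "Bearer from-option"), ("Authorization", "Bearer wins")]
merged = merge_api_headers(option_headers, api_http_headers_file_path=None)
assert [(h.name, h.value) for h in merged] == [("Authorization", "Bearer wins")]
```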


def set_api_headers_on_api_client(api_client: airbyte_api_client.ApiClient, api_headers: List[ApiHttpHeader]) -> None:
    """Set the API headers on the API client.

    Args:
        api_client (airbyte_api_client.ApiClient): The API client on which headers will be set.
        api_headers (List[ApiHttpHeader]): Headers to set on the API client.
    """
    for api_header in api_headers:
        api_client.set_default_header(api_header.name, api_header.value)
@@ -1,3 +0,0 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#
@@ -1,172 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

from glob import glob
from typing import List, Optional, Tuple

import airbyte_api_client
import click
from octavia_cli.base_commands import OctaviaCommand
from octavia_cli.check_context import REQUIRED_PROJECT_DIRECTORIES, requires_init

from .diff_helpers import display_diff_line
from .resources import BaseResource
from .resources import factory as resource_factory


@click.command(cls=OctaviaCommand, name="apply", help="Create or update Airbyte remote resources according to local YAML configurations.")
@click.option("--file", "-f", "configurations_files", type=click.Path(), multiple=True)
@click.option("--force", is_flag=True, default=False, help="Does not display the diff and updates without user prompt.")
@click.pass_context
@requires_init
def apply(ctx: click.Context, configurations_files: List[click.Path], force: bool):
    if not configurations_files:
        configurations_files = find_local_configuration_files()

    resources = get_resources_to_apply(configurations_files, ctx.obj["API_CLIENT"], ctx.obj["WORKSPACE_ID"])
    for resource in resources:
        apply_single_resource(resource, force)


def get_resources_to_apply(
    configuration_files: List[str], api_client: airbyte_api_client.ApiClient, workspace_id: str
) -> List[BaseResource]:
    """Create resource objects with factory and sort according to apply priority.

    Args:
        configuration_files (List[str]): List of YAML configuration files.
        api_client (airbyte_api_client.ApiClient): the Airbyte API client.
        workspace_id (str): current Airbyte workspace id.

    Returns:
        List[BaseResource]: Resources sorted according to their apply priority.
    """
    all_resources = [resource_factory(api_client, workspace_id, path) for path in configuration_files]
    return sorted(all_resources, key=lambda resource: resource.APPLY_PRIORITY)
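
This sort is what guarantees that sources and destinations (`APPLY_PRIORITY = 0` in the resources module further below) are applied before connections (`APPLY_PRIORITY = 1`). A toy sketch of the same ordering, using hypothetical stand-in classes:

```python
# Sketch: the stable sort keeps declaration order within a priority level and
# pushes connections (priority 1) to the end.
class FakeSource:
    APPLY_PRIORITY = 0

class FakeConnection:
    APPLY_PRIORITY = 1

resources = [FakeConnection(), FakeSource(), FakeConnection(), FakeSource()]
ordered = sorted(resources, key=lambda resource: resource.APPLY_PRIORITY)
assert [type(r).__name__ for r in ordered] == ["FakeSource", "FakeSource", "FakeConnection", "FakeConnection"]
```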


def apply_single_resource(resource: BaseResource, force: bool) -> None:
    """Run the resource creation if it was not created yet, update it otherwise.

    Args:
        resource (BaseResource): The resource to apply.
        force (bool): Whether force mode is on.
    """
    if resource.was_created:
        click.echo(
            click.style(
                f"🐙 - {resource.resource_name} exists on your Airbyte instance according to your state file, let's check if we need to update it!",
                fg="yellow",
            )
        )
        messages = update_resource(resource, force)
    else:
        click.echo(click.style(f"🐙 - {resource.resource_name} does not exist on your Airbyte instance, let's create it!", fg="green"))
        messages = create_resource(resource)
    click.echo("\n".join(messages))


def should_update_resource(force: bool, user_validation: Optional[bool], local_file_changed: bool) -> Tuple[bool, str]:
    """Decide if the resource needs an update or not.

    Args:
        force (bool): Whether force mode is on.
        user_validation (Optional[bool]): Whether the user validated the detected changes.
        local_file_changed (bool): Whether the local file describing the resource was modified.

    Returns:
        Tuple[bool, str]: Whether the resource should be updated, and a string describing the update reason.
    """
    if force:
        should_update, update_reason = True, "🚨 - Running update because the force mode is activated."
    elif user_validation is True:
        should_update, update_reason = True, "🟢 - Running update because you validated the changes."
    elif user_validation is False:
        should_update, update_reason = False, "🔴 - Did not update because you refused the changes."
    elif user_validation is None and local_file_changed:
        should_update, update_reason = (
            True,
            "🟡 - Running update because a local file change was detected and a secret field might have been edited.",
        )
    else:
        should_update, update_reason = False, "😴 - Did not update because no change was detected."
    return should_update, click.style(update_reason, fg="green")
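
The decision table is easiest to read as code; a small sketch of the five outcomes, with the function above in scope:

```python
# Sketch: the five decision branches. click.style wraps the reason string in
# ANSI color codes, so only the boolean is asserted here.
assert should_update_resource(force=True, user_validation=None, local_file_changed=False)[0] is True
assert should_update_resource(force=False, user_validation=True, local_file_changed=False)[0] is True
assert should_update_resource(force=False, user_validation=False, local_file_changed=True)[0] is False
assert should_update_resource(force=False, user_validation=None, local_file_changed=True)[0] is True
assert should_update_resource(force=False, user_validation=None, local_file_changed=False)[0] is False
```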


def prompt_for_diff_validation(resource_name: str, diff: str) -> bool:
    """Display the diff to the user and prompt them for validation.

    Args:
        resource_name (str): Name of the resource the diff was computed for.
        diff (str): The diff.

    Returns:
        bool: Whether the user validated the diff.
    """
    if diff:
        click.echo(
            click.style("👀 - Here's the computed diff (🚨 remember that diffs on secret fields are not displayed):", fg="magenta", bold=True)
        )
        for line in diff.split("\n"):
            display_diff_line(line)
        return click.confirm(click.style(f"❓ - Do you want to update {resource_name}?", bold=True))
    else:
        return False


def create_resource(resource: BaseResource) -> List[str]:
    """Run a resource creation.

    Args:
        resource (BaseResource): The resource to create.

    Returns:
        List[str]: Post-create messages to display to standard output.
    """
    created_resource, state = resource.create()
    return [
        click.style(f"🎉 - Successfully created {created_resource.name} on your Airbyte instance!", fg="green", bold=True),
        click.style(f"💾 - New state for {created_resource.name} saved at {state.path}", fg="yellow"),
    ]


def update_resource(resource: BaseResource, force: bool) -> List[str]:
    """Run a resource update. Check if the update is required and prompt for user diff validation if needed.

    Args:
        resource (BaseResource): Resource to update.
        force (bool): Whether force mode is on.

    Returns:
        List[str]: Post-update messages to display to standard output.
    """
    output_messages = []
    diff = resource.get_diff_with_remote_resource()
    user_validation = None
    if not force and diff:
        user_validation = prompt_for_diff_validation(resource.resource_name, diff)
    should_update, update_reason = should_update_resource(force, user_validation, resource.local_file_changed)
    click.echo(update_reason)

    if should_update:
        updated_resource, state = resource.update()
        output_messages.append(
            click.style(f"🎉 - Successfully updated {updated_resource.name} on your Airbyte instance!", fg="green", bold=True)
        )
        output_messages.append(click.style(f"💾 - New state for {updated_resource.name} stored at {state.path}.", fg="yellow"))
    return output_messages


def find_local_configuration_files() -> List[str]:
    """Discover local configuration files.

    Returns:
        List[str]: Paths to YAML configuration files.
    """
    configuration_files = []
    for resource_directory in REQUIRED_PROJECT_DIRECTORIES:
        configuration_files += glob(f"./{resource_directory}/**/configuration.yaml")
    if not configuration_files:
        click.echo(click.style("😒 - No YAML file found to run apply.", fg="red"))
    return configuration_files
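
One subtlety worth noting: `glob` is called without `recursive=True`, so `**` behaves like a single `*` and only matches one directory level — which fits the scaffolded `<type>/<resource_name>/configuration.yaml` layout. A hedged sketch of the discovery, where the directory names are assumed to mirror `REQUIRED_PROJECT_DIRECTORIES`:

```python
# Sketch: the pattern matches exactly one directory level per resource.
from glob import glob

# Hypothetical project layout created by `octavia init` / `octavia generate`:
#   ./sources/my_source/configuration.yaml
#   ./destinations/my_destination/configuration.yaml
for resource_directory in ("sources", "destinations", "connections"):
    print(glob(f"./{resource_directory}/**/configuration.yaml"))
```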
@@ -1,75 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import hashlib
import json
from typing import Any

import click
from deepdiff import DeepDiff

SECRET_MASK = "**********"


def hash_config(configuration: dict) -> str:
    """Compute a SHA256 hash from a dictionary.

    Args:
        configuration (dict): The configuration to hash.

    Returns:
        str: The hex digest of the SHA256 hash of the JSON-serialized configuration.
    """
    stringified = json.dumps(configuration, sort_keys=True)
    return hashlib.sha256(stringified.encode("utf-8")).hexdigest()
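
Because keys are sorted before serialization, two configurations that differ only in key order produce the same hash; a quick sketch with the function above in scope:

```python
# Sketch: key order does not affect the configuration hash.
assert hash_config({"a": 1, "b": 2}) == hash_config({"b": 2, "a": 1})
# Any value change produces a different hash, which is how octavia detects
# local file changes between applies.
assert hash_config({"a": 1}) != hash_config({"a": 2})
```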


def exclude_secrets_from_diff(obj: Any, path: str) -> bool:
    """Callback function used with DeepDiff to ignore secret values from the diff.

    Args:
        obj (Any): Object for which a diff will be computed.
        path (str): unused.

    Returns:
        bool: Whether to ignore the object from the diff.
    """
    if isinstance(obj, str):
        return True if SECRET_MASK in obj else False
    else:
        return False


def compute_diff(a: Any, b: Any) -> DeepDiff:
    """Wrapper around the DeepDiff computation.

    Args:
        a (Any): Object to compare with b.
        b (Any): Object to compare with a.

    Returns:
        DeepDiff: the computed diff object.
    """
    return DeepDiff(a, b, view="tree", exclude_obj_callback=exclude_secrets_from_diff)
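
`compute_diff` drives both the update prompt and `display_diff_line` below; a sketch (deepdiff installed, functions above in scope) showing how masked secrets are skipped, with placeholder values:

```python
# Sketch: fields whose value carries the secret mask are excluded by the
# callback, so unchanged secrets never show up as spurious diffs.
remote = {"host": "db.internal", "password": "**********"}
local = {"host": "db.example.com", "password": "my-real-password"}
diff = compute_diff(remote, local)
print(diff.pretty())
# Roughly: Value of root['host'] changed from "db.internal" to "db.example.com".
# Each line of diff.pretty() is then colorized by display_diff_line.
```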


def display_diff_line(diff_line: str) -> None:
    """Prettify a diff line and print it to standard output.

    Args:
        diff_line (str): The diff line to display.
    """
    if "changed from" in diff_line:
        color = "yellow"
        prefix = "E"
    elif "added" in diff_line:
        color = "green"
        prefix = "+"
    elif "removed" in diff_line:
        color = "red"
        prefix = "-"
    else:
        prefix = ""
        color = None
    click.echo(click.style(f"\t{prefix} - {diff_line}", fg=color))
@@ -1,886 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import abc
import os
import time
from copy import deepcopy
from pathlib import Path
from typing import Callable, List, Optional, Set, Tuple, Type, Union

import airbyte_api_client
import click
import yaml
from airbyte_api_client.api import (
    destination_api,
    destination_definition_api,
    destination_definition_specification_api,
    source_api,
    source_definition_api,
    source_definition_specification_api,
    web_backend_api,
)
from airbyte_api_client.model.airbyte_catalog import AirbyteCatalog
from airbyte_api_client.model.airbyte_stream import AirbyteStream
from airbyte_api_client.model.airbyte_stream_and_configuration import AirbyteStreamAndConfiguration
from airbyte_api_client.model.airbyte_stream_configuration import AirbyteStreamConfiguration
from airbyte_api_client.model.connection_read import ConnectionRead
from airbyte_api_client.model.connection_schedule_data import ConnectionScheduleData
from airbyte_api_client.model.connection_schedule_data_basic_schedule import ConnectionScheduleDataBasicSchedule
from airbyte_api_client.model.connection_schedule_data_cron import ConnectionScheduleDataCron
from airbyte_api_client.model.connection_schedule_type import ConnectionScheduleType
from airbyte_api_client.model.connection_status import ConnectionStatus
from airbyte_api_client.model.destination_create import DestinationCreate
from airbyte_api_client.model.destination_definition_id_request_body import DestinationDefinitionIdRequestBody
from airbyte_api_client.model.destination_definition_id_with_workspace_id import DestinationDefinitionIdWithWorkspaceId
from airbyte_api_client.model.destination_definition_read import DestinationDefinitionRead
from airbyte_api_client.model.destination_definition_specification_read import DestinationDefinitionSpecificationRead
from airbyte_api_client.model.destination_id_request_body import DestinationIdRequestBody
from airbyte_api_client.model.destination_read import DestinationRead
from airbyte_api_client.model.destination_sync_mode import DestinationSyncMode
from airbyte_api_client.model.destination_update import DestinationUpdate
from airbyte_api_client.model.geography import Geography
from airbyte_api_client.model.namespace_definition_type import NamespaceDefinitionType
from airbyte_api_client.model.non_breaking_changes_preference import NonBreakingChangesPreference
from airbyte_api_client.model.operation_create import OperationCreate
from airbyte_api_client.model.operator_configuration import OperatorConfiguration
from airbyte_api_client.model.operator_dbt import OperatorDbt
from airbyte_api_client.model.operator_normalization import OperatorNormalization
from airbyte_api_client.model.operator_type import OperatorType
from airbyte_api_client.model.resource_requirements import ResourceRequirements
from airbyte_api_client.model.selected_field_info import SelectedFieldInfo
from airbyte_api_client.model.source_create import SourceCreate
from airbyte_api_client.model.source_definition_id_request_body import SourceDefinitionIdRequestBody
from airbyte_api_client.model.source_definition_id_with_workspace_id import SourceDefinitionIdWithWorkspaceId
from airbyte_api_client.model.source_definition_read import SourceDefinitionRead
from airbyte_api_client.model.source_definition_specification_read import SourceDefinitionSpecificationRead
from airbyte_api_client.model.source_discover_schema_request_body import SourceDiscoverSchemaRequestBody
from airbyte_api_client.model.source_id_request_body import SourceIdRequestBody
from airbyte_api_client.model.source_read import SourceRead
from airbyte_api_client.model.source_update import SourceUpdate
from airbyte_api_client.model.sync_mode import SyncMode
from airbyte_api_client.model.web_backend_connection_create import WebBackendConnectionCreate
from airbyte_api_client.model.web_backend_connection_request_body import WebBackendConnectionRequestBody
from airbyte_api_client.model.web_backend_connection_update import WebBackendConnectionUpdate
from airbyte_api_client.model.web_backend_operation_create_or_update import WebBackendOperationCreateOrUpdate

from .diff_helpers import compute_diff, hash_config
from .yaml_loaders import EnvVarLoader


class NonExistingResourceError(click.ClickException):
    pass


class InvalidConfigurationError(click.ClickException):
    pass


class InvalidStateError(click.ClickException):
    pass


class MissingStateError(click.ClickException):
    pass


class ResourceState:
    def __init__(
        self,
        configuration_path: Union[str, Path],
        workspace_id: Optional[str],
        resource_id: str,
        generation_timestamp: int,
        configuration_hash: str,
    ):
        """This constructor is meant to be private. Construction shall be made with the create or from_file class methods.
        Args:
            configuration_path (str): Path to the configuration this state relates to.
            workspace_id (Optional[str]): Id of the workspace the state relates to.  # TODO: mark this as not optional after the user base has upgraded to >= 0.39.18
            resource_id (str): Id of the resource the state relates to.
            generation_timestamp (int): State generation timestamp.
            configuration_hash (str): Hash of the loaded configuration file.
        """
        self.configuration_path = str(configuration_path)
        self.resource_id = resource_id
        self.generation_timestamp = generation_timestamp
        self.configuration_hash = configuration_hash
        self.workspace_id = workspace_id
        self.path = self._get_path_from_configuration_and_workspace_id(configuration_path, workspace_id)

    def as_dict(self):
        return {
            "resource_id": self.resource_id,
            "workspace_id": self.workspace_id,
            "generation_timestamp": self.generation_timestamp,
            "configuration_path": self.configuration_path,
            "configuration_hash": self.configuration_hash,
        }
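
The state file written by `_save` below is just this dict serialized with `yaml.dump`; a sketch of the resulting shape, with placeholder values:

```python
# Sketch: roughly what a state_<workspace_id>.yaml contains (placeholder values).
import yaml

state_dict = {
    "resource_id": "cb5413b2-4159-46a2-910a-dc282a439d2d",
    "workspace_id": "00000000-0000-0000-0000-000000000000",
    "generation_timestamp": 1700000000,
    "configuration_path": "sources/my_source/configuration.yaml",
    "configuration_hash": "9f86d081884c7d65...",  # truncated for the example
}
print(yaml.dump(state_dict))
```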

    def _save(self) -> None:
        """Save the state as a YAML file."""
        with open(self.path, "w") as state_file:
            yaml.dump(self.as_dict(), state_file)

    @classmethod
    def create(cls, configuration_path: str, configuration_hash: str, workspace_id: str, resource_id: str) -> "ResourceState":
        """Create a state for a resource configuration.
        Args:
            configuration_path (str): Path to the YAML file defining the resource.
            configuration_hash (str): Hash of the loaded configuration file.
            workspace_id (str): Id of the workspace the state relates to.
            resource_id (str): UUID of the resource.
        Returns:
            ResourceState: state representing the resource.
        """
        generation_timestamp = int(time.time())
        state = ResourceState(configuration_path, workspace_id, resource_id, generation_timestamp, configuration_hash)
        state._save()
        return state

    def delete(self) -> None:
        """Delete the state file."""
        os.remove(self.path)

    @classmethod
    def from_file(cls, file_path: str) -> "ResourceState":
        """Deserialize a state from a YAML path.
        Args:
            file_path (str): Path to the YAML state.
        Returns:
            ResourceState: state deserialized from YAML.
        """
        with open(file_path, "r") as f:
            raw_state = yaml.safe_load(f)
        return ResourceState(
            raw_state["configuration_path"],
            # TODO: workspace id should not be nullable after the user base has upgraded to >= 0.39.18
            raw_state.get("workspace_id"),
            raw_state["resource_id"],
            raw_state["generation_timestamp"],
            raw_state["configuration_hash"],
        )

    @classmethod
    def _get_path_from_configuration_and_workspace_id(cls, configuration_path, workspace_id):
        return os.path.join(os.path.dirname(configuration_path), f"state_{workspace_id}.yaml")

    @classmethod
    def from_configuration_path_and_workspace(cls, configuration_path, workspace_id):
        state_path = cls._get_path_from_configuration_and_workspace_id(configuration_path, workspace_id)
        state = cls.from_file(state_path)
        return state

    @classmethod
    def migrate(cls, state_to_migrate_path: str, workspace_id: str) -> "ResourceState":
        """Create a per-workspace state from a legacy state file and remove the legacy state file.
        Args:
            state_to_migrate_path (str): Path to the legacy state file to migrate.
            workspace_id (str): Workspace id for which the new state will be stored.
        Returns:
            ResourceState: The new state after migration.
        """
        state_to_migrate = ResourceState.from_file(state_to_migrate_path)
        new_state = ResourceState.create(
            state_to_migrate.configuration_path, state_to_migrate.configuration_hash, workspace_id, state_to_migrate.resource_id
        )
        state_to_migrate.delete()
        return new_state


class BaseResource(abc.ABC):
    # Priority of the resource during the apply. 0 means the resource is top priority.
    APPLY_PRIORITY = 0

    @property
    @abc.abstractmethod
    def api(
        self,
    ):  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def create_function_name(
        self,
    ):  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def create_payload(
        self,
    ):  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def update_payload(
        self,
    ):  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def update_function_name(
        self,
    ):  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def get_function_name(
        self,
    ):  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def get_payload(
        self,
    ):  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def resource_id_field(
        self,
    ):  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def resource_type(
        self,
    ):  # pragma: no cover
        pass

    def __init__(
        self, api_client: airbyte_api_client.ApiClient, workspace_id: str, raw_configuration: dict, configuration_path: str
    ) -> None:
        """Create a BaseResource object.
        Args:
            api_client (airbyte_api_client.ApiClient): the Airbyte API client.
            workspace_id (str): the workspace id.
            raw_configuration (dict): The local configuration describing the resource.
            configuration_path (str): The path to the local configuration describing the resource with YAML.
        """
        self._create_fn = getattr(self.api, self.create_function_name)
        self._update_fn = getattr(self.api, self.update_function_name)
        self._get_fn = getattr(self.api, self.get_function_name)

        self.workspace_id = workspace_id
        self.configuration_path = configuration_path
        self.state = self._get_state_from_file(configuration_path, workspace_id)
        self.configuration_hash = hash_config(
            raw_configuration
        )  # Hash as early as possible to limit risks of raw_configuration downstream mutations.

        self.local_file_changed = True if self.state is None else self.configuration_hash != self.state.configuration_hash

        self.raw_configuration = raw_configuration
        self.configuration = self._deserialize_raw_configuration()
        self.api_client = api_client
        self.api_instance = self.api(api_client)
        self.resource_name = raw_configuration["resource_name"]

    def _deserialize_raw_configuration(self):
        """Deserialize a raw configuration into another object and perform extra validation if needed.
        The base implementation does nothing except extracting the configuration field and returning a copy of it.
        Returns:
            dict: Deserialized configuration
        """
        return deepcopy(self.raw_configuration["configuration"])

    @staticmethod
    def _check_for_invalid_configuration_keys(dict_to_check: dict, invalid_keys: Set[str], error_message: str):
        """Utility function to check if a configuration dictionary has legacy keys that were removed/renamed after an octavia update.
        Args:
            dict_to_check (dict): The dictionary for which keys should be checked.
            invalid_keys (Set[str]): The set of invalid keys whose existence we want to check.
            error_message (str): The error message to display to the user.
        Raises:
            InvalidConfigurationError: Raised if an invalid key was found in the dict_to_check
        """
        invalid_keys = list(set(dict_to_check.keys()) & invalid_keys)
        if invalid_keys:
            raise InvalidConfigurationError(f"Invalid configuration keys: {', '.join(invalid_keys)}. {error_message}. ")
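
A tiny sketch of this guard, with the class above in scope (the key names here are just examples):

```python
# Sketch: legacy keys trigger a hard failure with a migration hint.
try:
    BaseResource._check_for_invalid_configuration_keys(
        {"syncCatalog": {}}, {"syncCatalog"}, "Please regenerate your configuration"
    )
except InvalidConfigurationError as e:
    print(e.message)  # Roughly: Invalid configuration keys: syncCatalog. Please regenerate your configuration.
```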

    @property
    def remote_resource(self):
        return self._get_remote_resource() if self.state else None

    def _get_local_comparable_configuration(self) -> dict:
        return self.raw_configuration["configuration"]

    @abc.abstractmethod
    def _get_remote_comparable_configuration(
        self,
    ) -> dict:  # pragma: no cover
        raise NotImplementedError

    @property
    def was_created(self):
        return True if self.remote_resource else False

    def _get_remote_resource(self) -> Union[SourceRead, DestinationRead, ConnectionRead]:
        """Retrieve a resource on the remote Airbyte instance.
        Returns:
            Union[SourceRead, DestinationRead, ConnectionRead]: The remote resource.
        """
        return self._get_fn(self.api_instance, self.get_payload)

    @staticmethod
    def _get_state_from_file(configuration_file: str, workspace_id: str) -> Optional[ResourceState]:
        """Retrieve a state object from a local YAML file if it exists.
        Returns:
            Optional[ResourceState]: the deserialized resource state if the YAML file is found.
        """
        expected_state_path = Path(os.path.join(os.path.dirname(configuration_file), f"state_{workspace_id}.yaml"))
        legacy_state_path = Path(os.path.join(os.path.dirname(configuration_file), "state.yaml"))
        if expected_state_path.is_file():
            return ResourceState.from_file(expected_state_path)
        elif legacy_state_path.is_file():  # TODO: remove condition after user base has upgraded to >= 0.39.18
            if click.confirm(
                click.style(
                    f"⚠️ - State files are now saved on a workspace basis. Do you want octavia to rename and update {legacy_state_path}? ",
                    fg="red",
                )
            ):
                return ResourceState.migrate(legacy_state_path, workspace_id)
            else:
                raise InvalidStateError(
                    f"Octavia expects the state file to be located at {expected_state_path} with a workspace_id key. Please update {legacy_state_path}."
                )
        else:
            return None

    def get_diff_with_remote_resource(self) -> str:
        """Compute the diff between the current resource and the remote resource.
        Raises:
            NonExistingResourceError: Raised if the remote resource does not exist.
        Returns:
            str: The prettified diff.
        """
        if not self.was_created:
            raise NonExistingResourceError("Cannot compute diff with a non-existing remote resource.")
        local_config = self._get_local_comparable_configuration()
        remote_config = self._get_remote_comparable_configuration()
        diff = compute_diff(remote_config, local_config)
        return diff.pretty()

    def _create_or_update(
        self,
        operation_fn: Callable,
        payload: Union[
            SourceCreate, SourceUpdate, DestinationCreate, DestinationUpdate, WebBackendConnectionCreate, WebBackendConnectionUpdate
        ],
    ) -> Tuple[Union[SourceRead, DestinationRead, ConnectionRead], ResourceState]:
        """Wrapper to trigger create or update of the remote resource.
        Args:
            operation_fn (Callable): The API function to run.
            payload (Union[SourceCreate, SourceUpdate, DestinationCreate, DestinationUpdate]): The payload to send to create or update the resource.
        Raises:
            InvalidConfigurationError: Raised if the create or update payload is invalid.
            ApiException: Raised in case of other API errors.
        Returns:
            Tuple[Union[SourceRead, DestinationRead, ConnectionRead], ResourceState]: The created or updated resource and its new state.
        """
        try:
            result = operation_fn(self.api_instance, payload)
            new_state = ResourceState.create(
                self.configuration_path, self.configuration_hash, self.workspace_id, result[self.resource_id_field]
            )
            return result, new_state
        except airbyte_api_client.ApiException as api_error:
            if api_error.status == 422:
                # This API response error is really verbose, but it embodies all the details about why the config is not valid.
                # TODO alafanechere: try to parse it and display it in a more readable way.
                raise InvalidConfigurationError(api_error.body)
            else:
                raise api_error

    def manage(
        self, resource_id: str
    ) -> Union[Tuple[SourceRead, ResourceState], Tuple[DestinationRead, ResourceState], Tuple[ConnectionRead, ResourceState]]:
        """Declare a remote resource as locally managed by creating a local state.

        Args:
            resource_id (str): Remote resource ID.

        Returns:
            Union[Tuple[SourceRead, ResourceState], Tuple[DestinationRead, ResourceState], Tuple[ConnectionRead, ResourceState]]: The remote resource model instance and its local state.
        """
        self.state = ResourceState.create(self.configuration_path, self.configuration_hash, self.workspace_id, resource_id)

        return self.remote_resource, self.state

    def create(self) -> Union[SourceRead, DestinationRead, ConnectionRead]:
        """Public function to create the resource on the remote Airbyte instance.
        Returns:
            Union[SourceRead, DestinationRead, ConnectionRead]: The created resource.
        """
        return self._create_or_update(self._create_fn, self.create_payload)

    def update(self) -> Union[SourceRead, DestinationRead, ConnectionRead]:
        """Public function to update the resource on the remote Airbyte instance.
        Returns:
            Union[SourceRead, DestinationRead, ConnectionRead]: The updated resource.
        """
        return self._create_or_update(self._update_fn, self.update_payload)

    @property
    def resource_id(self) -> Optional[str]:
        """Exposes the resource UUID of the remote resource.
        Returns:
            str: Remote resource's UUID
        """
        return self.state.resource_id if self.state is not None else None


class SourceAndDestination(BaseResource):
    @property
    @abc.abstractmethod
    def definition(
        self,
    ):  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def definition_specification(
        self,
    ):  # pragma: no cover
        pass

    @property
    def definition_id(self):
        return self.raw_configuration["definition_id"]

    @property
    def definition_image(self):
        return self.raw_configuration["definition_image"]

    @property
    def definition_version(self):
        return self.raw_configuration["definition_version"]

    def _get_remote_comparable_configuration(self) -> dict:
        return self.remote_resource.connection_configuration


class Source(SourceAndDestination):

    api = source_api.SourceApi
    create_function_name = "create_source"
    resource_id_field = "source_id"
    get_function_name = "get_source"
    update_function_name = "update_source"
    resource_type = "source"

    @property
    def create_payload(self):
        return SourceCreate(self.definition_id, self.configuration, self.workspace_id, self.resource_name)

    @property
    def get_payload(self) -> Optional[SourceIdRequestBody]:
        """Defines the payload to retrieve the remote source if a state exists.
        Returns:
            SourceIdRequestBody: The SourceIdRequestBody payload.
        """
        if self.state is not None:
            return SourceIdRequestBody(self.state.resource_id)

    @property
    def update_payload(self):
        return SourceUpdate(
            source_id=self.resource_id,
            connection_configuration=self.configuration,
            name=self.resource_name,
        )

    @property
    def source_discover_schema_request_body(self) -> SourceDiscoverSchemaRequestBody:
        """Creates SourceDiscoverSchemaRequestBody from the resource id.
        Raises:
            NonExistingResourceError: raised if the resource id is None.
        Returns:
            SourceDiscoverSchemaRequestBody: The SourceDiscoverSchemaRequestBody model instance.
        """
        if self.resource_id is None:
            raise NonExistingResourceError("The resource id could not be retrieved, the remote resource does not exist.")
        return SourceDiscoverSchemaRequestBody(self.resource_id)

    @property
    def catalog(self) -> AirbyteCatalog:
        """Retrieves the source's Airbyte catalog.
        Returns:
            AirbyteCatalog: The catalog issued by schema discovery.
        """
        schema = self.api_instance.discover_schema_for_source(self.source_discover_schema_request_body)
        if schema.job_info.succeeded:
            return schema.catalog
        raise Exception("Could not discover schema for source", self.source_discover_schema_request_body, schema.job_info.logs)

    @property
    def definition(self) -> SourceDefinitionRead:
        api_instance = source_definition_api.SourceDefinitionApi(self.api_client)
        payload = SourceDefinitionIdRequestBody(source_definition_id=self.definition_id)
        return api_instance.get_source_definition(payload)

    @property
    def definition_specification(self) -> SourceDefinitionSpecificationRead:
        api_instance = source_definition_specification_api.SourceDefinitionSpecificationApi(self.api_client)
        payload = SourceDefinitionIdWithWorkspaceId(source_definition_id=self.definition_id, workspace_id=self.workspace_id)
        return api_instance.get_source_definition_specification(payload)


class Destination(SourceAndDestination):
    api = destination_api.DestinationApi
    create_function_name = "create_destination"
    resource_id_field = "destination_id"
    get_function_name = "get_destination"
    update_function_name = "update_destination"
    resource_type = "destination"

    @property
    def create_payload(self) -> DestinationCreate:
        """Defines the payload to create the remote resource.
        Returns:
            DestinationCreate: The DestinationCreate model instance.
        """
        return DestinationCreate(self.workspace_id, self.resource_name, self.definition_id, self.configuration)

    @property
    def get_payload(self) -> Optional[DestinationIdRequestBody]:
        """Defines the payload to retrieve the remote destination if a state exists.
        Returns:
            DestinationIdRequestBody: The DestinationIdRequestBody payload.
        """
        if self.state is not None:
            return DestinationIdRequestBody(self.state.resource_id)

    @property
    def update_payload(self) -> DestinationUpdate:
        """Defines the payload to update a remote resource.
        Returns:
            DestinationUpdate: The DestinationUpdate model instance.
        """
        return DestinationUpdate(
            destination_id=self.resource_id,
            connection_configuration=self.configuration,
            name=self.resource_name,
        )

    @property
    def definition(self) -> DestinationDefinitionRead:
        api_instance = destination_definition_api.DestinationDefinitionApi(self.api_client)
        payload = DestinationDefinitionIdRequestBody(destination_definition_id=self.definition_id)
        return api_instance.get_destination_definition(payload)

    @property
    def definition_specification(self) -> DestinationDefinitionSpecificationRead:
        api_instance = destination_definition_specification_api.DestinationDefinitionSpecificationApi(self.api_client)
        payload = DestinationDefinitionIdWithWorkspaceId(destination_definition_id=self.definition_id, workspace_id=self.workspace_id)
        return api_instance.get_destination_definition_specification(payload)


class Connection(BaseResource):
    # Set to 1 to create connection after source or destination.
    APPLY_PRIORITY = 1
    api = web_backend_api.WebBackendApi
    create_function_name = "web_backend_create_connection"
    update_function_name = "web_backend_update_connection"
    get_function_name = "web_backend_get_connection"
    resource_id_field = "connection_id"

    resource_type = "connection"

    local_root_level_keys_to_remove_during_create = ["skip_reset"]  # Remove these keys when sending a create request

    local_root_level_keys_to_filter_out_for_comparison = ["skip_reset"]  # The remote resource does not have these keys

    remote_root_level_keys_to_filter_out_for_comparison = [
        "name",
        "source",
        "destination",
        "source_id",
        "destination_id",
        "connection_id",
        "operation_ids",
        "source_catalog_id",
        "catalog_id",
        "is_syncing",
        "latest_sync_job_status",
        "latest_sync_job_created_at",
        "schedule",
    ]  # We do not allow local editing of these keys

    # We do not allow local editing of these keys
    remote_operation_level_keys_to_filter_out = ["workspace_id", "operation_id"]

    def _deserialize_raw_configuration(self):
        """Deserialize a raw configuration into another dict and perform extra validation if needed.
        In this implementation we cast raw types to Airbyte API client model types for validation.
        Returns:
            dict: Deserialized connection configuration
        """
        self._check_for_legacy_raw_configuration_keys(self.raw_configuration)
        configuration = super()._deserialize_raw_configuration()
        self._check_for_legacy_connection_configuration_keys(configuration)
        configuration["sync_catalog"] = self._create_configured_catalog(configuration["sync_catalog"])
        configuration["namespace_definition"] = NamespaceDefinitionType(configuration["namespace_definition"])
        if "non_breaking_changes_preference" in configuration:
            configuration["non_breaking_changes_preference"] = NonBreakingChangesPreference(
                configuration["non_breaking_changes_preference"]
            )
        else:
            configuration["non_breaking_changes_preference"] = NonBreakingChangesPreference("ignore")
        if "geography" in configuration:
            configuration["geography"] = Geography(configuration["geography"])
        else:
            configuration["geography"] = Geography("auto")

        if "schedule_type" in configuration:
            # If schedule type is manual we do not expect a schedule_data field to be set
            # TODO: sending a WebConnectionCreate payload without schedule_data (for manual) fails.
            is_manual = configuration["schedule_type"] == "manual"
            configuration["schedule_type"] = ConnectionScheduleType(configuration["schedule_type"])
            if not is_manual:
                if "basic_schedule" in configuration["schedule_data"]:
                    basic_schedule = ConnectionScheduleDataBasicSchedule(**configuration["schedule_data"]["basic_schedule"])
                    configuration["schedule_data"]["basic_schedule"] = basic_schedule
                if "cron" in configuration["schedule_data"]:
                    cron = ConnectionScheduleDataCron(**configuration["schedule_data"]["cron"])
                    configuration["schedule_data"]["cron"] = cron
                configuration["schedule_data"] = ConnectionScheduleData(**configuration["schedule_data"])
        if "resource_requirements" in configuration:
            configuration["resource_requirements"] = ResourceRequirements(**configuration["resource_requirements"])
        configuration["status"] = ConnectionStatus(configuration["status"])
        return configuration

    @property
    def source_id(self):
        """Retrieve the source id from the source state file of the current workspace.
        Raises:
            MissingStateError: Raised if the state file of the current workspace is not found.
        Returns:
            str: source id
        """
        try:
            source_state = ResourceState.from_configuration_path_and_workspace(
                self.raw_configuration["source_configuration_path"], self.workspace_id
            )
        except FileNotFoundError:
            raise MissingStateError(
                f"Could not find the source state file for configuration {self.raw_configuration['source_configuration_path']}."
            )
        return source_state.resource_id

    @property
    def destination_id(self):
        """Retrieve the destination id from the destination state file of the current workspace.
        Raises:
            MissingStateError: Raised if the state file of the current workspace is not found.
        Returns:
            str: destination id
        """
        try:
            destination_state = ResourceState.from_configuration_path_and_workspace(
                self.raw_configuration["destination_configuration_path"], self.workspace_id
            )
        except FileNotFoundError:
            raise MissingStateError(
                f"Could not find the destination state file for configuration {self.raw_configuration['destination_configuration_path']}."
            )
        return destination_state.resource_id

    @property
    def create_payload(self) -> WebBackendConnectionCreate:
        """Defines the payload to create the remote connection.
        Returns:
            WebBackendConnectionCreate: The WebBackendConnectionCreate model instance.
        """

        if self.raw_configuration["configuration"].get("operations") is not None:
            self.configuration["operations"] = self._deserialize_operations(
                self.raw_configuration["configuration"]["operations"], OperationCreate
            )
        for k in self.local_root_level_keys_to_remove_during_create:
            self.configuration.pop(k, None)
        return WebBackendConnectionCreate(
            name=self.resource_name, source_id=self.source_id, destination_id=self.destination_id, **self.configuration
        )

    @property
    def get_payload(self) -> Optional[WebBackendConnectionRequestBody]:
        """Defines the payload to retrieve the remote connection if a state exists.
        Returns:
            WebBackendConnectionRequestBody: The WebBackendConnectionRequestBody payload.
        """
        if self.state is not None:
            return WebBackendConnectionRequestBody(connection_id=self.state.resource_id, with_refreshed_catalog=False)

    @property
    def update_payload(self) -> WebBackendConnectionUpdate:
        """Defines the payload to update a remote connection.
        Returns:
            WebBackendConnectionUpdate: The WebBackendConnectionUpdate model instance.
        """
        if self.raw_configuration["configuration"].get("operations") is not None:
            self.configuration["operations"] = self._deserialize_operations(
                self.raw_configuration["configuration"]["operations"], WebBackendOperationCreateOrUpdate
            )
        return WebBackendConnectionUpdate(connection_id=self.resource_id, **self.configuration)

    def create(self) -> dict:
        return self._create_or_update(self._create_fn, self.create_payload)

    def update(self) -> dict:
        return self._create_or_update(self._update_fn, self.update_payload)

    @staticmethod
    def _create_configured_catalog(sync_catalog: dict) -> AirbyteCatalog:
        """Deserialize a sync_catalog represented as a dict to an AirbyteCatalog.
        Args:
            sync_catalog (dict): The sync catalog represented as a dict.
        Returns:
            AirbyteCatalog: The configured catalog.
        """
        streams_and_configurations = []
        for stream in sync_catalog["streams"]:
            stream["stream"]["supported_sync_modes"] = [SyncMode(sm) for sm in stream["stream"]["supported_sync_modes"]]
            stream["config"]["sync_mode"] = SyncMode(stream["config"]["sync_mode"])
            stream["config"]["destination_sync_mode"] = DestinationSyncMode(stream["config"]["destination_sync_mode"])
            if "selected_fields" in stream["config"]:
                stream["config"]["selected_fields"] = [
                    SelectedFieldInfo(field_path=selected_field["field_path"]) for selected_field in stream["config"]["selected_fields"]
                ]
            streams_and_configurations.append(
                AirbyteStreamAndConfiguration(
                    stream=AirbyteStream(**stream["stream"]), config=AirbyteStreamConfiguration(**stream["config"])
                )
            )
        return AirbyteCatalog(streams_and_configurations)

    def _deserialize_operations(
        self, operations: List[dict], outputModelClass: Union[Type[OperationCreate], Type[WebBackendOperationCreateOrUpdate]]
    ) -> List[Union[OperationCreate, WebBackendOperationCreateOrUpdate]]:
        """Deserialize operations to OperationCreate (to create connection) or WebBackendOperationCreateOrUpdate (to update connection) models.
        Args:
            operations (List[dict]): List of operations to deserialize
            outputModelClass (Union[Type[OperationCreate], Type[WebBackendOperationCreateOrUpdate]]): The model to which the operation dict will be deserialized
        Raises:
            ValueError: Raised if the operator type declared in the configuration is not supported
        Returns:
            List[Union[OperationCreate, WebBackendOperationCreateOrUpdate]]: Deserialized operations
        """
        deserialized_operations = []
        for operation in operations:
            if operation["operator_configuration"]["operator_type"] == "normalization":
                operation = outputModelClass(
                    workspace_id=self.workspace_id,
                    name=operation["name"],
                    operator_configuration=OperatorConfiguration(
                        operator_type=OperatorType(operation["operator_configuration"]["operator_type"]),
                        normalization=OperatorNormalization(**operation["operator_configuration"]["normalization"]),
                    ),
                )
            elif operation["operator_configuration"]["operator_type"] == "dbt":
                operation = outputModelClass(
                    workspace_id=self.workspace_id,
                    name=operation["name"],
                    operator_configuration=OperatorConfiguration(
                        operator_type=OperatorType(operation["operator_configuration"]["operator_type"]),
                        dbt=OperatorDbt(**operation["operator_configuration"]["dbt"]),
                    ),
                )
            else:
                raise ValueError(f"Operation type {operation['operator_configuration']['operator_type']} is not supported")
            deserialized_operations.append(operation)
        return deserialized_operations

    def _check_for_legacy_connection_configuration_keys(self, configuration_to_check):
        self._check_for_wrong_casing_in_connection_configurations_keys(configuration_to_check)
        self._check_for_schedule_in_connection_configurations_keys(configuration_to_check)

    # TODO this check can be removed when all our active users are on >= 0.37.0
    def _check_for_schedule_in_connection_configurations_keys(self, configuration_to_check):
        error_message = "The schedule key is deprecated since 0.40.0, please use a combination of schedule_type and schedule_data"
        self._check_for_invalid_configuration_keys(configuration_to_check, {"schedule"}, error_message)

    def _check_for_wrong_casing_in_connection_configurations_keys(self, configuration_to_check):
        """We changed connection configuration keys from camelCase to snake_case in 0.37.0.
        This function checks if the connection configuration has camelCase keys and displays a meaningful error message.
        Args:
            configuration_to_check (dict): Configuration to validate
        """
        error_message = "These keys should be in snake_case since version 0.37.0, please edit or regenerate your connection configuration"
        self._check_for_invalid_configuration_keys(
            configuration_to_check, {"syncCatalog", "namespaceDefinition", "namespaceFormat", "resourceRequirements"}, error_message
        )
        self._check_for_invalid_configuration_keys(configuration_to_check.get("schedule", {}), {"timeUnit"}, error_message)
        for stream in configuration_to_check["sync_catalog"]["streams"]:
            self._check_for_invalid_configuration_keys(
                stream["stream"],
                {"defaultCursorField", "jsonSchema", "sourceDefinedCursor", "sourceDefinedPrimaryKey", "supportedSyncModes"},
                error_message,
            )
            self._check_for_invalid_configuration_keys(
                stream["config"], {"aliasName", "cursorField", "destinationSyncMode", "primaryKey", "syncMode"}, error_message
            )

    # TODO this check can be removed when all our active users are on > 0.39.18
    def _check_for_legacy_raw_configuration_keys(self, raw_configuration):
        self._check_for_invalid_configuration_keys(
            raw_configuration,
            {"source_id", "destination_id"},
            "These keys changed to source_configuration_path and destination_configuration_path in version > 0.39.18, please update your connection configuration to give the path to the source and destination configuration files or regenerate the connection",
        )
|
||||
def _get_local_comparable_configuration(self) -> dict:
|
||||
comparable = {
|
||||
k: v
|
||||
for k, v in self.raw_configuration["configuration"].items()
|
||||
if k not in self.local_root_level_keys_to_filter_out_for_comparison
|
||||
}
|
||||
return comparable
|
||||
|
||||
def _get_remote_comparable_configuration(self) -> dict:
|
||||
|
||||
comparable = {
|
||||
k: v for k, v in self.remote_resource.to_dict().items() if k not in self.remote_root_level_keys_to_filter_out_for_comparison
|
||||
}
|
||||
if "operations" in comparable:
|
||||
for operation in comparable["operations"]:
|
||||
for k in self.remote_operation_level_keys_to_filter_out:
|
||||
operation.pop(k)
|
||||
if "operations" in comparable and len(comparable["operations"]) == 0:
|
||||
comparable.pop("operations")
|
||||
return comparable
|
||||
|
||||
|
||||
def factory(api_client: airbyte_api_client.ApiClient, workspace_id: str, configuration_path: str) -> Union[Source, Destination, Connection]:
|
||||
"""Create resource object according to the definition type field in their YAML configuration.
|
||||
Args:
|
||||
api_client (airbyte_api_client.ApiClient): The Airbyte API client.
|
||||
workspace_id (str): The current workspace id.
|
||||
configuration_path (str): Path to the YAML file with the configuration.
|
||||
Raises:
|
||||
NotImplementedError: Raised if the definition type found in the YAML is not a supported resource.
|
||||
Returns:
|
||||
Union[Source, Destination, Connection]: The resource object created from the YAML config.
|
||||
"""
|
||||
with open(configuration_path, "r") as f:
|
||||
raw_configuration = yaml.load(f, EnvVarLoader)
|
||||
if raw_configuration["definition_type"] == "source":
|
||||
return Source(api_client, workspace_id, raw_configuration, configuration_path)
|
||||
if raw_configuration["definition_type"] == "destination":
|
||||
return Destination(api_client, workspace_id, raw_configuration, configuration_path)
|
||||
if raw_configuration["definition_type"] == "connection":
|
||||
return Connection(api_client, workspace_id, raw_configuration, configuration_path)
|
||||
else:
|
||||
raise NotImplementedError(f"Resource {raw_configuration['definition_type']} was not yet implemented")
|
||||
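A minimal usage sketch of this factory (not part of the removed module): the configuration path and the pre-built `api_client`/`workspace_id` values below are assumptions for illustration.

```python
# Sketch: load a local YAML config and dispatch to the right resource class.
# Assumes api_client and workspace_id were built elsewhere (see entrypoint.py below).
source = factory(api_client, workspace_id, "sources/my_source/configuration.yaml")
print(type(source).__name__)  # -> "Source", because the YAML declares definition_type: source
```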
@@ -1,35 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import os
import re
from typing import Any

import yaml

ENV_VAR_MATCHER_PATTERN = re.compile(r".*\$\{([^}^{]+)\}.*")


def env_var_replacer(loader: yaml.Loader, node: yaml.Node) -> Any:
    """Convert a YAML node to a Python object, expanding variables.

    Args:
        loader (yaml.Loader): Not used.
        node (yaml.Node): YAML node to convert to a Python object.

    Returns:
        Any: Python object with expanded vars.
    """
    return os.path.expandvars(node.value)


class EnvVarLoader(yaml.SafeLoader):
    pass


# All YAML nodes matching the regex will be tagged as !environment_variable.
EnvVarLoader.add_implicit_resolver("!environment_variable", ENV_VAR_MATCHER_PATTERN, None)

# All YAML nodes tagged as !environment_variable will be constructed with the env_var_replacer callback.
EnvVarLoader.add_constructor("!environment_variable", env_var_replacer)
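A short sketch of this loader in action; the `MY_PASSWORD` variable and the inline YAML are hypothetical:

```python
import os

import yaml

os.environ["MY_PASSWORD"] = "s3cr3t"  # hypothetical secret, set here for illustration only
raw = "configuration:\n  password: ${MY_PASSWORD}\n"
# Scalars matching ${...} are tagged !environment_variable and expanded by env_var_replacer.
config = yaml.load(raw, EnvVarLoader)
assert config["configuration"]["password"] == "s3cr3t"
```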
@@ -1,56 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import typing as t

import click


class OctaviaCommand(click.Command):
    def make_context(
        self, info_name: t.Optional[str], args: t.List[str], parent: t.Optional[click.Context] = None, **extra: t.Any
    ) -> click.Context:
        """Wrap the parent make_context with telemetry sending in case of failure.

        Args:
            info_name (t.Optional[str]): The info name for this invocation.
            args (t.List[str]): The arguments to parse as a list of strings.
            parent (t.Optional[click.Context], optional): The parent context if available. Defaults to None.

        Raises:
            e: Re-raise whatever exception was caught.

        Returns:
            click.Context: The built context.
        """
        try:
            return super().make_context(info_name, args, parent, **extra)
        except Exception as e:
            telemetry_client = parent.obj["TELEMETRY_CLIENT"]
            if isinstance(e, click.exceptions.Exit) and e.exit_code == 0:  # Click raises Exit(0) errors when running --help commands
                telemetry_client.send_command_telemetry(parent, extra_info_name=info_name, is_help=True)
            else:
                telemetry_client.send_command_telemetry(parent, error=e, extra_info_name=info_name)
            raise e

    def invoke(self, ctx: click.Context) -> t.Any:
        """Wrap the parent invoke by sending telemetry in case of success or failure.

        Args:
            ctx (click.Context): The invocation context.

        Raises:
            e: Re-raise whatever exception was caught.

        Returns:
            t.Any: The invocation return value.
        """
        telemetry_client = ctx.obj["TELEMETRY_CLIENT"]
        try:
            result = super().invoke(ctx)
        except Exception as e:
            telemetry_client.send_command_telemetry(ctx, error=e)
            raise e
        telemetry_client.send_command_telemetry(ctx)
        return result
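A hypothetical sketch of how a command opts into this telemetry wrapper; the `whoami` command is made up, and `ctx.obj["TELEMETRY_CLIENT"]`/`ctx.obj["WORKSPACE_ID"]` are assumed to be populated by the group callback (see entrypoint.py below):

```python
import click

@click.command(cls=OctaviaCommand, name="whoami")
@click.pass_context
def whoami(ctx: click.Context):
    # invoke() sends success/failure telemetry around this body.
    click.echo(ctx.obj["WORKSPACE_ID"])
```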
@@ -1,93 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import os

import airbyte_api_client
import click
from airbyte_api_client.api import health_api, workspace_api
from airbyte_api_client.model.workspace_id_request_body import WorkspaceIdRequestBody
from urllib3.exceptions import MaxRetryError

from .init.commands import DIRECTORIES_TO_CREATE as REQUIRED_PROJECT_DIRECTORIES


class UnhealthyApiError(click.ClickException):
    pass


class UnreachableAirbyteInstanceError(click.ClickException):
    pass


class WorkspaceIdError(click.ClickException):
    pass


class ProjectNotInitializedError(click.ClickException):
    pass


def check_api_health(api_client: airbyte_api_client.ApiClient) -> None:
    """Check if the Airbyte API is network reachable and healthy.

    Args:
        api_client (airbyte_api_client.ApiClient): Airbyte API client.

    Raises:
        UnhealthyApiError: Raised if the Airbyte API server is unavailable according to the API response.
        UnreachableAirbyteInstanceError: Raised if the Airbyte URL is not reachable.
    """
    api_instance = health_api.HealthApi(api_client)
    try:
        api_response = api_instance.get_health_check()
        if not api_response.available:
            raise UnhealthyApiError(
                "Your Airbyte instance is not ready to receive requests: the health endpoint returned 'available: False.'"
            )
    except (airbyte_api_client.ApiException, MaxRetryError) as e:
        raise UnreachableAirbyteInstanceError(
            f"Could not reach your Airbyte instance, make sure the instance is up and running and network reachable: {e}"
        )


def check_workspace_exists(api_client: airbyte_api_client.ApiClient, workspace_id: str) -> None:
    """Check if the provided workspace id corresponds to an existing workspace on the Airbyte instance.

    Args:
        api_client (airbyte_api_client.ApiClient): Airbyte API client.
        workspace_id (str): Id of the workspace whose existence we are trying to verify.

    Raises:
        WorkspaceIdError: Raised if the workspace does not exist on the Airbyte instance.
    """
    api_instance = workspace_api.WorkspaceApi(api_client)
    try:
        api_instance.get_workspace(WorkspaceIdRequestBody(workspace_id=workspace_id), _check_return_type=False)
    except airbyte_api_client.ApiException:
        raise WorkspaceIdError("The workspace you are trying to use does not exist in your Airbyte instance")


def check_is_initialized(project_directory: str = ".") -> bool:
    """Check if the required project directories exist to consider the project as initialized.

    Args:
        project_directory (str, optional): Where the project should be initialized. Defaults to ".".

    Returns:
        bool: Whether the project is initialized.
    """
    sub_directories = [f.name for f in os.scandir(project_directory) if f.is_dir()]
    return set(REQUIRED_PROJECT_DIRECTORIES).issubset(sub_directories)


def requires_init(f):
    def wrapper(ctx, **kwargs):
        if not ctx.obj["PROJECT_IS_INITIALIZED"]:
            raise ProjectNotInitializedError(
                "Your octavia project is not initialized, please run 'octavia init' before running this command."
            )
        f(ctx, **kwargs)

    return wrapper
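A sketch of the decorator guarding a hypothetical `status` command; `PROJECT_IS_INITIALIZED` is assumed to be set on the context object at startup, as the entrypoint below does:

```python
import click

@click.command(name="status")
@click.pass_context
@requires_init
def status(ctx: click.Context):
    # Only reached when the required project directories exist.
    click.echo("Project is initialized.")
```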
@@ -1,187 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

from typing import List, Optional, Tuple

import airbyte_api_client
import click
import pkg_resources
from airbyte_api_client.api import workspace_api
from airbyte_api_client.model.workspace_id_request_body import WorkspaceIdRequestBody

from ._import import commands as import_commands
from .api_http_headers import ApiHttpHeader, merge_api_headers, set_api_headers_on_api_client
from .apply import commands as apply_commands
from .check_context import check_api_health, check_is_initialized, check_workspace_exists
from .generate import commands as generate_commands
from .get import commands as get_commands
from .init import commands as init_commands
from .list import commands as list_commands
from .telemetry import TelemetryClient, build_user_agent

AVAILABLE_COMMANDS: List[click.Command] = [
    list_commands._list,
    get_commands.get,
    import_commands._import,
    init_commands.init,
    generate_commands.generate,
    apply_commands.apply,
]


def set_context_object(
    ctx: click.Context,
    airbyte_url: str,
    airbyte_username: str,
    airbyte_password: str,
    workspace_id: str,
    enable_telemetry: bool,
    option_based_api_http_headers: Optional[List[Tuple[str, str]]],
    api_http_headers_file_path: Optional[str],
) -> click.Context:
    """Fill the context object with resources that will be reused by other commands.
    Performs checks and sends telemetry in case of error.

    Args:
        ctx (click.Context): Current command context.
        airbyte_url (str): The Airbyte instance URL.
        airbyte_username (str): The OSS Airbyte instance username.
        airbyte_password (str): The OSS Airbyte instance password.
        workspace_id (str): The user-defined workspace id.
        enable_telemetry (bool): Whether telemetry should send data.
        option_based_api_http_headers (Optional[List[Tuple[str, str]]]): Option-based headers.
        api_http_headers_file_path (Optional[str]): Path to the YAML file with HTTP headers.

    Raises:
        e: Re-raise whatever error might happen during the execution.

    Returns:
        click.Context: The context with its updated object.
    """
    telemetry_client = TelemetryClient(enable_telemetry)

    try:
        ctx.ensure_object(dict)
        ctx.obj["OCTAVIA_VERSION"] = pkg_resources.require("octavia-cli")[0].version
        ctx.obj["TELEMETRY_CLIENT"] = telemetry_client
        user_agent = build_user_agent(ctx.obj["OCTAVIA_VERSION"])
        api_http_headers = merge_api_headers(option_based_api_http_headers, api_http_headers_file_path)
        api_client = get_api_client(airbyte_url, airbyte_username, airbyte_password, user_agent, api_http_headers)
        ctx.obj["WORKSPACE_ID"] = get_workspace_id(api_client, workspace_id)
        ctx.obj["ANONYMOUS_DATA_COLLECTION"] = get_anonymous_data_collection(api_client, ctx.obj["WORKSPACE_ID"])
        ctx.obj["API_CLIENT"] = api_client
        ctx.obj["PROJECT_IS_INITIALIZED"] = check_is_initialized()
    except Exception as e:
        telemetry_client.send_command_telemetry(ctx, error=e)
        raise e
    return ctx


@click.group()
@click.option("--airbyte-url", envvar="AIRBYTE_URL", default="http://localhost:8000", help="The URL of your Airbyte instance.")
@click.option("--airbyte-username", envvar="AIRBYTE_USERNAME", default="airbyte", help="The username for your Airbyte OSS instance.")
@click.option("--airbyte-password", envvar="AIRBYTE_PASSWORD", default="password", help="The password for your Airbyte OSS instance.")
@click.option(
    "--workspace-id",
    envvar="AIRBYTE_WORKSPACE_ID",
    default=None,
    help="The id of the workspace on which you want octavia-cli to work. Defaults to the first one found on your Airbyte instance.",
)
@click.option(
    "--enable-telemetry/--disable-telemetry",
    envvar="OCTAVIA_ENABLE_TELEMETRY",
    default=True,
    help="Enable or disable telemetry for product improvement.",
    type=bool,
)
@click.option(
    "--api-http-header",
    "-ah",
    "option_based_api_http_headers",
    help='Additional HTTP header name and header value pairs to use when calling Airbyte\'s API, e.g. --api-http-header "Authorization" "Basic dXNlcjpwYXNzd29yZA=="',
    multiple=True,
    nargs=2,
    type=click.Tuple([str, str]),
)
@click.option(
    "--api-http-headers-file-path",
    help=f"Path to the YAML file with API HTTP headers. Please check the {init_commands.API_HTTP_HEADERS_TARGET_PATH} file.",
    type=click.Path(exists=True, readable=True),
)
@click.pass_context
def octavia(
    ctx: click.Context,
    airbyte_url: str,
    airbyte_username: str,
    airbyte_password: str,
    workspace_id: str,
    enable_telemetry: bool,
    option_based_api_http_headers: Optional[List[Tuple[str, str]]] = None,
    api_http_headers_file_path: Optional[str] = None,
) -> None:
    ctx = set_context_object(
        ctx,
        airbyte_url,
        airbyte_username,
        airbyte_password,
        workspace_id,
        enable_telemetry,
        option_based_api_http_headers,
        api_http_headers_file_path,
    )

    click.echo(
        click.style(
            f"🐙 - Octavia is targeting your Airbyte instance running at {airbyte_url} on workspace {ctx.obj['WORKSPACE_ID']}.", fg="green"
        )
    )
    if not ctx.obj["PROJECT_IS_INITIALIZED"]:
        click.echo(click.style("🐙 - Project is not yet initialized.", fg="red", bold=True))


def get_api_client(
    airbyte_url: str, airbyte_username: str, airbyte_password: str, user_agent: str, api_http_headers: Optional[List[ApiHttpHeader]]
):
    client_configuration = airbyte_api_client.Configuration(host=f"{airbyte_url}/api", username=airbyte_username, password=airbyte_password)
    api_client = airbyte_api_client.ApiClient(client_configuration)
    api_client.user_agent = user_agent
    api_http_headers = api_http_headers if api_http_headers else []
    has_existing_authorization_headers = bool([header for header in api_http_headers if header.name.lower() == "authorization"])
    if not has_existing_authorization_headers:
        basic_auth_token = client_configuration.get_basic_auth_token()
        api_http_headers.append(ApiHttpHeader("Authorization", basic_auth_token))
    api_http_headers.append(ApiHttpHeader("X-Airbyte-Analytic-Source", "octavia-cli"))
    set_api_headers_on_api_client(api_client, api_http_headers)
    check_api_health(api_client)
    return api_client


def get_workspace_id(api_client, user_defined_workspace_id):
    if user_defined_workspace_id:
        check_workspace_exists(api_client, user_defined_workspace_id)
        return user_defined_workspace_id
    else:
        api_instance = workspace_api.WorkspaceApi(api_client)
        api_response = api_instance.list_workspaces(_check_return_type=False)
        return api_response.workspaces[0]["workspaceId"]


def get_anonymous_data_collection(api_client, workspace_id):
    api_instance = workspace_api.WorkspaceApi(api_client)
    api_response = api_instance.get_workspace(WorkspaceIdRequestBody(workspace_id), _check_return_type=False)
    return api_response.get("anonymous_data_collection", True)


def add_commands_to_octavia():
    for command in AVAILABLE_COMMANDS:
        octavia.add_command(command)


@octavia.command(help="[NOT IMPLEMENTED] Delete resources")
def delete() -> None:
    raise click.ClickException("The delete command is not yet implemented.")


add_commands_to_octavia()
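For a quick smoke test of the assembled group, one can use Click's test runner (a sketch; `--help` is handled eagerly by Click, so no live Airbyte instance is needed):

```python
from click.testing import CliRunner

runner = CliRunner()
result = runner.invoke(octavia, ["--help"])
assert result.exit_code == 0
print(result.output)  # lists the registered subcommands: init, generate, apply, get, list, import, delete
```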
@@ -1,3 +0,0 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#
@@ -1,78 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import click
import octavia_cli.generate.definitions as definitions
from octavia_cli.apply import resources
from octavia_cli.base_commands import OctaviaCommand
from octavia_cli.check_context import requires_init

from .renderers import ConnectionRenderer, ConnectorSpecificationRenderer


@click.group("generate", help="Generate a YAML template for a source, destination or connection.")
@click.pass_context
@requires_init
def generate(ctx: click.Context):
    pass


def generate_source_or_destination(definition_type, api_client, workspace_id, definition_id, resource_name):
    definition = definitions.factory(definition_type, api_client, workspace_id, definition_id)
    renderer = ConnectorSpecificationRenderer(resource_name, definition)
    output_path = renderer.write_yaml(project_path=".")
    message = f"✅ - Created the {definition_type} template for {resource_name} in {output_path}."
    click.echo(click.style(message, fg="green"))


@generate.command(cls=OctaviaCommand, name="source", help="Create YAML for a source")
@click.argument("definition_id", type=click.STRING)
@click.argument("resource_name", type=click.STRING)
@click.pass_context
def source(ctx: click.Context, definition_id: str, resource_name: str):
    generate_source_or_destination("source", ctx.obj["API_CLIENT"], ctx.obj["WORKSPACE_ID"], definition_id, resource_name)


@generate.command(cls=OctaviaCommand, name="destination", help="Create YAML for a destination")
@click.argument("definition_id", type=click.STRING)
@click.argument("resource_name", type=click.STRING)
@click.pass_context
def destination(ctx: click.Context, definition_id: str, resource_name: str):
    generate_source_or_destination("destination", ctx.obj["API_CLIENT"], ctx.obj["WORKSPACE_ID"], definition_id, resource_name)


@generate.command(cls=OctaviaCommand, name="connection", help="Generate a YAML template for a connection.")
@click.argument("connection_name", type=click.STRING)
@click.option(
    "--source",
    "source_path",
    type=click.Path(exists=True, readable=True),
    required=True,
    help="Path to the YAML file defining your source configuration.",
)
@click.option(
    "--destination",
    "destination_path",
    type=click.Path(exists=True, readable=True),
    required=True,
    help="Path to the YAML file defining your destination configuration.",
)
@click.pass_context
def connection(ctx: click.Context, connection_name: str, source_path: str, destination_path: str):
    source = resources.factory(ctx.obj["API_CLIENT"], ctx.obj["WORKSPACE_ID"], source_path)
    if not source.was_created:
        raise resources.NonExistingResourceError(
            f"The source defined at {source_path} does not exist. Please run octavia apply before creating this connection."
        )

    destination = resources.factory(ctx.obj["API_CLIENT"], ctx.obj["WORKSPACE_ID"], destination_path)
    if not destination.was_created:
        raise resources.NonExistingResourceError(
            f"The destination defined at {destination_path} does not exist. Please run octavia apply before creating this connection."
        )

    connection_renderer = ConnectionRenderer(connection_name, source, destination)
    output_path = connection_renderer.write_yaml(project_path=".")
    message = f"✅ - Created the connection template for {connection_name} in {output_path}."
    click.echo(click.style(message, fg="green"))
@@ -1,153 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import abc
from typing import Any, Callable, Union

import airbyte_api_client
import click
from airbyte_api_client.api import (
    destination_definition_api,
    destination_definition_specification_api,
    source_definition_api,
    source_definition_specification_api,
)
from airbyte_api_client.exceptions import ApiException
from airbyte_api_client.model.destination_definition_id_request_body import DestinationDefinitionIdRequestBody
from airbyte_api_client.model.destination_definition_id_with_workspace_id import DestinationDefinitionIdWithWorkspaceId
from airbyte_api_client.model.source_definition_id_request_body import SourceDefinitionIdRequestBody
from airbyte_api_client.model.source_definition_id_with_workspace_id import SourceDefinitionIdWithWorkspaceId


class DefinitionNotFoundError(click.ClickException):
    pass


class BaseDefinition(abc.ABC):
    COMMON_GET_FUNCTION_KWARGS = {"_check_return_type": False}

    specification = None

    @property
    @abc.abstractmethod
    def api(
        self,
    ):  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def type(
        self,
    ):  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def get_function_name(
        self,
    ):  # pragma: no cover
        pass

    @property
    def _get_fn(self) -> Callable:
        return getattr(self.api, self.get_function_name)

    @property
    def _get_fn_kwargs(self) -> dict:
        return {}

    def __init__(self, api_client: airbyte_api_client.ApiClient, id: str) -> None:
        self.id = id
        self.api_instance = self.api(api_client)
        self._api_data = self._read()

    def _read(self) -> dict:
        try:
            return self._get_fn(self.api_instance, **self._get_fn_kwargs, **self.COMMON_GET_FUNCTION_KWARGS)
        except ApiException as e:
            if e.status in [422, 404]:
                raise DefinitionNotFoundError(f"Definition {self.id} does not exist on your Airbyte instance.")
            raise e

    def __getattr__(self, name: str) -> Any:
        """Map attributes of the API response to the BaseDefinition object.

        Args:
            name (str): Attribute name.

        Raises:
            AttributeError: Raised if the attribute was not found in the API response payload.

        Returns:
            Any: Attribute value.
        """
        if name in self._api_data:
            return self._api_data.get(name)
        raise AttributeError(f"{self.__class__.__name__}.{name} is invalid.")


class ConnectionDefinition(BaseDefinition):
    type = "connection"


class SourceDefinition(BaseDefinition):
    api = source_definition_api.SourceDefinitionApi
    type = "source"
    get_function_name = "get_source_definition"

    @property
    def _get_fn_kwargs(self) -> dict:
        return {"source_definition_id_request_body": SourceDefinitionIdRequestBody(self.id)}


class DestinationDefinition(BaseDefinition):
    api = destination_definition_api.DestinationDefinitionApi
    type = "destination"
    get_function_name = "get_destination_definition"

    @property
    def _get_fn_kwargs(self) -> dict:
        return {"destination_definition_id_request_body": DestinationDefinitionIdRequestBody(self.id)}


class DefinitionSpecification(BaseDefinition):
    def __init__(self, api_client: airbyte_api_client.ApiClient, workspace_id: str, id: str) -> None:
        self.workspace_id = workspace_id
        super().__init__(api_client, id)


class SourceDefinitionSpecification(DefinitionSpecification):
    api = source_definition_specification_api.SourceDefinitionSpecificationApi
    type = "source"
    get_function_name = "get_source_definition_specification"

    @property
    def _get_fn_kwargs(self) -> dict:
        return {"source_definition_id_with_workspace_id": SourceDefinitionIdWithWorkspaceId(self.id, self.workspace_id)}


class DestinationDefinitionSpecification(DefinitionSpecification):
    api = destination_definition_specification_api.DestinationDefinitionSpecificationApi
    type = "destination"
    get_function_name = "get_destination_definition_specification"

    @property
    def _get_fn_kwargs(self) -> dict:
        return {"destination_definition_id_with_workspace_id": DestinationDefinitionIdWithWorkspaceId(self.id, self.workspace_id)}


def factory(
    definition_type: str, api_client: airbyte_api_client.ApiClient, workspace_id: str, definition_id: str
) -> Union[SourceDefinition, DestinationDefinition]:
    if definition_type == "source":
        definition = SourceDefinition(api_client, definition_id)
        specification = SourceDefinitionSpecification(api_client, workspace_id, definition_id)
    elif definition_type == "destination":
        definition = DestinationDefinition(api_client, definition_id)
        specification = DestinationDefinitionSpecification(api_client, workspace_id, definition_id)
    else:
        raise ValueError(f"{definition_type} does not exist")
    definition.specification = specification
    return definition
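A usage sketch of this factory, assuming a reachable instance; the UUID below is only an illustrative `definition_id`, and `api_client`/`workspace_id` come from the entrypoint:

```python
# Fetch a source definition plus its specification in one call (hypothetical id).
definition = factory("source", api_client, workspace_id, "decd338e-5647-4c0b-adf4-da0e75f5a750")
print(definition.docker_repository, definition.docker_image_tag)
# The attached specification exposes the connector's configuration JSON schema:
print(definition.specification.connection_specification)
```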
@@ -1,328 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import abc
import os
from pathlib import Path
from typing import Any, Callable, List

import click
import yaml
from airbyte_api_client.model.airbyte_catalog import AirbyteCatalog
from jinja2 import Environment, PackageLoader, Template, select_autoescape
from octavia_cli.apply import resources
from slugify import slugify

from .definitions import BaseDefinition, ConnectionDefinition
from .yaml_dumpers import CatalogDumper

JINJA_ENV = Environment(loader=PackageLoader(__package__), autoescape=select_autoescape(), trim_blocks=False, lstrip_blocks=True)


class FieldToRender:
    def __init__(self, name: str, required: bool, field_metadata: dict) -> None:
        """Initialize a FieldToRender instance.

        Args:
            name (str): Name of the field.
            required (bool): Whether it's a required field or not.
            field_metadata (dict): Metadata associated with the field.
        """
        self.name = name
        self.required = required
        self.field_metadata = field_metadata
        self.one_of_values = self._get_one_of_values()
        self.object_properties = get_object_fields(field_metadata)
        self.array_items = self._get_array_items()
        self.comment = self._build_comment(
            [
                self._get_secret_comment,
                self._get_required_comment,
                self._get_type_comment,
                self._get_description_comment,
                self._get_example_comment,
            ]
        )
        self.default = self._get_default()

    def __getattr__(self, name: str) -> Any:
        """Map field_metadata keys to attributes of the field.

        Args:
            name (str): Attribute name.

        Returns:
            Any: Attribute value.
        """
        if name in self.field_metadata:
            return self.field_metadata.get(name)

    @property
    def is_array_of_objects(self) -> bool:
        if self.type == "array" and self.items:
            if self.items.get("type") == "object":
                return True
        return False

    def _get_one_of_values(self) -> List[List["FieldToRender"]]:
        """An object field can have multiple kinds of values if it's a oneOf.
        This function returns all the possible oneOf values the field can take.

        Returns:
            List[List["FieldToRender"]]: List of oneOf values.
        """
        if not self.oneOf:
            return []
        one_of_values = []
        for one_of_value in self.oneOf:
            properties = get_object_fields(one_of_value)
            one_of_values.append(properties)
        return one_of_values

    def _get_array_items(self) -> List["FieldToRender"]:
        """If the field is an array of objects, retrieve the fields of these objects.

        Returns:
            List["FieldToRender"]: List of fields.
        """
        if self.is_array_of_objects:
            required_fields = self.items.get("required", [])
            return parse_fields(required_fields, self.items["properties"])
        return []

    def _get_required_comment(self) -> str:
        return "REQUIRED" if self.required else "OPTIONAL"

    def _get_type_comment(self) -> str:
        if isinstance(self.type, list):
            return ", ".join(self.type)
        return self.type if self.type else None

    def _get_secret_comment(self) -> str:
        return "SECRET (please store in environment variables)" if self.airbyte_secret else None

    def _get_description_comment(self) -> str:
        return self.description if self.description else None

    def _get_example_comment(self) -> str:
        example_comment = None
        if self.examples:
            if isinstance(self.examples, list):
                if len(self.examples) > 1:
                    example_comment = f"Examples: {', '.join([str(example) for example in self.examples])}"
                else:
                    example_comment = f"Example: {self.examples[0]}"
            else:
                example_comment = f"Example: {self.examples}"
        return example_comment

    def _get_default(self) -> str:
        if self.const:
            return self.const
        if self.airbyte_secret:
            return f"${{{self.name.upper()}}}"
        return self.default

    @staticmethod
    def _build_comment(comment_functions: List[Callable]) -> str:
        return " | ".join(filter(None, [comment_fn() for comment_fn in comment_functions])).replace("\n", "")


def parse_fields(required_fields: List[str], fields: dict) -> List["FieldToRender"]:
    return [FieldToRender(f_name, f_name in required_fields, f_metadata) for f_name, f_metadata in fields.items()]


def get_object_fields(field_metadata: dict) -> List["FieldToRender"]:
    if field_metadata.get("properties"):
        required_fields = field_metadata.get("required", [])
        return parse_fields(required_fields, field_metadata["properties"])
    return []


class BaseRenderer(abc.ABC):
    @property
    @abc.abstractmethod
    def TEMPLATE(
        self,
    ) -> Template:  # pragma: no cover
        pass

    def __init__(self, resource_name: str) -> None:
        self.resource_name = resource_name

    @classmethod
    def get_output_path(cls, project_path: str, definition_type: str, resource_name: str) -> Path:
        """Get the rendered file output path.

        Args:
            project_path (str): Current project path.
            definition_type (str): Current definition_type.
            resource_name (str): Current resource_name.

        Returns:
            Path: Full path to the output file.
        """
        directory = os.path.join(project_path, f"{definition_type}s", slugify(resource_name, separator="_"))
        if not os.path.exists(directory):
            os.makedirs(directory)
        return Path(os.path.join(directory, "configuration.yaml"))

    @staticmethod
    def _confirm_overwrite(output_path):
        """Ask for user input to determine if the configuration path should be overwritten.

        Args:
            output_path (str): Path of the configuration file to overwrite.

        Returns:
            bool: Whether the configuration file is to be overwritten.
        """
        overwrite = True
        if output_path.is_file():
            overwrite = click.confirm(
                f"The configuration octavia-cli is about to create already exists, do you want to replace it? ({output_path})"
            )
        return overwrite

    @abc.abstractmethod
    def _render(self):  # pragma: no cover
        """Runs the template rendering.

        Raises:
            NotImplementedError: Must be implemented on subclasses.
        """
        raise NotImplementedError

    def write_yaml(self, project_path: Path) -> str:
        """Write the rendered specification to a YAML file in the local project path.

        Args:
            project_path (str): Path to the directory hosting the octavia project.

        Returns:
            str: Path to the rendered specification.
        """
        output_path = self.get_output_path(project_path, self.definition.type, self.resource_name)
        if self._confirm_overwrite(output_path):
            with open(output_path, "w") as f:
                rendered_yaml = self._render()
                f.write(rendered_yaml)
        return output_path

    def import_configuration(self, project_path: str, configuration: dict) -> Path:
        """Import the resource configuration. Save the YAML file to disk and return its path.

        Args:
            project_path (str): Current project path.
            configuration (dict): The configuration of the resource.

        Returns:
            Path: Path to the resource configuration.
        """
        rendered = self._render()
        data = yaml.safe_load(rendered)
        data["configuration"] = configuration
        output_path = self.get_output_path(project_path, self.definition.type, self.resource_name)
        if self._confirm_overwrite(output_path):
            with open(output_path, "wb") as f:
                yaml.safe_dump(data, f, default_flow_style=False, sort_keys=False, allow_unicode=True, encoding="utf-8")
        return output_path


class ConnectorSpecificationRenderer(BaseRenderer):
    TEMPLATE = JINJA_ENV.get_template("source_or_destination.yaml.j2")

    def __init__(self, resource_name: str, definition: BaseDefinition) -> None:
        """Connector specification renderer constructor.

        Args:
            resource_name (str): Name of the source or destination.
            definition (BaseDefinition): The definition related to a source or a destination.
        """
        super().__init__(resource_name)
        self.definition = definition

    def _parse_connection_specification(self, schema: dict) -> List[List["FieldToRender"]]:
        """Create a renderable structure from the specification schema.

        Returns:
            List[List["FieldToRender"]]: List of lists of fields to render.
        """
        if schema.get("oneOf"):
            roots = []
            for one_of_value in schema.get("oneOf"):
                required_fields = one_of_value.get("required", [])
                roots.append(parse_fields(required_fields, one_of_value["properties"]))
            return roots
        else:
            required_fields = schema.get("required", [])
            return [parse_fields(required_fields, schema["properties"])]

    def _render(self) -> str:
        parsed_schema = self._parse_connection_specification(self.definition.specification.connection_specification)
        return self.TEMPLATE.render(
            {"resource_name": self.resource_name, "definition": self.definition, "configuration_fields": parsed_schema}
        )


class ConnectionRenderer(BaseRenderer):
    TEMPLATE = JINJA_ENV.get_template("connection.yaml.j2")
    definition = ConnectionDefinition
    KEYS_TO_REMOVE_FROM_REMOTE_CONFIGURATION = [
        "connection_id",
        "name",
        "source_id",
        "destination_id",
        "latest_sync_job_created_at",
        "latest_sync_job_status",
        "source",
        "destination",
        "is_syncing",
        "operation_ids",
        "catalog_id",
        "catalog_diff",
    ]

    def __init__(self, connection_name: str, source: resources.Source, destination: resources.Destination) -> None:
        """Connection renderer constructor.

        Args:
            connection_name (str): Name of the connection to render.
            source (resources.Source): Connection's source.
            destination (resources.Destination): Connection's destination.
        """
        super().__init__(connection_name)
        self.source = source
        self.destination = destination

    @staticmethod
    def catalog_to_yaml(catalog: AirbyteCatalog) -> str:
        """Convert the source catalog to a YAML string.

        Args:
            catalog (AirbyteCatalog): Source's catalog.

        Returns:
            str: Catalog rendered as YAML.
        """
        return yaml.dump(catalog.to_dict(), Dumper=CatalogDumper, default_flow_style=False)

    def _render(self) -> str:
        yaml_catalog = self.catalog_to_yaml(self.source.catalog)
        return self.TEMPLATE.render(
            {
                "connection_name": self.resource_name,
                "source_configuration_path": self.source.configuration_path,
                "destination_configuration_path": self.destination.configuration_path,
                "catalog": yaml_catalog,
                "supports_normalization": self.destination.definition.normalization_config.supported,
                "supports_dbt": self.destination.definition.supports_dbt,
            }
        )

    def import_configuration(self, project_path: Path, configuration: dict) -> Path:
        """Import the connection configuration. Save the YAML file to disk and return its path.

        Args:
            project_path (str): Current project path.
            configuration (dict): The configuration of the connection.

        Returns:
            Path: Path to the connection configuration.
        """
        rendered = self._render()
        data = yaml.safe_load(rendered)
        data["configuration"] = {k: v for k, v in configuration.items() if k not in self.KEYS_TO_REMOVE_FROM_REMOTE_CONFIGURATION}
        if "operations" in data["configuration"] and len(data["configuration"]["operations"]) == 0:
            data["configuration"].pop("operations")
        [
            operation.pop(field_to_remove, "")
            for field_to_remove in ["workspace_id", "operation_id"]
            for operation in data["configuration"].get("operations", {})
        ]
        output_path = self.get_output_path(project_path, self.definition.type, self.resource_name)
        if self._confirm_overwrite(output_path):
            with open(output_path, "wb") as f:
                yaml.safe_dump(data, f, default_flow_style=False, sort_keys=False, allow_unicode=True, encoding="utf-8")
        return output_path
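Tying the two renderers together, a sketch with hypothetical names; `definition` comes from `definitions.factory` above, and `source`/`destination` are already-applied resources:

```python
# Render a connector template, then a connection template, into the local project.
spec_renderer = ConnectorSpecificationRenderer("my_postgres_source", definition)
spec_path = spec_renderer.write_yaml(project_path=".")  # -> ./sources/my_postgres_source/configuration.yaml

# ConnectionRenderer expects Source/Destination resources that already exist remotely.
connection_renderer = ConnectionRenderer("my_connection", source, destination)
connection_path = connection_renderer.write_yaml(project_path=".")  # -> ./connections/my_connection/configuration.yaml
```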
@@ -1,50 +0,0 @@
# Configuration for connection {{ connection_name }}
definition_type: connection
resource_name: "{{ connection_name }}"
source_configuration_path: {{ source_configuration_path }}
destination_configuration_path: {{ destination_configuration_path }}

# EDIT THE CONFIGURATION BELOW!
configuration:
  status: active # REQUIRED | string | Allowed values: active, inactive, deprecated
  skip_reset: false # OPTIONAL | boolean | Flag to check if the connection should be reset after a connection update
  namespace_definition: source # OPTIONAL | string | Allowed values: source, destination, customformat
  namespace_format: "${SOURCE_NAMESPACE}" # OPTIONAL | string | Used when namespaceDefinition is 'customformat'. If blank then behaves like namespaceDefinition = 'destination'. If "${SOURCE_NAMESPACE}" then behaves like namespaceDefinition = 'source'.
  prefix: "" # REQUIRED | Prefix that will be prepended to the name of each stream when it is written to the destination
  resource_requirements: # OPTIONAL | object | Resource requirements to run workers (blank for unbounded allocations)
    cpu_limit: "" # OPTIONAL
    cpu_request: "" # OPTIONAL
    memory_limit: "" # OPTIONAL
    memory_request: "" # OPTIONAL
  schedule_type: basic # OPTIONAL | string | Allowed values: basic, cron, manual
  schedule_data: # OPTIONAL | object
    basic_schedule:
      time_unit: hours # REQUIRED | string | Allowed values: minutes, hours, days, weeks, months
      units: 1 # REQUIRED | integer
    # cron:
    #   cron_time_zone: "UTC" # REQUIRED | string
    #   cron_expression: "* */2 * * * ?" # REQUIRED | string
  {%- if supports_normalization or supports_dbt %}
  # operations:
  {%- endif %}
  {%- if supports_normalization %}
  ## -------- Uncomment and edit the block below if you want to enable Airbyte normalization --------
  # - name: "Normalization"
  #   operator_configuration:
  #     normalization:
  #       option: "basic"
  #     operator_type: "normalization"
  {%- endif %}
  {%- if supports_dbt %}
  ## -------- Uncomment and edit the block below if you want to declare a custom transformation --------
  # - name: "My dbt transformations" # REQUIRED | string
  #   operator_configuration:
  #     dbt:
  #       dbt_arguments: "run" # REQUIRED | string | Entrypoint arguments for dbt cli to run the project
  #       docker_image: "fishtownanalytics/dbt:0.19.1" # REQUIRED | string | Docker image URL with dbt installed
  #       git_repo_branch: "your-repo-branch-name" # OPTIONAL | string | Git branch name
  #       git_repo_url: "https://github.com/<your git repo>" # REQUIRED | string | Git repository URL of the custom transformation project
  #     operator_type: dbt # REQUIRED | string | Allowed values: dbt, normalization
  {%- endif %}
  sync_catalog: # OPTIONAL | object | 🚨 ONLY edit streams.config, streams.stream should not be edited as the schema cannot be changed.
    {{ catalog | indent(4) }}
@@ -1,79 +0,0 @@
# Configuration for {{ definition.docker_repository }}
# Documentation about this connector can be found at {{ definition.documentation_url }}
resource_name: "{{ resource_name }}"
definition_type: {{ definition.type }}
definition_id: {{ definition.id }}
definition_image: {{ definition.docker_repository }}
definition_version: {{ definition.docker_image_tag }}

{%- macro render_field(field, is_commented) %}
{%- if is_commented %}# {% endif %}{{ field.name }}:{% if field.default %} {% if field.airbyte_secret %}{{ field.default }}{% else %}{{ field.default | tojson() }}{% endif %}{% endif %} # {{ field.comment }}
{%- endmacro %}

{%- macro render_sub_fields(sub_fields, is_commented) %}
{%- for f in sub_fields %}
{%- if f.type == "object" and not f.oneOf %}
{{- render_object_field(f)|indent(2, False) }}
{%- elif f.oneOf %}
{{- render_one_of(f) }}
{%- elif f.is_array_of_objects %}
{{- render_array_of_objects(f) }}
{%- else %}
{{ render_field(f, is_commented) }}
{%- endif %}
{%- endfor %}
{%- endmacro %}

{%- macro render_array_sub_fields(sub_fields, is_commented) %}
{%- for f in sub_fields %}
{% if loop.first %}- {% else %}  {% endif %}{{ render_field(f, is_commented) }}
{%- endfor %}
{%- endmacro %}

{%- macro render_one_of(field) %}
{{ field.name }}:
{%- for one_of_value in field.one_of_values %}
{%- if loop.first %}
  ## -------- Pick one valid structure among the examples below: --------
{{- render_sub_fields(one_of_value, False)|indent(2, False) }}
{%- else %}
  ## -------- Another valid structure for {{ field.name }}: --------
{{- render_sub_fields(one_of_value, True)|indent(2, False) }}
{%- endif %}
{%- endfor %}
{%- endmacro %}

{%- macro render_object_field(field) %}
{{ field.name }}:
{{- render_sub_fields(field.object_properties, is_commented=False)|indent(2, False) }}
{%- endmacro %}

{%- macro render_array_of_objects(field) %}
{{ field.name }}:
{{- render_array_sub_fields(field.array_items, is_commented=False)|indent(2, False) }}
{%- endmacro %}

{%- macro render_root(root, is_commented) %}
{%- for f in root %}
{%- if f.type == "object" and not f.oneOf %}
{{- render_object_field(f)|indent(2, False) }}
{%- elif f.oneOf %}
{{- render_one_of(f)|indent(2, False) }}
{%- elif f.is_array_of_objects %}
{{- render_array_of_objects(f)|indent(2, False) }}
{%- else %}
{{ render_field(f, is_commented=is_commented) }}
{%- endif %}
{%- endfor %}
{%- endmacro %}

# EDIT THE CONFIGURATION BELOW!
configuration:
{%- for root in configuration_fields %}
{%- if loop.first %}
{{- render_root(root, is_commented=False) }}
{%- else %}
{{- render_root(root, is_commented=True) }}
{%- endif %}
{% endfor %}
@@ -1,23 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import yaml


# This custom Dumper allows the list indentation expected by our prettier formatter:
# Normal dumper behavior:
# my_list:
# - bar: test2
#   foo: test
# - bar: test4
#   foo: test3
# Custom behavior to match prettier's rules:
# my_list:
#   - bar: test2
#     foo: test
#   - bar: test4
#     foo: test3
class CatalogDumper(yaml.Dumper):
    def increase_indent(self, flow=False, indentless=False):
        return super(CatalogDumper, self).increase_indent(flow, False)
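A sketch showing the indentation difference on hypothetical data:

```python
import yaml

data = {"my_list": [{"bar": "test2", "foo": "test"}, {"bar": "test4", "foo": "test3"}]}
print(yaml.dump(data, Dumper=CatalogDumper, default_flow_style=False))
# my_list:
#   - bar: test2
#     foo: test
#   - bar: test4
#     foo: test3
```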
@@ -1,3 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#
@@ -1,108 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import uuid
from typing import List, Optional, Tuple, Type, Union

import airbyte_api_client
import click
from octavia_cli.base_commands import OctaviaCommand

from .resources import Connection, Destination, Source

COMMON_HELP_MESSAGE_PREFIX = "Get a JSON representation of a remote"


def build_help_message(resource_type: str) -> str:
    """Helper function to build help messages consistently for all the commands in this module.

    Args:
        resource_type (str): source, destination or connection.

    Returns:
        str: The generated help message.
    """
    return f"{COMMON_HELP_MESSAGE_PREFIX} {resource_type}."


def get_resource_id_or_name(resource: str) -> Tuple[Optional[str], Optional[str]]:
    """Helper function to detect if the resource argument passed to the CLI is a resource ID or name.

    Args:
        resource (str): The resource ID or name passed as an argument to the CLI.

    Returns:
        Tuple[Optional[str], Optional[str]]: The resource_id and resource_name; the kind that was not detected is set to None.
    """
    resource_id, resource_name = None, None
    try:
        uuid.UUID(resource)
        resource_id = resource
    except ValueError:
        resource_name = resource
    return resource_id, resource_name


def get_json_representation(
    api_client: airbyte_api_client.ApiClient,
    workspace_id: str,
    ResourceClass: Type[Union[Source, Destination, Connection]],
    resource_to_get: str,
) -> str:
    """Helper function to retrieve a resource's JSON representation without repeating the same logic for sources, destinations and connections.

    Args:
        api_client (airbyte_api_client.ApiClient): The Airbyte API client.
        workspace_id (str): Current workspace id.
        ResourceClass (Type[Union[Source, Destination, Connection]]): Resource class to use.
        resource_to_get (str): Resource name or id to get the JSON representation for.

    Returns:
        str: The resource's JSON representation.
    """
    resource_id, resource_name = get_resource_id_or_name(resource_to_get)
    resource = ResourceClass(api_client, workspace_id, resource_id=resource_id, resource_name=resource_name)
    return resource.to_json()


@click.group(
    "get",
    help=f'{build_help_message("source, destination or connection")} ID or name can be used as argument. Example: \'octavia get source "My Pokemon source"\' or \'octavia get source cb5413b2-4159-46a2-910a-dc282a439d2d\'',
)
@click.pass_context
def get(ctx: click.Context):  # pragma: no cover
    pass


@get.command(cls=OctaviaCommand, name="source", help=build_help_message("source"))
@click.argument("resource", type=click.STRING)
@click.pass_context
def source(ctx: click.Context, resource: str):
    click.echo(get_json_representation(ctx.obj["API_CLIENT"], ctx.obj["WORKSPACE_ID"], Source, resource))


@get.command(cls=OctaviaCommand, name="destination", help=build_help_message("destination"))
@click.argument("resource", type=click.STRING)
@click.pass_context
def destination(ctx: click.Context, resource: str):
    click.echo(get_json_representation(ctx.obj["API_CLIENT"], ctx.obj["WORKSPACE_ID"], Destination, resource))


@get.command(cls=OctaviaCommand, name="connection", help=build_help_message("connection"))
@click.argument("resource", type=click.STRING)
@click.pass_context
def connection(ctx: click.Context, resource: str):
    click.echo(get_json_representation(ctx.obj["API_CLIENT"], ctx.obj["WORKSPACE_ID"], Connection, resource))


AVAILABLE_COMMANDS: List[click.Command] = [source, destination, connection]


def add_commands_to_list():
    for command in AVAILABLE_COMMANDS:
        get.add_command(command)


add_commands_to_list()
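The UUID heuristic in action (hypothetical values):

```python
assert get_resource_id_or_name("cb5413b2-4159-46a2-910a-dc282a439d2d") == ("cb5413b2-4159-46a2-910a-dc282a439d2d", None)
assert get_resource_id_or_name("My Pokemon source") == (None, "My Pokemon source")
```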
@@ -1,193 +0,0 @@
|
||||
#
|
||||
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
|
||||
#
|
||||
|
||||
import abc
|
||||
import json
|
||||
from typing import Optional, Union
|
||||
|
||||
import airbyte_api_client
|
||||
import click
|
||||
from airbyte_api_client.api import destination_api, source_api, web_backend_api
|
||||
from airbyte_api_client.model.destination_id_request_body import DestinationIdRequestBody
|
from airbyte_api_client.model.destination_read import DestinationRead
from airbyte_api_client.model.source_id_request_body import SourceIdRequestBody
from airbyte_api_client.model.source_read import SourceRead
from airbyte_api_client.model.web_backend_connection_read import WebBackendConnectionRead
from airbyte_api_client.model.web_backend_connection_request_body import WebBackendConnectionRequestBody
from airbyte_api_client.model.workspace_id_request_body import WorkspaceIdRequestBody


class DuplicateResourceError(click.ClickException):
    pass


class ResourceNotFoundError(click.ClickException):
    pass


class BaseResource(abc.ABC):
    @property
    @abc.abstractmethod
    def api(
        self,
    ):  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def name(
        self,
    ) -> str:  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def get_function_name(
        self,
    ) -> str:  # pragma: no cover
        pass

    @property
    def _get_fn(self):
        return getattr(self.api, self.get_function_name)

    @property
    @abc.abstractmethod
    def get_payload(
        self,
    ):  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def list_for_workspace_function_name(
        self,
    ) -> str:  # pragma: no cover
        pass

    @property
    def _list_for_workspace_fn(self):
        return getattr(self.api, self.list_for_workspace_function_name)

    @property
    def list_for_workspace_payload(
        self,
    ):
        return WorkspaceIdRequestBody(workspace_id=self.workspace_id)

    def __init__(
        self,
        api_client: airbyte_api_client.ApiClient,
        workspace_id: str,
        resource_id: Optional[str] = None,
        resource_name: Optional[str] = None,
    ):
        if resource_id is None and resource_name is None:
            raise ValueError("resource_id and resource_name keyword arguments can't be both None.")
        if resource_id is not None and resource_name is not None:
            raise ValueError("resource_id and resource_name keyword arguments can't be both set.")
        self.resource_id = resource_id
        self.resource_name = resource_name
        self.api_instance = self.api(api_client)
        self.workspace_id = workspace_id

    def _find_by_resource_name(
        self,
    ) -> Union[WebBackendConnectionRead, SourceRead, DestinationRead]:
        """Retrieve a remote resource from its name by listing the available resources on the Airbyte instance.

        Raises:
            ResourceNotFoundError: Raised if no resource was found with the current resource_name.
            DuplicateResourceError: Raised if multiple resources were found with the current resource_name.

        Returns:
            Union[WebBackendConnectionRead, SourceRead, DestinationRead]: The remote resource model instance.
        """

        api_response = self._list_for_workspace_fn(self.api_instance, self.list_for_workspace_payload)
        matching_resources = []
        for resource in getattr(api_response, f"{self.name}s"):
            if resource.name == self.resource_name:
                matching_resources.append(resource)
        if not matching_resources:
            raise ResourceNotFoundError(f"The {self.name} {self.resource_name} was not found in your current Airbyte workspace.")
        if len(matching_resources) > 1:
            raise DuplicateResourceError(
                f"{len(matching_resources)} {self.name}s with the name {self.resource_name} were found in your current Airbyte workspace."
            )
        return matching_resources[0]

    def _find_by_resource_id(
        self,
    ) -> Union[WebBackendConnectionRead, SourceRead, DestinationRead]:
        """Retrieve a remote resource from its id by calling the get endpoint of the resource type.

        Returns:
            Union[WebBackendConnectionRead, SourceRead, DestinationRead]: The remote resource model instance.
        """
        return self._get_fn(self.api_instance, self.get_payload)

    def get_remote_resource(self) -> Union[WebBackendConnectionRead, SourceRead, DestinationRead]:
        """Retrieve a remote resource with a resource_name or a resource_id

        Returns:
            Union[WebBackendConnectionRead, SourceRead, DestinationRead]: The remote resource model instance.
        """
        if self.resource_id is not None:
            return self._find_by_resource_id()
        else:
            return self._find_by_resource_name()

    def to_json(self) -> str:
        """Get the JSON representation of the remote resource model instance.

        Returns:
            str: The JSON representation of the remote resource model instance.
        """
        return json.dumps(self.get_remote_resource().to_dict())


class Source(BaseResource):
    name = "source"
    api = source_api.SourceApi
    get_function_name = "get_source"
    list_for_workspace_function_name = "list_sources_for_workspace"

    @property
    def get_payload(self) -> Optional[SourceIdRequestBody]:
        """Defines the payload to retrieve the remote source according to its resource_id.

        Returns:
            SourceIdRequestBody: The SourceIdRequestBody payload.
        """
        return SourceIdRequestBody(self.resource_id)


class Destination(BaseResource):
    name = "destination"
    api = destination_api.DestinationApi
    get_function_name = "get_destination"
    list_for_workspace_function_name = "list_destinations_for_workspace"

    @property
    def get_payload(self) -> Optional[DestinationIdRequestBody]:
        """Defines the payload to retrieve the remote destination according to its resource_id.

        Returns:
            DestinationIdRequestBody: The DestinationIdRequestBody payload.
        """
        return DestinationIdRequestBody(self.resource_id)


class Connection(BaseResource):
    name = "connection"
    api = web_backend_api.WebBackendApi
    get_function_name = "web_backend_get_connection"
    list_for_workspace_function_name = "web_backend_list_connections_for_workspace"

    @property
    def get_payload(self) -> Optional[WebBackendConnectionRequestBody]:
        """Defines the payload to retrieve the remote connection according to its resource_id.

        Returns:
            WebBackendConnectionRequestBody: The WebBackendConnectionRequestBody payload.
        """
        return WebBackendConnectionRequestBody(with_refreshed_catalog=False, connection_id=self.resource_id)
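For context, here is a minimal, hypothetical sketch of how these deleted `get` resources were used; the host, workspace id, and source name are illustrative and not from this diff:

```python
# Hypothetical usage sketch of the deleted Source class, assuming a local Airbyte instance.
import airbyte_api_client
from octavia_cli.get.resources import Source

configuration = airbyte_api_client.Configuration(host="http://localhost:8000/api")  # assumed host
with airbyte_api_client.ApiClient(configuration) as api_client:
    # Name-based lookup goes through list_sources_for_workspace and raises
    # ResourceNotFoundError / DuplicateResourceError on zero or multiple matches.
    source = Source(api_client, workspace_id="my-workspace-id", resource_name="my-postgres-source")
    print(source.to_json())
```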
@@ -1,3 +0,0 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#
@@ -1,58 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import importlib.resources as pkg_resources
import os
from pathlib import Path
from typing import Iterable, Tuple

import click
from octavia_cli.base_commands import OctaviaCommand

from . import example_files

DIRECTORIES_TO_CREATE = {"connections", "destinations", "sources"}
DEFAULT_API_HEADERS_FILE_CONTENT = pkg_resources.read_text(example_files, "example_api_http_headers.yaml")
API_HTTP_HEADERS_TARGET_PATH = Path("api_http_headers.yaml")


def create_api_headers_configuration_file() -> bool:
    if not API_HTTP_HEADERS_TARGET_PATH.is_file():
        with open(API_HTTP_HEADERS_TARGET_PATH, "w") as file:
            file.write(DEFAULT_API_HEADERS_FILE_CONTENT)
        return True
    return False


def create_directories(directories_to_create: Iterable[str]) -> Tuple[Iterable[str], Iterable[str]]:
    created_directories = []
    not_created_directories = []
    for directory in directories_to_create:
        try:
            os.mkdir(directory)
            created_directories.append(directory)
        except FileExistsError:
            not_created_directories.append(directory)
    return created_directories, not_created_directories


@click.command(cls=OctaviaCommand, help="Initialize required directories for the project.")
@click.pass_context
def init(ctx: click.Context):
    click.echo("🔨 - Initializing the project.")
    created_directories, not_created_directories = create_directories(DIRECTORIES_TO_CREATE)
    if created_directories:
        message = f"✅ - Created the following directories: {', '.join(created_directories)}."
        click.echo(click.style(message, fg="green"))
    if not_created_directories:
        message = f"❓ - Already existing directories: {', '.join(not_created_directories)}."
        click.echo(click.style(message, fg="yellow", bold=True))

    created_api_http_headers_file = create_api_headers_configuration_file()
    if created_api_http_headers_file:
        message = f"✅ - Created API HTTP headers file in {API_HTTP_HEADERS_TARGET_PATH}"
        click.echo(click.style(message, fg="green", bold=True))
    else:
        message = "❓ - API HTTP headers file already exists, skipping."
        click.echo(click.style(message, fg="yellow", bold=True))
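A quick illustration of the idempotent scaffolding above (a sketch, not part of the removed file): `create_directories` reports created and already-existing directories separately, so rerunning `octavia init` never fails.

```python
from octavia_cli.init.commands import create_directories

# On a rerun, everything lands in the second list instead of raising.
created, already_existing = create_directories({"connections", "destinations", "sources"})
print(f"created: {created}, skipped: {already_existing}")
```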
@@ -1,3 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#
@@ -1,4 +0,0 @@
# This is an example file of API HTTP headers to pass to the octavia CLI API client.
# It can be helpful for reaching secured Airbyte instances (e.g. behind a proxy auth server).
headers:
  Content-Type: application/json
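As the unit tests further down in this diff show, header values support `${ENV_VAR}` interpolation. A hedged sketch of loading this file with the CLI's own helper (the import path is taken from those tests):

```python
from octavia_cli.api_http_headers import deserialize_file_based_headers

headers = deserialize_file_based_headers("api_http_headers.yaml")
# -> a list containing ApiHttpHeader("Content-Type", "application/json")
```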
@@ -1,3 +0,0 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#
@@ -1,84 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

from typing import List

import click
from octavia_cli.base_commands import OctaviaCommand

from .listings import Connections, DestinationConnectorsDefinitions, Destinations, SourceConnectorsDefinitions, Sources


@click.group("list", help="List existing Airbyte resources.")
@click.pass_context
def _list(ctx: click.Context):  # pragma: no cover
    pass


@click.group("connectors", help="List source and destination connectors available on your Airbyte instance.")
@click.pass_context
def connectors(ctx: click.Context):  # pragma: no cover
    pass


@click.group("workspace", help="Latest information on workspace's sources and destinations.")
@click.pass_context
def workspace(ctx: click.Context):  # pragma: no cover
    pass


@connectors.command(cls=OctaviaCommand, name="sources", help="List all the source connectors currently available on your Airbyte instance.")
@click.pass_context
def sources_connectors(ctx: click.Context):
    api_client = ctx.obj["API_CLIENT"]
    definitions = SourceConnectorsDefinitions(api_client)
    click.echo(definitions)


@connectors.command(
    cls=OctaviaCommand, name="destinations", help="List all the destination connectors currently available on your Airbyte instance."
)
@click.pass_context
def destinations_connectors(ctx: click.Context):
    api_client = ctx.obj["API_CLIENT"]
    definitions = DestinationConnectorsDefinitions(api_client)
    click.echo(definitions)


@workspace.command(cls=OctaviaCommand, help="List existing sources in a workspace.")
@click.pass_context
def sources(ctx: click.Context):
    api_client = ctx.obj["API_CLIENT"]
    workspace_id = ctx.obj["WORKSPACE_ID"]
    sources = Sources(api_client, workspace_id)
    click.echo(sources)


@workspace.command(cls=OctaviaCommand, help="List existing destinations in a workspace.")
@click.pass_context
def destinations(ctx: click.Context):
    api_client = ctx.obj["API_CLIENT"]
    workspace_id = ctx.obj["WORKSPACE_ID"]
    destinations = Destinations(api_client, workspace_id)
    click.echo(destinations)


@workspace.command(cls=OctaviaCommand, help="List existing connections in a workspace.")
@click.pass_context
def connections(ctx: click.Context):
    api_client = ctx.obj["API_CLIENT"]
    workspace_id = ctx.obj["WORKSPACE_ID"]
    connections = Connections(api_client, workspace_id)
    click.echo(connections)


AVAILABLE_COMMANDS: List[click.Command] = [connectors, workspace]


def add_commands_to_list():
    for command in AVAILABLE_COMMANDS:
        _list.add_command(command)


add_commands_to_list()
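Wired together, the groups above produced invocations such as `octavia list connectors sources`, `octavia list connectors destinations`, and `octavia list workspace connections`, each resolving the API client and workspace id from the click context object.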
@@ -1,59 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

from typing import List


def compute_columns_width(data: List[List[str]], padding: int = 2) -> List[int]:
    """Compute column widths for display purposes:
    find the size of each column in the data and add padding.

    Args:
        data (List[List[str]]): Tabular data containing rows and columns.
        padding (int): Number of characters to add to create space between columns.

    Returns:
        columns_width (List[int]): The computed column widths for each column according to input data.
    """
    columns_width = [0 for _ in data[0]]
    for row in data:
        for i, col in enumerate(row):
            current_col_width = len(col) + padding
            if current_col_width > columns_width[i]:
                columns_width[i] = current_col_width
    return columns_width


def camelcased_to_uppercased_spaced(camelcased: str) -> str:
    """Util function to transform a camelCase string to an UPPERCASED SPACED string,
    e.g: dockerImageName -> DOCKER IMAGE NAME

    Args:
        camelcased (str): The camel cased string to convert.

    Returns:
        (str): The converted UPPERCASED SPACED string.
    """
    return "".join(map(lambda x: x if x.islower() else " " + x, camelcased)).upper()


def display_as_table(data: List[List[str]]) -> str:
    """Formats tabular input data into a displayable table with columns.

    Args:
        data (List[List[str]]): Tabular data containing rows and columns.

    Returns:
        table (str): String representation of input tabular data.
    """
    columns_width = compute_columns_width(data)
    table = "\n".join(["".join(col.ljust(columns_width[i]) for i, col in enumerate(row)) for row in data])
    return table


def format_column_names(camelcased_column_names: List[str]) -> List[str]:
    """Format camel cased column names to uppercased spaced column names.

    Args:
        camelcased_column_names (List[str]): Column names in camel case.

    Returns:
        (List[str]): Column names in uppercase with spaces.
    """
    return [camelcased_to_uppercased_spaced(column_name) for column_name in camelcased_column_names]
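A small, self-contained illustration of these helpers (the sample data is made up):

```python
from octavia_cli.list.formatting import display_as_table, format_column_names

header = format_column_names(["name", "dockerRepository"])  # -> ["NAME", "DOCKER REPOSITORY"]
rows = [["pokeapi", "airbyte/source-pokeapi"]]
print(display_as_table([header] + rows))
# NAME     DOCKER REPOSITORY
# pokeapi  airbyte/source-pokeapi
```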
@@ -1,111 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import abc
from typing import List

import airbyte_api_client
import octavia_cli.list.formatting as formatting
from airbyte_api_client.api import connection_api, destination_api, destination_definition_api, source_api, source_definition_api
from airbyte_api_client.model.workspace_id_request_body import WorkspaceIdRequestBody


class BaseListing(abc.ABC):
    COMMON_LIST_FUNCTION_KWARGS = {"_check_return_type": False}

    @property
    @abc.abstractmethod
    def api(
        self,
    ):  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def fields_to_display(
        self,
    ) -> List[str]:  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def list_field_in_response(
        self,
    ) -> str:  # pragma: no cover
        pass

    @property
    @abc.abstractmethod
    def list_function_name(
        self,
    ) -> str:  # pragma: no cover
        pass

    @property
    def _list_fn(self):
        return getattr(self.api, self.list_function_name)

    @property
    def list_function_kwargs(self) -> dict:
        return {}

    def __init__(self, api_client: airbyte_api_client.ApiClient):
        self.api_instance = self.api(api_client)

    def _parse_response(self, api_response) -> List[List[str]]:
        items = [[item[field] for field in self.fields_to_display] for item in api_response[self.list_field_in_response]]
        return items

    def get_listing(self) -> List[List[str]]:
        api_response = self._list_fn(self.api_instance, **self.list_function_kwargs, **self.COMMON_LIST_FUNCTION_KWARGS)
        return self._parse_response(api_response)

    def __repr__(self):
        items = [formatting.format_column_names(self.fields_to_display)] + self.get_listing()
        return formatting.display_as_table(items)


class SourceConnectorsDefinitions(BaseListing):
    api = source_definition_api.SourceDefinitionApi
    fields_to_display = ["name", "dockerRepository", "dockerImageTag", "sourceDefinitionId"]
    list_field_in_response = "source_definitions"
    list_function_name = "list_source_definitions"


class DestinationConnectorsDefinitions(BaseListing):
    api = destination_definition_api.DestinationDefinitionApi
    fields_to_display = ["name", "dockerRepository", "dockerImageTag", "destinationDefinitionId"]
    list_field_in_response = "destination_definitions"
    list_function_name = "list_destination_definitions"


class WorkspaceListing(BaseListing, abc.ABC):
    def __init__(self, api_client: airbyte_api_client.ApiClient, workspace_id: str):
        self.workspace_id = workspace_id
        super().__init__(api_client)

    @property
    def list_function_kwargs(self) -> dict:
        return {"workspace_id_request_body": WorkspaceIdRequestBody(workspace_id=self.workspace_id)}


class Sources(WorkspaceListing):
    api = source_api.SourceApi
    fields_to_display = ["name", "sourceName", "sourceId"]
    list_field_in_response = "sources"
    list_function_name = "list_sources_for_workspace"


class Destinations(WorkspaceListing):
    api = destination_api.DestinationApi
    fields_to_display = ["name", "destinationName", "destinationId"]
    list_field_in_response = "destinations"
    list_function_name = "list_destinations_for_workspace"


class Connections(WorkspaceListing):
    api = connection_api.ConnectionApi
    fields_to_display = ["name", "connectionId", "status", "sourceId", "destinationId"]
    list_field_in_response = "connections"
    list_function_name = "list_connections_for_workspace"
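Each listing renders itself as a table via `__repr__`, so printing an instance was the whole user-facing story. A hypothetical sketch, assuming a reachable local instance:

```python
import airbyte_api_client
from octavia_cli.list.listings import Sources

configuration = airbyte_api_client.Configuration(host="http://localhost:8000/api")  # illustrative host
with airbyte_api_client.ApiClient(configuration) as api_client:
    print(Sources(api_client, workspace_id="my-workspace-id"))  # NAME  SOURCE NAME  SOURCE ID
```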
@@ -1,91 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import os
from typing import Optional

import analytics
import click


def build_user_agent(octavia_version: str) -> str:
    """Build user-agent for the API client according to octavia version.

    Args:
        octavia_version (str): Current octavia version.

    Returns:
        str: the user-agent string.
    """
    return f"octavia-cli/{octavia_version}"


class TelemetryClient:

    WRITE_KEY = "ER8EjdRVFut7n05XPaaTKrSEnjLscyKr"

    def __init__(self, send_data: bool = False) -> None:
        """Create a TelemetryClient instance.

        Args:
            send_data (bool, optional): Whether the telemetry should be sent. Defaults to False.
        """
        self.segment_client = analytics.Client(self.write_key, send=send_data)

    @property
    def write_key(self) -> str:
        """Retrieve the write key according to the environment.
        Developers can set the OCTAVIA_TELEMETRY_WRITE_KEY env var to send telemetry to another Segment source.

        Returns:
            str: The write key to use with the analytics client.
        """
        return os.getenv("OCTAVIA_TELEMETRY_WRITE_KEY", TelemetryClient.WRITE_KEY)

    def _create_command_name(self, ctx: click.Context, command_names: Optional[list] = None, extra_info_name: Optional[str] = None) -> str:
        """Build the full command name by concatenating the info names of the context and its parents.

        Args:
            ctx (click.Context): The click context from which we want to build the command name.
            command_names (Optional[list], optional): Previously built command names (used for recursion). Defaults to None.
            extra_info_name (Optional[str], optional): Extra info name if the context was not built yet. Defaults to None.

        Returns:
            str: The full command name.
        """
        if command_names is None:
            command_names = [ctx.info_name]
        else:
            command_names.insert(0, ctx.info_name)
        if ctx.parent is not None:
            self._create_command_name(ctx.parent, command_names)
        return " ".join(command_names) if not extra_info_name else " ".join(command_names + [extra_info_name])

    def send_command_telemetry(
        self, ctx: click.Context, error: Optional[Exception] = None, extra_info_name: Optional[str] = None, is_help: bool = False
    ):
        """Send telemetry with the analytics client.
        The event name is the command name.
        The context has the octavia version.
        The properties hold success or failure of the command run, the error type if one exists, and other metadata.

        Args:
            ctx (click.Context): Context from which the telemetry is built.
            error (Optional[Exception], optional): The error that was raised. Defaults to None.
            extra_info_name (Optional[str], optional): Extra info name if the context was not built yet. Defaults to None.
        """
        user_id = ctx.obj.get("WORKSPACE_ID") if ctx.obj.get("ANONYMOUS_DATA_COLLECTION", True) is False else None
        anonymous_id = None if user_id else "anonymous"
        segment_context = {"app": {"name": "octavia-cli", "version": ctx.obj.get("OCTAVIA_VERSION")}}
        segment_properties = {
            "success": error is None,
            "is_help": is_help,
            "error_type": error.__class__.__name__ if error is not None else None,
            "project_is_initialized": ctx.obj.get("PROJECT_IS_INITIALIZED"),
            "airbyter": os.getenv("AIRBYTE_ROLE") == "airbyter",
        }
        command_name = self._create_command_name(ctx, extra_info_name=extra_info_name)
        self.segment_client.track(
            user_id=user_id, anonymous_id=anonymous_id, event=command_name, properties=segment_properties, context=segment_context
        )
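`_create_command_name` walks `ctx.parent` recursively, prepending each parent's info name. A minimal sketch with hand-built click contexts (assuming `click.Context` accepts `info_name` and `parent` keywords, and that `analytics.Client` with `send=False` sends nothing, as the constructor above relies on):

```python
import click
from octavia_cli.telemetry import TelemetryClient

root = click.Context(click.Group(name="octavia"), info_name="octavia")
child = click.Context(click.Command(name="apply"), info_name="apply", parent=root)
client = TelemetryClient(send_data=False)  # no events leave the process
assert client._create_command_name(child) == "octavia apply"
```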
@@ -1,14 +0,0 @@
#!/usr/bin/env bash

set -uxe
VERSION=$1
GIT_REVISION=$2
SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )

docker run --privileged --rm tonistiigi/binfmt --install all # This installs the emulator to build multi-arch images
set +e # Disable exit on error: the next command fails if the builder already exists.
docker buildx create --name octavia_builder > /dev/null 2>&1
set -e # Re-enable exit on error: the previous command can fail safely when the builder already exists.
docker buildx use octavia_builder
docker buildx inspect --bootstrap
docker buildx build --push --tag airbyte/octavia-cli:${VERSION} --platform=linux/arm64,linux/amd64 --label "io.airbyte.git-revision=${GIT_REVISION}" ${SCRIPT_DIR}
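The script expects a version and a git revision, e.g. `./publish.sh 0.50.0 <git-sha>`, and pushes a multi-arch (arm64/amd64) image tagged `airbyte/octavia-cli:<version>` with the revision recorded as an OCI label.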
@@ -1,7 +0,0 @@
[pytest]
log_cli = 1
log_cli_level = INFO
log_cli_format = %(asctime)s [%(levelname)8s] %(message)s (%(filename)s:%(lineno)s)
log_cli_date_format = %Y-%m-%d %H:%M:%S
markers =
    integration: marks tests as integration test (deselect with '-m "not integration"')
@@ -1,64 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import os
import pathlib

from setuptools import find_packages, setup

# The directory containing this file
HERE = pathlib.Path(__file__).parent

# The text of the README file
README = (HERE / "README.md").read_text()

setup(
    name="octavia-cli",
    version="0.50.0",
    description="A command line interface to manage Airbyte configurations",
    long_description=README,
    author="Airbyte",
    author_email="contact@airbyte.io",
    license="MIT",
    url="https://github.com/airbytehq/airbyte",
    classifiers=[
        # This information is used when browsing on PyPI.
        # Dev Status
        "Development Status :: 3 - Alpha",
        # Project Audience
        "Intended Audience :: Developers",
        "Topic :: Scientific/Engineering",
        "Topic :: Software Development :: Libraries :: Python Modules",
        "License :: OSI Approved :: MIT License",
        # Python Version Support
        "Programming Language :: Python :: 3.8",
    ],
    keywords="airbyte cli command-line-interface configuration",
    project_urls={
        "Documentation": "https://docs.airbyte.io/",
        "Source": "https://github.com/airbytehq/airbyte",
        "Tracker": "https://github.com/airbytehq/airbyte/issues",
    },
    packages=find_packages(exclude=("unit_tests", "integration_tests", "docs")),
    package_data={"octavia_cli.generate": ["templates/*.j2"], "octavia_cli.init.example_files": ["example_api_http_headers.yaml"]},
    install_requires=[
        "click~=8.0.3",
        f"airbyte_api_client @ file://{os.getcwd()}/build/airbyte_api_client",
        "jinja2~=3.0.3",
        "deepdiff~=5.7.0",
        "pyyaml~=6.0",
        "analytics-python~=1.4.0",
        "python-slugify~=6.1.2",
        "urllib3<2",
    ],
    python_requires=">=3.9.11",
    extras_require={
        "tests": ["MyPy~=0.812", "pytest~=6.2.5", "pytest-cov", "pytest-mock", "pytest-recording", "requests-mock", "pre-commit"],
        "sphinx-docs": [
            "Sphinx~=4.2",
            "sphinx-rtd-theme~=1.0",
        ],
    },
    entry_points={"console_scripts": ["octavia=octavia_cli.entrypoint:octavia"]},
)
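Note the `file://` dependency above: installing octavia-cli (for example with `pip install -e '.[tests]'`) only worked from a checkout where the generated `airbyte_api_client` package already existed under `build/airbyte_api_client`, because `install_requires` resolves that path against the current working directory.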
@@ -1,3 +0,0 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#
@@ -1,15 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import pytest


@pytest.fixture
def mock_api_client(mocker):
    return mocker.Mock()


@pytest.fixture
def mock_telemetry_client(mocker):
    return mocker.Mock()
@@ -1,3 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#
@@ -1,233 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import pytest
from click.testing import CliRunner
from octavia_cli._import import commands


@pytest.fixture
def patch_click(mocker):
    mocker.patch.object(commands, "click")


@pytest.fixture
def context_object(mock_api_client, mock_telemetry_client):
    return {
        "PROJECT_IS_INITIALIZED": True,
        "API_CLIENT": mock_api_client,
        "WORKSPACE_ID": "workspace_id",
        "TELEMETRY_CLIENT": mock_telemetry_client,
    }


def test_build_help_message():
    assert commands.build_help_message("source") == "Import an existing source to manage it with octavia-cli."


@pytest.mark.parametrize("ResourceClass", [commands.UnmanagedSource, commands.UnmanagedDestination])
def test_import_source_or_destination(mocker, context_object, ResourceClass):
    resource_type = ResourceClass.__name__.lower()
    mocker.patch.object(commands.click, "style")
    mocker.patch.object(commands.click, "echo")
    mocker.patch.object(commands, "get_json_representation")
    mocker.patch.object(
        commands.json,
        "loads",
        mocker.Mock(
            return_value={
                "name": "foo",
                "connection_configuration": "bar",
                f"{resource_type}_definition_id": f"{resource_type}_definition_id",
                f"{resource_type}_id": f"my_{resource_type}_id",
            }
        ),
    )
    mocker.patch.object(commands.definitions, "factory")
    mocker.patch.object(commands.renderers, "ConnectorSpecificationRenderer")
    expected_managed_resource, expected_state = (mocker.Mock(), mocker.Mock())
    mocker.patch.object(
        commands.resources,
        "factory",
        mocker.Mock(return_value=mocker.Mock(manage=mocker.Mock(return_value=(expected_managed_resource, expected_state)))),
    )
    commands.import_source_or_destination(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], ResourceClass, "resource_to_get")
    commands.get_json_representation.assert_called_with(
        context_object["API_CLIENT"], context_object["WORKSPACE_ID"], ResourceClass, "resource_to_get"
    )
    commands.json.loads.assert_called_with(commands.get_json_representation.return_value)
    remote_configuration = commands.json.loads.return_value
    commands.definitions.factory.assert_called_with(
        resource_type, context_object["API_CLIENT"], context_object["WORKSPACE_ID"], f"{resource_type}_definition_id"
    )
    commands.renderers.ConnectorSpecificationRenderer.assert_called_with("foo", commands.definitions.factory.return_value)
    renderer = commands.renderers.ConnectorSpecificationRenderer.return_value
    renderer.import_configuration.assert_called_with(project_path=".", configuration=remote_configuration["connection_configuration"])
    commands.resources.factory.assert_called_with(
        context_object["API_CLIENT"], context_object["WORKSPACE_ID"], renderer.import_configuration.return_value
    )
    commands.resources.factory.return_value.manage.assert_called_with(remote_configuration[f"{resource_type}_id"])
    commands.click.style.assert_has_calls(
        [
            mocker.call(
                f"✅ - Imported {resource_type} {expected_managed_resource.name} in {renderer.import_configuration.return_value}. State stored in {expected_state.path}",
                fg="green",
            ),
            mocker.call(f"⚠️ - Please update any secrets stored in {renderer.import_configuration.return_value}", fg="yellow"),
        ]
    )
    assert commands.click.echo.call_count == 2


@pytest.mark.parametrize(
    "source_exists, source_was_created, destination_exists, destination_was_created",
    [
        (True, True, True, True),
        (False, False, False, False),
        (True, False, True, False),
        (True, True, False, False),
        (True, True, True, False),
    ],
)
def test_import_connection(mocker, context_object, source_exists, source_was_created, destination_exists, destination_was_created):
    mocker.patch.object(commands.click, "style")
    mocker.patch.object(commands.click, "echo")
    mocker.patch.object(commands, "get_json_representation")
    mocker.patch.object(
        commands.json,
        "loads",
        mocker.Mock(
            return_value={
                "source": {"name": "my_source"},
                "destination": {"name": "my_destination"},
                "name": "my_connection",
                "connection_id": "my_connection_id",
            }
        ),
    )
    remote_configuration = commands.json.loads.return_value
    mocker.patch.object(commands.definitions, "factory")
    mock_source_configuration_path = mocker.Mock(is_file=mocker.Mock(return_value=source_exists))
    mock_destination_configuration_path = mocker.Mock(is_file=mocker.Mock(return_value=destination_exists))

    mocker.patch.object(
        commands.renderers.ConnectorSpecificationRenderer,
        "get_output_path",
        mocker.Mock(side_effect=[mock_source_configuration_path, mock_destination_configuration_path]),
    )
    mocker.patch.object(commands.renderers, "ConnectionRenderer")
    mock_managed_source = mocker.Mock(was_created=source_was_created)
    mock_managed_destination = mocker.Mock(was_created=destination_was_created)
    mock_remote_connection, mock_connection_state = mocker.Mock(), mocker.Mock()
    mock_managed_connection = mocker.Mock(manage=mocker.Mock(return_value=(mock_remote_connection, mock_connection_state)))

    mocker.patch.object(
        commands.resources, "factory", mocker.Mock(side_effect=[mock_managed_source, mock_managed_destination, mock_managed_connection])
    )
    if all([source_exists, destination_exists, source_was_created, destination_was_created]):
        commands.import_connection(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], "resource_to_get")
        commands.get_json_representation.assert_called_with(
            context_object["API_CLIENT"], context_object["WORKSPACE_ID"], commands.UnmanagedConnection, "resource_to_get"
        )
        commands.renderers.ConnectorSpecificationRenderer.get_output_path.assert_has_calls(
            [
                mocker.call(project_path=".", definition_type="source", resource_name="my_source"),
                mocker.call(project_path=".", definition_type="destination", resource_name="my_destination"),
            ]
        )
        commands.resources.factory.assert_has_calls(
            [
                mocker.call(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], mock_source_configuration_path),
                mocker.call(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], mock_destination_configuration_path),
                mocker.call(
                    context_object["API_CLIENT"],
                    context_object["WORKSPACE_ID"],
                    commands.renderers.ConnectionRenderer.return_value.import_configuration.return_value,
                ),
            ]
        )
        commands.renderers.ConnectionRenderer.assert_called_with(
            remote_configuration["name"], mock_managed_source, mock_managed_destination
        )
        commands.renderers.ConnectionRenderer.return_value.import_configuration.assert_called_with(".", remote_configuration)
        new_configuration_path = commands.renderers.ConnectionRenderer.return_value.import_configuration.return_value
        commands.click.style.assert_called_with(
            f"✅ - Imported connection {mock_remote_connection.name} in {new_configuration_path}. State stored in {mock_connection_state.path}",
            fg="green",
        )
        commands.click.echo.assert_called_with(commands.click.style.return_value)
    if not source_exists or not destination_exists:
        with pytest.raises(
            commands.MissingResourceDependencyError,
            match="is not managed by octavia-cli, please import and apply it before importing your connection.",
        ):
            commands.import_connection(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], "resource_to_get")
    if source_exists and destination_exists and (not source_was_created or not destination_was_created):
        with pytest.raises(commands.resources.NonExistingResourceError, match="Please run octavia apply before creating this connection."):
            commands.import_connection(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], "resource_to_get")


@pytest.mark.parametrize("command", [commands.source, commands.destination, commands.connection, commands.all])
def test_import_not_initialized(command):
    runner = CliRunner()
    result = runner.invoke(command, obj={"PROJECT_IS_INITIALIZED": False})
    assert result.exit_code == 1


@pytest.mark.parametrize(
    "command, ResourceClass, import_function",
    [
        (commands.source, commands.UnmanagedSource, "import_source_or_destination"),
        (commands.destination, commands.UnmanagedDestination, "import_source_or_destination"),
        (commands.connection, None, "import_connection"),
    ],
)
def test_import_commands(mocker, context_object, ResourceClass, command, import_function):
    runner = CliRunner()
    mock_import_function = mocker.Mock()
    mocker.patch.object(commands, import_function, mock_import_function)
    result = runner.invoke(command, ["resource_to_import"], obj=context_object)
    if import_function == "import_source_or_destination":
        mock_import_function.assert_called_with(
            context_object["API_CLIENT"], context_object["WORKSPACE_ID"], ResourceClass, "resource_to_import"
        )
    else:
        mock_import_function.assert_called_with(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], "resource_to_import")
    assert result.exit_code == 0


def test_import_all(mocker, context_object):
    runner = CliRunner()
    mock_manager = mocker.Mock()
    mocker.patch.object(commands, "import_source_or_destination", mock_manager.import_source_or_destination)
    mocker.patch.object(commands, "import_connection", mock_manager.import_connection)
    mocker.patch.object(
        commands, "UnmanagedSources", return_value=mocker.Mock(get_listing=mocker.Mock(return_value=[("_", "_", "source_resource_id")]))
    )
    mocker.patch.object(
        commands,
        "UnmanagedDestinations",
        return_value=mocker.Mock(get_listing=mocker.Mock(return_value=[("_", "_", "destination_resource_id")])),
    )
    mocker.patch.object(
        commands,
        "UnmanagedConnections",
        return_value=mocker.Mock(get_listing=mocker.Mock(return_value=[("_", "connection_resource_id", "_", "_", "_")])),
    )
    result = runner.invoke(commands.all, obj=context_object)

    commands.UnmanagedSources.return_value.get_listing.assert_called_once()
    commands.UnmanagedDestinations.return_value.get_listing.assert_called_once()
    commands.UnmanagedConnections.return_value.get_listing.assert_called_once()
    assert result.exit_code == 0
    assert mock_manager.mock_calls[0] == mocker.call.import_source_or_destination(
        context_object["API_CLIENT"], "workspace_id", commands.UnmanagedSource, "source_resource_id"
    )
    assert mock_manager.mock_calls[1] == mocker.call.import_source_or_destination(
        context_object["API_CLIENT"], "workspace_id", commands.UnmanagedDestination, "destination_resource_id"
    )
    assert mock_manager.mock_calls[2] == mocker.call.import_connection(
        context_object["API_CLIENT"], "workspace_id", "connection_resource_id"
    )
@@ -1,203 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import os

import pytest
from octavia_cli import api_http_headers


class TestApiHttpHeader:
    @pytest.mark.parametrize(
        "header_name, header_value, expected_error, expected_name, expected_value",
        [
            ("foo", "bar", None, "foo", "bar"),
            (" foo ", " bar ", None, "foo", "bar"),
            ("", "bar", AttributeError, None, None),
            ("foo", "", AttributeError, None, None),
        ],
    )
    def test_init(self, header_name, header_value, expected_error, expected_name, expected_value):
        if expected_error is None:
            api_http_header = api_http_headers.ApiHttpHeader(header_name, header_value)
            assert api_http_header.name == expected_name and api_http_header.value == expected_value
        else:
            with pytest.raises(expected_error):
                api_http_headers.ApiHttpHeader(header_name, header_value)


@pytest.fixture
def api_http_header_env_var():
    os.environ["API_HTTP_HEADER_IN_ENV_VAR"] = "bar"
    yield "bar"
    del os.environ["API_HTTP_HEADER_IN_ENV_VAR"]


@pytest.mark.parametrize(
    "yaml_document, expected_api_http_headers, expected_error",
    [
        (
            """
            headers:
              Content-Type: ${API_HTTP_HEADER_IN_ENV_VAR}
            """,
            [api_http_headers.ApiHttpHeader("Content-Type", "bar")],
            None,
        ),
        (
            """
            headers:
              Content-Type: application/json
            """,
            [api_http_headers.ApiHttpHeader("Content-Type", "application/json")],
            None,
        ),
        (
            """
            headers:
              Content-Type: application/csv
              Content-Type: application/json
            """,
            [api_http_headers.ApiHttpHeader("Content-Type", "application/json")],
            None,
        ),
        (
            """
            headers:
              Content-Type: application/json
              Authorization: Bearer XXX
            """,
            [
                api_http_headers.ApiHttpHeader("Content-Type", "application/json"),
                api_http_headers.ApiHttpHeader("Authorization", "Bearer XXX"),
            ],
            None,
        ),
        ("no_headers: foo", None, api_http_headers.InvalidApiHttpHeadersFileError),
        ("", None, api_http_headers.InvalidApiHttpHeadersFileError),
        (
            """
            some random words
            - some dashes:
              - and_next
            """.strip(),
            None,
            api_http_headers.InvalidApiHttpHeadersFileError,
        ),
    ],
)
def test_deserialize_file_based_headers(api_http_header_env_var, tmp_path, yaml_document, expected_api_http_headers, expected_error):
    yaml_file_path = tmp_path / "api_http_headers.yaml"
    yaml_file_path.write_text(yaml_document)
    if expected_error is None:
        file_based_headers = api_http_headers.deserialize_file_based_headers(yaml_file_path)
        assert file_based_headers == expected_api_http_headers
    else:
        with pytest.raises(expected_error):
            api_http_headers.deserialize_file_based_headers(yaml_file_path)


@pytest.mark.parametrize(
    "option_based_headers, expected_option_based_headers",
    [
        ([("Content-Type", "application/json")], [api_http_headers.ApiHttpHeader("Content-Type", "application/json")]),
        (
            [("Content-Type", "application/yaml"), ("Content-Type", "application/json")],
            [api_http_headers.ApiHttpHeader("Content-Type", "application/json")],
        ),
        (
            [("Content-Type", "application/json"), ("Authorization", "Bearer XXX")],
            [
                api_http_headers.ApiHttpHeader("Content-Type", "application/json"),
                api_http_headers.ApiHttpHeader("Authorization", "Bearer XXX"),
            ],
        ),
        ([], []),
    ],
)
def test_deserialize_option_based_headers(option_based_headers, expected_option_based_headers):
    assert api_http_headers.deserialize_option_based_headers(option_based_headers) == expected_option_based_headers


@pytest.mark.parametrize(
    "yaml_document, option_based_raw_headers, expected_merged_headers",
    [
        (
            """
            headers:
              Content-Type: application/csv
            """,
            [("Content-Type", "application/json")],
            [api_http_headers.ApiHttpHeader("Content-Type", "application/json")],
        ),
        (
            None,
            [("Content-Type", "application/json")],
            [api_http_headers.ApiHttpHeader("Content-Type", "application/json")],
        ),
        (
            """
            headers:
              Content-Type: application/json
            """,
            [],
            [api_http_headers.ApiHttpHeader("Content-Type", "application/json")],
        ),
        (
            """
            headers:
              Content-Type: application/json
            """,
            None,
            [api_http_headers.ApiHttpHeader("Content-Type", "application/json")],
        ),
        (
            """
            headers:
              Content-Type: application/json
            """,
            [("Authorization", "Bearer XXX")],
            [
                api_http_headers.ApiHttpHeader("Content-Type", "application/json"),
                api_http_headers.ApiHttpHeader("Authorization", "Bearer XXX"),
            ],
        ),
        (
            """
            headers:
              Content-Type: application/json
              Foo: Bar
            """,
            [("Authorization", "Bearer XXX")],
            [
                api_http_headers.ApiHttpHeader("Content-Type", "application/json"),
                api_http_headers.ApiHttpHeader("Foo", "Bar"),
                api_http_headers.ApiHttpHeader("Authorization", "Bearer XXX"),
            ],
        ),
    ],
)
def test_merge_api_headers(tmp_path, mocker, yaml_document, option_based_raw_headers, expected_merged_headers):
    mocker.patch.object(api_http_headers.click, "echo")
    if yaml_document is not None:
        yaml_file_path = tmp_path / "api_http_headers.yaml"
        yaml_file_path.write_text(yaml_document)
    else:
        yaml_file_path = None
    assert api_http_headers.merge_api_headers(option_based_raw_headers, yaml_file_path) == expected_merged_headers
    if option_based_raw_headers and yaml_file_path:
        api_http_headers.click.echo.assert_called_with(
            "ℹ️ - You passed API HTTP headers in a file and in options at the same time. Option based headers will override file based headers."
        )


def test_set_api_headers_on_api_client(mocker, mock_api_client):
    headers = [api_http_headers.ApiHttpHeader("foo", "bar"), api_http_headers.ApiHttpHeader("bar", "foo")]
    api_http_headers.set_api_headers_on_api_client(mock_api_client, headers)
    mock_api_client.set_default_header.assert_has_calls(
        [
            mocker.call(headers[0].name, headers[0].value),
            mocker.call(headers[1].name, headers[1].value),
        ]
    )
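The precedence these tests pin down, condensed into a runnable sketch (function and class names are taken from the tests above; option-based headers win over file-based ones):

```python
from octavia_cli.api_http_headers import ApiHttpHeader, merge_api_headers

# No file passed at all: only the option-based header survives the merge.
merged = merge_api_headers([("Content-Type", "application/json")], None)
assert merged == [ApiHttpHeader("Content-Type", "application/json")]
```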
@@ -1,3 +0,0 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#
@@ -1,300 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import pytest
from click.testing import CliRunner
from octavia_cli.apply import commands


@pytest.fixture
def patch_click(mocker):
    mocker.patch.object(commands, "click")


@pytest.fixture
def context_object(mock_api_client, mock_telemetry_client):
    return {
        "PROJECT_IS_INITIALIZED": True,
        "API_CLIENT": mock_api_client,
        "WORKSPACE_ID": "workspace_id",
        "TELEMETRY_CLIENT": mock_telemetry_client,
    }


def test_apply_not_initialized():
    runner = CliRunner()
    result = runner.invoke(commands.apply, obj={"PROJECT_IS_INITIALIZED": False})
    assert result.exit_code == 1


def test_apply_without_custom_configuration_file(mocker, context_object):
    runner = CliRunner()
    local_files = ["foo", "bar"]
    mocker.patch.object(commands, "find_local_configuration_files", mocker.Mock(return_value=local_files))
    mock_resources_to_apply = [mocker.Mock(), mocker.Mock()]
    mocker.patch.object(commands, "get_resources_to_apply", mocker.Mock(return_value=mock_resources_to_apply))
    mocker.patch.object(commands, "apply_single_resource")
    result = runner.invoke(commands.apply, obj=context_object)
    assert result.exit_code == 0
    commands.find_local_configuration_files.assert_called_once()
    commands.get_resources_to_apply.assert_called_once_with(local_files, context_object["API_CLIENT"], context_object["WORKSPACE_ID"])
    commands.apply_single_resource.assert_has_calls([mocker.call(r, False) for r in commands.get_resources_to_apply.return_value])


def test_apply_with_custom_configuration_file(mocker, context_object):
    runner = CliRunner()
    mocker.patch.object(commands, "find_local_configuration_files")
    mocker.patch.object(commands, "get_resources_to_apply")
    mocker.patch.object(commands, "apply_single_resource")
    result = runner.invoke(commands.apply, ["--file", "foo", "--file", "bar"], obj=context_object)
    assert result.exit_code == 0
    commands.find_local_configuration_files.assert_not_called()
    commands.get_resources_to_apply.assert_called_with(("foo", "bar"), context_object["API_CLIENT"], context_object["WORKSPACE_ID"])


def test_apply_with_custom_configuration_file_force(mocker, context_object):
    runner = CliRunner()
    mocker.patch.object(commands, "find_local_configuration_files")
    mocker.patch.object(commands, "get_resources_to_apply", mocker.Mock(return_value=[mocker.Mock()]))
    mocker.patch.object(commands, "apply_single_resource")
    result = runner.invoke(commands.apply, ["--file", "foo", "--file", "bar", "--force"], obj=context_object)
    assert result.exit_code == 0
    commands.apply_single_resource.assert_called_with(commands.get_resources_to_apply.return_value[0], True)


def test_get_resource_to_apply(mocker, mock_api_client):
    local_files_priorities = [("foo", 2), ("bar", 1)]
    mock_resource_factory = mocker.Mock()
    mock_resource_factory.side_effect = [mocker.Mock(APPLY_PRIORITY=priority) for _, priority in local_files_priorities]
    mocker.patch.object(commands, "resource_factory", mock_resource_factory)

    resources_to_apply = commands.get_resources_to_apply([f[0] for f in local_files_priorities], mock_api_client, "workspace_id")
    assert resources_to_apply == sorted(resources_to_apply, key=lambda r: r.APPLY_PRIORITY)
    assert commands.resource_factory.call_count == len(local_files_priorities)
    commands.resource_factory.assert_has_calls([mocker.call(mock_api_client, "workspace_id", path) for path, _ in local_files_priorities])


@pytest.mark.parametrize("resource_was_created", [True, False])
def test_apply_single_resource(patch_click, mocker, resource_was_created):
    mocker.patch.object(commands, "update_resource", mocker.Mock(return_value=["updated"]))
    mocker.patch.object(commands, "create_resource", mocker.Mock(return_value=["created"]))
    resource = mocker.Mock(was_created=resource_was_created, resource_name="my_resource_name")
    force = mocker.Mock()
    commands.apply_single_resource(resource, force)
    if resource_was_created:
        commands.update_resource.assert_called_once_with(resource, force)
        commands.create_resource.assert_not_called()
        expected_message = (
            "🐙 - my_resource_name exists on your Airbyte instance according to your state file, let's check if we need to update it!"
        )
        expected_message_color = "yellow"
        expected_echo_calls = [mocker.call(commands.click.style.return_value), mocker.call("\n".join(["updated"]))]
    else:
        commands.update_resource.assert_not_called()
        commands.create_resource.assert_called_once_with(resource)
        expected_message = "🐙 - my_resource_name does not exists on your Airbyte instance, let's create it!"
        expected_message_color = "green"
        expected_echo_calls = [mocker.call(commands.click.style.return_value), mocker.call("\n".join(["created"]))]
    commands.click.style.assert_called_with(expected_message, fg=expected_message_color)
    commands.click.echo.assert_has_calls(expected_echo_calls)


@pytest.mark.parametrize(
    "force,user_validation,local_file_changed,expect_update,expected_reason",
    [
        pytest.param(
            True, True, True, True, "🚨 - Running update because the force mode is activated.", id="1 - Check if force has the top priority."
        ),
        pytest.param(
            True,
            False,
            True,
            True,
            "🚨 - Running update because the force mode is activated.",
            id="2 - Check if force has the top priority.",
        ),
        pytest.param(
            True,
            False,
            False,
            True,
            "🚨 - Running update because the force mode is activated.",
            id="3 - Check if force has the top priority.",
        ),
        pytest.param(
            True,
            True,
            False,
            True,
            "🚨 - Running update because the force mode is activated.",
            id="4 - Check if force has the top priority.",
        ),
        pytest.param(
            False,
            True,
            True,
            True,
            "🟢 - Running update because you validated the changes.",
            id="Check if user validation has priority over local file change.",
        ),
        pytest.param(
            False,
            False,
            True,
            False,
            "🔴 - Did not update because you refused the changes.",
            id="Check if user validation has priority over local file change.",
        ),
        pytest.param(
            False,
            None,
            True,
            True,
            "🟡 - Running update because a local file change was detected and a secret field might have been edited.",
            id="Check if local_file_changed runs even if user validation is None.",
        ),
        pytest.param(
            False,
            None,
            False,
            False,
            "😴 - Did not update because no change detected.",
            id="Check no update if no local change and user validation is None.",
        ),
    ],
)
def test_should_update_resource(patch_click, mocker, force, user_validation, local_file_changed, expect_update, expected_reason):
    should_update, update_reason = commands.should_update_resource(force, user_validation, local_file_changed)
    assert should_update == expect_update
    assert update_reason == commands.click.style.return_value
    commands.click.style.assert_called_with(expected_reason, fg="green")


@pytest.mark.parametrize(
    "diff,expected_number_calls_to_display_diff_line",
    [("", 0), ("First diff line", 1), ("First diff line\nSecond diff line", 2), ("First diff line\nSecond diff line\nThird diff line", 3)],
)
def test_prompt_for_diff_validation(patch_click, mocker, diff, expected_number_calls_to_display_diff_line):
    mocker.patch.object(commands, "display_diff_line")
    output = commands.prompt_for_diff_validation("my_resource", diff)
    assert commands.display_diff_line.call_count == expected_number_calls_to_display_diff_line
    if diff and expected_number_calls_to_display_diff_line > 0:
        commands.display_diff_line.assert_has_calls([mocker.call(line) for line in diff.split("\n")])
        commands.click.style.assert_has_calls(
            [
                mocker.call(
                    "👀 - Here's the computed diff (🚨 remind that diff on secret fields are not displayed):", fg="magenta", bold=True
                ),
                mocker.call("❓ - Do you want to update my_resource?", bold=True),
            ]
        )
        commands.click.echo.assert_called_with(commands.click.style.return_value)
        assert output == commands.click.confirm.return_value
    else:
        assert output is False


def test_create_resource(patch_click, mocker):
    mock_created_resource = mocker.Mock()
    mock_state = mocker.Mock()
    mock_resource = mocker.Mock(create=mocker.Mock(return_value=(mock_created_resource, mock_state)))
    output_messages = commands.create_resource(mock_resource)
    mock_resource.create.assert_called_once()
    assert output_messages == [commands.click.style.return_value, commands.click.style.return_value]
    commands.click.style.assert_has_calls(
        [
            mocker.call(f"🎉 - Successfully created {mock_created_resource.name} on your Airbyte instance!", fg="green", bold=True),
            mocker.call(f"💾 - New state for {mock_created_resource.name} saved at {mock_state.path}", fg="yellow"),
        ]
    )


@pytest.mark.parametrize(
    "force,diff,local_file_changed,expect_prompt,user_validation,expect_update",
    [
        pytest.param(True, True, True, False, False, True, id="Force, diff, local file change -> no prompt, no validation, expect update."),
        pytest.param(
            True, True, False, False, False, True, id="Force, diff, no local file change -> no prompt, no validation, expect update."
        ),
        pytest.param(
            True, False, False, False, False, True, id="Force, no diff, no local file change -> no prompt, no validation, expect update."
        ),
        pytest.param(
            True, False, True, False, False, True, id="Force, no diff, local file change -> no prompt, no validation, expect update."
        ),
        pytest.param(
            False, True, True, True, True, True, id="No force, diff, local file change -> expect prompt, validation, expect update."
        ),
        pytest.param(
            False, True, True, True, False, False, id="No force, diff, local file change -> expect prompt, no validation, no update."
        ),
        pytest.param(
            False, True, False, True, True, True, id="No force, diff, no local file change -> expect prompt, validation, expect update."
        ),
        pytest.param(
            False, True, False, True, False, False, id="No force, diff, no local file change -> expect prompt, no validation, no update."
        ),
        pytest.param(
            False, False, True, False, False, True, id="No force, no diff, local file change -> no prompt, no validation, expect update."
        ),
        pytest.param(
            False, False, False, False, False, False, id="No force, no diff, no local file change -> no prompt, no validation, no update."
        ),
    ],
)
def test_update_resource(patch_click, mocker, force, diff, local_file_changed, expect_prompt, user_validation, expect_update):
    mock_updated_resource = mocker.Mock()
    mock_state = mocker.Mock()
    mock_resource = mocker.Mock(
        get_diff_with_remote_resource=mocker.Mock(return_value=diff),
        resource_name="my_resource",
        local_file_changed=local_file_changed,
        update=mocker.Mock(return_value=(mock_updated_resource, mock_state)),
    )
    mocker.patch.object(commands, "prompt_for_diff_validation", mocker.Mock(return_value=user_validation))

    output_messages = commands.update_resource(mock_resource, force)
    commands.click.echo.assert_called_once()

    if expect_prompt:
        commands.prompt_for_diff_validation.assert_called_once_with("my_resource", diff)
    else:
        commands.prompt_for_diff_validation.assert_not_called()
    if expect_update:
        mock_resource.update.assert_called_once()
    else:
        mock_resource.update.assert_not_called()

    if expect_update:
        assert output_messages == [
            commands.click.style.return_value,
            commands.click.style.return_value,
        ]
        commands.click.style.assert_has_calls(
            [
                mocker.call(f"🎉 - Successfully updated {mock_updated_resource.name} on your Airbyte instance!", fg="green", bold=True),
                mocker.call(f"💾 - New state for {mock_updated_resource.name} stored at {mock_state.path}.", fg="yellow"),
            ]
        )
    else:
        assert output_messages == []


def test_find_local_configuration_files(mocker):
    project_directories = ["sources", "connections", "destinations"]
    mocker.patch.object(commands, "REQUIRED_PROJECT_DIRECTORIES", project_directories)
    mocker.patch.object(commands, "glob", mocker.Mock(return_value=["foo.yaml"]))
    configuration_files = commands.find_local_configuration_files()
    assert isinstance(configuration_files, list)
    commands.glob.assert_has_calls([mocker.call(f"./{directory}/**/configuration.yaml") for directory in project_directories])
    assert configuration_files == ["foo.yaml" for _ in range(len(project_directories))]


def test_find_local_configuration_files_no_file_found(patch_click, mocker):
    project_directories = ["sources", "connections", "destinations"]
    mocker.patch.object(commands, "REQUIRED_PROJECT_DIRECTORIES", project_directories)
    mocker.patch.object(commands, "glob", mocker.Mock(return_value=[]))
    configuration_files = commands.find_local_configuration_files()
    assert not configuration_files
    commands.click.style.assert_called_once_with("😒 - No YAML file found to run apply.", fg="red")
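Condensing the decision table exercised above: force wins over user validation, which wins over a detected local file change. A sketch (keyword names mirror the test's parametrize; the positional order is what the tests actually use):

```python
from octavia_cli.apply.commands import should_update_resource

should_update, reason = should_update_resource(force=False, user_validation=None, local_file_changed=True)
assert should_update  # a local change triggers an update even without explicit validation
```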
@@ -1,46 +0,0 @@
|
||||
#
|
||||
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
|
||||
#
|
||||
|
||||
import pytest
|
||||
from octavia_cli.apply import diff_helpers
|
||||
|
||||
|
||||
def test_hash_config():
|
||||
data_to_hash = {"example": "foo"}
|
||||
assert diff_helpers.hash_config(data_to_hash) == "8d621bd700ff9a864bc603f56b4ec73536110b37d814dd4629767e898da70bef"
|
||||
|
||||
|
||||
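# The expected digest above is a SHA-256 hex string. A sketch of an equivalent
# computation, assuming hash_config serializes the mapping deterministically
# before hashing (e.g. JSON with sorted keys; this is an assumption, not the
# verified implementation):
#   import hashlib, json
#   hashlib.sha256(json.dumps({"example": "foo"}, sort_keys=True).encode()).hexdigest()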
@pytest.mark.parametrize(
    "obj, expected_output",
    [
        (diff_helpers.SECRET_MASK, True),
        ("not secret", False),
        ({}, False),
    ],
)
def test_exclude_secrets_from_diff(obj, expected_output):
    assert diff_helpers.exclude_secrets_from_diff(obj, "foo") == expected_output


def test_compute_diff(mocker):
    mocker.patch.object(diff_helpers, "DeepDiff")
    diff = diff_helpers.compute_diff("foo", "bar")
    assert diff == diff_helpers.DeepDiff.return_value
    diff_helpers.DeepDiff.assert_called_with("foo", "bar", view="tree", exclude_obj_callback=diff_helpers.exclude_secrets_from_diff)
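# compute_diff delegates to deepdiff.DeepDiff with view="tree", so callers can
# walk changed/added/removed branches, and exclude_obj_callback keeps values
# matching SECRET_MASK out of the diff (see test_exclude_secrets_from_diff).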
@pytest.mark.parametrize(
    "diff_line,expected_message,expected_color",
    [
        ("resource changed from", "E - resource changed from", "yellow"),
        ("resource added", "+ - resource added", "green"),
        ("resource removed", "- - resource removed", "red"),
        ("whatever", " - whatever", None),
    ],
)
def test_display_diff_line(mocker, diff_line, expected_message, expected_color):
    mocker.patch.object(diff_helpers, "click")
    diff_helpers.display_diff_line(diff_line)
    diff_helpers.click.style.assert_called_with(f"\t{expected_message}", fg=expected_color)
    diff_helpers.click.echo.assert_called_with(diff_helpers.click.style.return_value)
@@ -1,956 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

from copy import deepcopy
from unittest.mock import mock_open, patch

import pytest
from airbyte_api_client import ApiException
from airbyte_api_client.model.airbyte_catalog import AirbyteCatalog
from airbyte_api_client.model.connection_schedule_data_basic_schedule import ConnectionScheduleDataBasicSchedule
from airbyte_api_client.model.connection_schedule_type import ConnectionScheduleType
from airbyte_api_client.model.connection_status import ConnectionStatus
from airbyte_api_client.model.destination_definition_id_request_body import DestinationDefinitionIdRequestBody
from airbyte_api_client.model.destination_definition_id_with_workspace_id import DestinationDefinitionIdWithWorkspaceId
from airbyte_api_client.model.namespace_definition_type import NamespaceDefinitionType
from airbyte_api_client.model.operation_create import OperationCreate
from airbyte_api_client.model.operator_type import OperatorType
from airbyte_api_client.model.resource_requirements import ResourceRequirements
from airbyte_api_client.model.source_definition_id_request_body import SourceDefinitionIdRequestBody
from airbyte_api_client.model.source_definition_id_with_workspace_id import SourceDefinitionIdWithWorkspaceId
from airbyte_api_client.model.web_backend_operation_create_or_update import WebBackendOperationCreateOrUpdate
from octavia_cli.apply import resources, yaml_loaders


class TestResourceState:
    def test_init(self, mocker):
        mocker.patch.object(resources, "os")
        state = resources.ResourceState("config_path", "workspace_id", "resource_id", 123, "config_hash")
        assert state.configuration_path == "config_path"
        assert state.workspace_id == "workspace_id"
        assert state.resource_id == "resource_id"
        assert state.generation_timestamp == 123
        assert state.configuration_hash == "config_hash"
        assert state.path == resources.os.path.join.return_value
        resources.os.path.dirname.assert_called_with("config_path")
        resources.os.path.join.assert_called_with(resources.os.path.dirname.return_value, "state_workspace_id.yaml")

    @pytest.fixture
    def state(self):
        return resources.ResourceState("config_path", "workspace_id", "resource_id", 123, "config_hash")

    def test_as_dict(self, state):
        assert state.as_dict() == {
            "configuration_path": state.configuration_path,
            "resource_id": state.resource_id,
            "generation_timestamp": state.generation_timestamp,
            "configuration_hash": state.configuration_hash,
            "workspace_id": state.workspace_id,
        }

    def test_save(self, mocker, state):
        mocker.patch.object(resources, "yaml")
        mocker.patch.object(state, "as_dict")

        expected_content = state.as_dict.return_value
        with patch("builtins.open", mock_open()) as mock_file:
            state._save()
        mock_file.assert_called_with(state.path, "w")
        resources.yaml.dump.assert_called_with(expected_content, mock_file.return_value)

    def test_create(self, mocker):
        mocker.patch.object(resources.time, "time", mocker.Mock(return_value=0))
        mocker.patch.object(resources.ResourceState, "_save")
        state = resources.ResourceState.create("config_path", "my_hash", "workspace_id", "resource_id")
        assert isinstance(state, resources.ResourceState)
        resources.ResourceState._save.assert_called_once()
        assert state.configuration_path == "config_path"
        assert state.resource_id == "resource_id"
        assert state.generation_timestamp == 0
        assert state.configuration_hash == "my_hash"

    def test_delete(self, mocker, state):
        mocker.patch.object(resources.os, "remove")
        state.delete()
        resources.os.remove.assert_called_with(state.path)

    def test_from_file(self, mocker):
        mocker.patch.object(resources, "yaml")
        resources.yaml.safe_load.return_value = {
            "configuration_path": "config_path",
            "resource_id": "resource_id",
            "generation_timestamp": 0,
            "configuration_hash": "my_hash",
            "workspace_id": "workspace_id",
        }
        with patch("builtins.open", mock_open(read_data="data")) as mock_file:
            state = resources.ResourceState.from_file("state_workspace_id.yaml")
        resources.yaml.safe_load.assert_called_with(mock_file.return_value)
        assert isinstance(state, resources.ResourceState)
        assert state.configuration_path == "config_path"
        assert state.resource_id == "resource_id"
        assert state.generation_timestamp == 0
        assert state.configuration_hash == "my_hash"

    def test__get_path_from_configuration_and_workspace_id(self, mocker):
        mocker.patch.object(resources.os.path, "dirname", mocker.Mock(return_value="my_dir"))
        state_path = resources.ResourceState._get_path_from_configuration_and_workspace_id("config_path", "workspace_id")
        assert state_path == "my_dir/state_workspace_id.yaml"
        resources.os.path.dirname.assert_called_with("config_path")

    def test_from_configuration_path_and_workspace(self, mocker):
        mocker.patch.object(resources.ResourceState, "_get_path_from_configuration_and_workspace_id")
        mocker.patch.object(resources.ResourceState, "from_file")
        state = resources.ResourceState.from_configuration_path_and_workspace("config_path", "workspace_id")
        assert state == resources.ResourceState.from_file.return_value
        resources.ResourceState.from_file.assert_called_with(
            resources.ResourceState._get_path_from_configuration_and_workspace_id.return_value
        )
        resources.ResourceState._get_path_from_configuration_and_workspace_id.assert_called_with("config_path", "workspace_id")

    def test_migrate(self, mocker):
        mocker.patch.object(resources.ResourceState, "from_file")
        mocker.patch.object(resources.ResourceState, "create")
        new_state = resources.ResourceState.migrate("old_state_path", "workspace_id")
        resources.ResourceState.from_file.assert_called_with("old_state_path")
        old_state = resources.ResourceState.from_file.return_value
        resources.ResourceState.create.assert_called_with(
            old_state.configuration_path, old_state.configuration_hash, "workspace_id", old_state.resource_id
        )
        old_state.delete.assert_called_once()
        assert new_state == resources.ResourceState.create.return_value
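# Taken together, the tests above pin the on-disk contract: a resource's state
# is stored next to its configuration as state_<workspace_id>.yaml, and
# ResourceState.migrate upgrades the legacy, workspace-agnostic state.yaml by
# recreating the state under the new name and deleting the old file.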
@pytest.fixture
def local_configuration():
    return {
        "exotic_attribute": "foo",
        "configuration": {"foo": "bar"},
        "resource_name": "bar",
        "definition_id": "bar",
        "definition_image": "fooo",
        "definition_version": "barrr",
    }


class TestBaseResource:
    @pytest.fixture
    def patch_base_class(self, mocker):
        # Mock abstract methods to enable instantiating abstract class
        mocker.patch.object(resources.BaseResource, "__abstractmethods__", set())
        mocker.patch.object(resources.BaseResource, "create_function_name", "create_resource")
        mocker.patch.object(resources.BaseResource, "resource_id_field", "resource_id")
        mocker.patch.object(resources.BaseResource, "update_function_name", "update_resource")
        mocker.patch.object(resources.BaseResource, "get_function_name", "get_resource")
        mocker.patch.object(resources.BaseResource, "resource_type", "universal_resource")
        mocker.patch.object(resources.BaseResource, "api")

    def test_init_no_remote_resource(self, mocker, patch_base_class, mock_api_client, local_configuration):
        mocker.patch.object(resources.BaseResource, "_get_state_from_file", mocker.Mock(return_value=None))
        mocker.patch.object(resources, "hash_config")
        resource = resources.BaseResource(mock_api_client, "workspace_id", local_configuration, "bar.yaml")
        assert resource.APPLY_PRIORITY == 0
        assert resource.workspace_id == "workspace_id"
        assert resource.raw_configuration == local_configuration
        assert resource.configuration_path == "bar.yaml"
        assert resource.api_instance == resource.api.return_value
        resource.api.assert_called_with(mock_api_client)
        assert resource.state == resource._get_state_from_file.return_value
        assert resource.remote_resource is None
        assert resource.was_created is False
        assert resource.local_file_changed is True
        assert resource.resource_id is None

    def test_init_with_remote_resource_not_changed(self, mocker, patch_base_class, mock_api_client, local_configuration):
        mocker.patch.object(
            resources.BaseResource, "_get_state_from_file", mocker.Mock(return_value=mocker.Mock(configuration_hash="my_hash"))
        )
        mocker.patch.object(resources.BaseResource, "_get_remote_resource", mocker.Mock(return_value={"resource_id": "my_resource_id"}))

        mocker.patch.object(resources, "hash_config", mocker.Mock(return_value="my_hash"))
        resource = resources.BaseResource(mock_api_client, "workspace_id", local_configuration, "bar.yaml")
        assert resource.was_created is True
        assert resource.local_file_changed is False
        assert resource.resource_id == resource.state.resource_id

    def test_init_with_remote_resource_changed(self, mocker, patch_base_class, mock_api_client, local_configuration):
        mocker.patch.object(
            resources.BaseResource,
            "_get_state_from_file",
            mocker.Mock(return_value=mocker.Mock(configuration_hash="my_state_hash")),
        )
        mocker.patch.object(resources.BaseResource, "_get_remote_resource", mocker.Mock(return_value={"resource_id": "my_resource_id"}))
        mocker.patch.object(resources, "hash_config", mocker.Mock(return_value="my_new_hash"))
        resource = resources.BaseResource(mock_api_client, "workspace_id", local_configuration, "bar.yaml")
        assert resource.was_created is True
        assert resource.local_file_changed is True
        assert resource.resource_id == resource.state.resource_id

    @pytest.fixture
    def resource(self, patch_base_class, mock_api_client, local_configuration):
        return resources.BaseResource(mock_api_client, "workspace_id", local_configuration, "bar.yaml")

    def test_get_remote_resource(self, resource, mocker):
        mocker.patch.object(resource, "_get_fn")
        remote_resource = resource._get_remote_resource()
        assert remote_resource == resource._get_fn.return_value
        resource._get_fn.assert_called_with(resource.api_instance, resource.get_payload)

    @pytest.mark.parametrize(
        "state_path_is_file, legacy_state_path_is_file, confirm_migration",
        [(True, False, False), (False, True, True), (False, True, False), (False, False, False)],
    )
    def test_get_state_from_file(self, mocker, resource, state_path_is_file, legacy_state_path_is_file, confirm_migration):
        mocker.patch.object(resources, "os")
        mocker.patch.object(resources.click, "confirm", mocker.Mock(return_value=confirm_migration))
        mock_expected_state_path = mocker.Mock(is_file=mocker.Mock(return_value=state_path_is_file))
        mock_expected_legacy_state_path = mocker.Mock(is_file=mocker.Mock(return_value=legacy_state_path_is_file))
        mocker.patch.object(resources, "Path", mocker.Mock(side_effect=[mock_expected_state_path, mock_expected_legacy_state_path]))
        mocker.patch.object(resources, "ResourceState")

        if legacy_state_path_is_file and not confirm_migration:
            with pytest.raises(resources.InvalidStateError):
                state = resource._get_state_from_file(resource.configuration_path, resource.workspace_id)
        else:
            state = resource._get_state_from_file(resource.configuration_path, resource.workspace_id)

            resources.os.path.dirname.assert_called_with(resource.configuration_path)
            resources.os.path.join.assert_has_calls(
                [
                    mocker.call(resources.os.path.dirname.return_value, f"state_{resource.workspace_id}.yaml"),
                    mocker.call(resources.os.path.dirname.return_value, "state.yaml"),
                ]
            )
            resources.Path.assert_called_with(resources.os.path.join.return_value)
            mock_expected_state_path.is_file.assert_called_once()
            if state_path_is_file:
                resources.ResourceState.from_file.assert_called_with(mock_expected_state_path)
                assert state == resources.ResourceState.from_file.return_value
                mock_expected_legacy_state_path.is_file.assert_not_called()
            elif legacy_state_path_is_file:
                if confirm_migration:
                    mock_expected_legacy_state_path.is_file.assert_called_once()
                    resources.ResourceState.migrate.assert_called_with(mock_expected_legacy_state_path, resource.workspace_id)
                    assert state == resources.ResourceState.migrate.return_value
            else:
                assert state is None

    @pytest.mark.parametrize(
        "was_created",
        [True, False],
    )
    def test_get_diff_with_remote_resource(self, patch_base_class, mocker, mock_api_client, local_configuration, was_created):
        mocker.patch.object(resources.BaseResource, "_get_remote_comparable_configuration")
        mocker.patch.object(resources.BaseResource, "was_created", was_created)
        resource = resources.BaseResource(mock_api_client, "workspace_id", local_configuration, "bar.yaml")
        mocker.patch.object(resources, "compute_diff")
        if was_created:
            diff = resource.get_diff_with_remote_resource()
            resources.compute_diff.assert_called_with(resource._get_remote_comparable_configuration.return_value, resource.configuration)
            assert diff == resources.compute_diff.return_value.pretty.return_value
        else:
            with pytest.raises(resources.NonExistingResourceError):
                resource.get_diff_with_remote_resource()

    def test_create_or_update(self, mocker, resource):
        expected_results = {resource.resource_id_field: "resource_id"}
        operation_fn = mocker.Mock(return_value=expected_results)
        mocker.patch.object(resources, "ResourceState")
        payload = "foo"
        result, state = resource._create_or_update(operation_fn, payload)
        assert result == expected_results
        assert state == resources.ResourceState.create.return_value
        resources.ResourceState.create.assert_called_with(
            resource.configuration_path, resource.configuration_hash, resource.workspace_id, "resource_id"
        )

    @pytest.mark.parametrize(
        "response_status,expected_error",
        [(404, ApiException), (422, resources.InvalidConfigurationError)],
    )
    def test_create_or_update_error(self, mocker, resource, response_status, expected_error):
        operation_fn = mocker.Mock(side_effect=ApiException(status=response_status))
        mocker.patch.object(resources, "ResourceState")
        with pytest.raises(expected_error):
            resource._create_or_update(operation_fn, "foo")

    def test_create(self, mocker, resource):
        mocker.patch.object(resource, "_create_or_update")
        assert resource.create() == resource._create_or_update.return_value
        resource._create_or_update.assert_called_with(resource._create_fn, resource.create_payload)

    def test_update(self, mocker, resource):
        mocker.patch.object(resource, "_create_or_update")
        assert resource.update() == resource._create_or_update.return_value
        resource._create_or_update.assert_called_with(resource._update_fn, resource.update_payload)

    def test_manage(self, mocker, resource):
        mocker.patch.object(resources, "ResourceState")
        remote_resource, new_state = resource.manage("resource_id")
        resources.ResourceState.create.assert_called_with(
            resource.configuration_path, resource.configuration_hash, resource.workspace_id, "resource_id"
        )
        assert new_state == resources.ResourceState.create.return_value
        assert remote_resource == resource.remote_resource

    @pytest.mark.parametrize(
        "configuration, invalid_keys, expect_error",
        [
            ({"valid_key": "foo", "invalidKey": "bar"}, {"invalidKey"}, True),
            ({"valid_key": "foo", "invalidKey": "bar", "secondInvalidKey": "bar"}, {"invalidKey", "secondInvalidKey"}, True),
            ({"valid_key": "foo", "validKey": "bar"}, {"invalidKey"}, False),
        ],
    )
    def test__check_for_invalid_configuration_keys(self, configuration, invalid_keys, expect_error):
        if not expect_error:
            result = resources.BaseResource._check_for_invalid_configuration_keys(configuration, invalid_keys, "Invalid configuration keys")
            assert result is None
        else:
            with pytest.raises(resources.InvalidConfigurationError, match="Invalid configuration keys") as error_info:
                resources.BaseResource._check_for_invalid_configuration_keys(configuration, invalid_keys, "Invalid configuration keys")
            assert all([invalid_key in str(error_info) for invalid_key in invalid_keys])
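# Note the error mapping asserted above: a 422 from the API surfaces as
# InvalidConfigurationError (bad user input), while other ApiExceptions such
# as a 404 propagate unchanged to the caller.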
class TestSourceAndDestination:
    @pytest.fixture
    def patch_source_and_destination(self, mocker):
        mocker.patch.object(resources.SourceAndDestination, "__abstractmethods__", set())
        mocker.patch.object(resources.SourceAndDestination, "api")
        mocker.patch.object(resources.SourceAndDestination, "create_function_name", "create")
        mocker.patch.object(resources.SourceAndDestination, "update_function_name", "update")
        mocker.patch.object(resources.SourceAndDestination, "get_function_name", "get")
        mocker.patch.object(resources.SourceAndDestination, "_get_state_from_file", mocker.Mock(return_value=None))
        mocker.patch.object(resources, "hash_config")

    def test_init(self, patch_source_and_destination, mocker, mock_api_client, local_configuration):
        assert resources.SourceAndDestination.__base__ == resources.BaseResource
        resource = resources.SourceAndDestination(mock_api_client, "workspace_id", local_configuration, "bar.yaml")
        assert resource.definition_id == local_configuration["definition_id"]
        assert resource.definition_image == local_configuration["definition_image"]
        assert resource.definition_version == local_configuration["definition_version"]

    def test_get_remote_comparable_configuration(self, patch_source_and_destination, mocker, mock_api_client, local_configuration):
        mocker.patch.object(resources.Source, "remote_resource")
        resource = resources.Source(mock_api_client, "workspace_id", local_configuration, "bar.yaml")
        assert resource._get_remote_comparable_configuration() == resource.remote_resource.connection_configuration


class TestSource:
    @pytest.mark.parametrize(
        "state",
        [None, resources.ResourceState("config_path", "workspace_id", "resource_id", 123, "abc")],
    )
    def test_init(self, mocker, mock_api_client, local_configuration, state):
        assert resources.Source.__base__ == resources.SourceAndDestination
        mocker.patch.object(resources.Source, "resource_id", "foo")
        source = resources.Source(mock_api_client, "workspace_id", local_configuration, "bar.yaml")
        mocker.patch.object(source, "state", state)
        assert source.api == resources.source_api.SourceApi
        assert source.create_function_name == "create_source"
        assert source.resource_id_field == "source_id"
        assert source.update_function_name == "update_source"
        assert source.resource_type == "source"
        assert source.APPLY_PRIORITY == 0
        assert source.create_payload == resources.SourceCreate(
            source.definition_id, source.configuration, source.workspace_id, source.resource_name
        )
        assert source.update_payload == resources.SourceUpdate(
            source_id=source.resource_id, connection_configuration=source.configuration, name=source.resource_name
        )
        if state is None:
            assert source.get_payload is None
        else:
            assert source.get_payload == resources.SourceIdRequestBody(state.resource_id)

    @pytest.mark.parametrize(
        "resource_id",
        [None, "foo"],
    )
    def test_source_discover_schema_request_body(self, mocker, mock_api_client, resource_id, local_configuration):
        mocker.patch.object(resources, "SourceDiscoverSchemaRequestBody")
        mocker.patch.object(resources.Source, "resource_id", resource_id)
        source = resources.Source(mock_api_client, "workspace_id", local_configuration, "bar.yaml")
        if resource_id is None:
            with pytest.raises(resources.NonExistingResourceError):
                source.source_discover_schema_request_body
            resources.SourceDiscoverSchemaRequestBody.assert_not_called()
        else:
            assert source.source_discover_schema_request_body == resources.SourceDiscoverSchemaRequestBody.return_value
            resources.SourceDiscoverSchemaRequestBody.assert_called_with(source.resource_id)

    def test_catalog(self, mocker, mock_api_client, local_configuration):
        mocker.patch.object(resources.Source, "source_discover_schema_request_body")
        source = resources.Source(mock_api_client, "workspace_id", local_configuration, "bar.yaml")
        source.api_instance = mocker.Mock()
        catalog = source.catalog
        assert catalog == source.api_instance.discover_schema_for_source.return_value.catalog
        source.api_instance.discover_schema_for_source.assert_called_with(source.source_discover_schema_request_body)

    def test_definition(self, mocker, mock_api_client, local_configuration):
        mocker.patch.object(resources.source_definition_api, "SourceDefinitionApi")
        mock_api_instance = resources.source_definition_api.SourceDefinitionApi.return_value
        source = resources.Source(mock_api_client, "workspace_id", local_configuration, "bar.yaml")
        assert source.definition == mock_api_instance.get_source_definition.return_value
        resources.source_definition_api.SourceDefinitionApi.assert_called_with(mock_api_client)
        expected_payload = SourceDefinitionIdRequestBody(source_definition_id=source.definition_id)
        mock_api_instance.get_source_definition.assert_called_with(expected_payload)

    def test_definition_specification(self, mocker, mock_api_client, local_configuration):
        mocker.patch.object(resources.source_definition_specification_api, "SourceDefinitionSpecificationApi")
        mock_api_instance = resources.source_definition_specification_api.SourceDefinitionSpecificationApi.return_value
        source = resources.Source(mock_api_client, "workspace_id", local_configuration, "bar.yaml")
        assert source.definition_specification == mock_api_instance.get_source_definition_specification.return_value
        resources.source_definition_specification_api.SourceDefinitionSpecificationApi.assert_called_with(mock_api_client)
        expected_payload = SourceDefinitionIdWithWorkspaceId(source_definition_id=source.definition_id, workspace_id=source.workspace_id)
        mock_api_instance.get_source_definition_specification.assert_called_with(expected_payload)
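# Source.catalog chains two lazy properties: source_discover_schema_request_body
# (which raises NonExistingResourceError until the source exists remotely) and
# the discover_schema_for_source API call, whose .catalog attribute is returned.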
class TestDestination:
    @pytest.mark.parametrize(
        "state",
        [None, resources.ResourceState("config_path", "workspace_id", "resource_id", 123, "abc")],
    )
    def test_init(self, mocker, mock_api_client, local_configuration, state):
        assert resources.Destination.__base__ == resources.SourceAndDestination
        mocker.patch.object(resources.Destination, "resource_id", "foo")
        destination = resources.Destination(mock_api_client, "workspace_id", local_configuration, "bar.yaml")
        mocker.patch.object(destination, "state", state)
        assert destination.api == resources.destination_api.DestinationApi
        assert destination.create_function_name == "create_destination"
        assert destination.resource_id_field == "destination_id"
        assert destination.update_function_name == "update_destination"
        assert destination.resource_type == "destination"
        assert destination.APPLY_PRIORITY == 0
        assert destination.create_payload == resources.DestinationCreate(
            destination.workspace_id, destination.resource_name, destination.definition_id, destination.configuration
        )
        assert destination.update_payload == resources.DestinationUpdate(
            destination_id=destination.resource_id, connection_configuration=destination.configuration, name=destination.resource_name
        )
        if state is None:
            assert destination.get_payload is None
        else:
            assert destination.get_payload == resources.DestinationIdRequestBody(state.resource_id)

    def test_definition(self, mocker, mock_api_client, local_configuration):
        mocker.patch.object(resources.destination_definition_api, "DestinationDefinitionApi")
        mock_api_instance = resources.destination_definition_api.DestinationDefinitionApi.return_value
        destination = resources.Destination(mock_api_client, "workspace_id", local_configuration, "bar.yaml")
        assert destination.definition == mock_api_instance.get_destination_definition.return_value
        resources.destination_definition_api.DestinationDefinitionApi.assert_called_with(mock_api_client)
        expected_payload = DestinationDefinitionIdRequestBody(destination_definition_id=destination.definition_id)
        mock_api_instance.get_destination_definition.assert_called_with(expected_payload)

    def test_definition_specification(self, mocker, mock_api_client, local_configuration):
        mocker.patch.object(resources.destination_definition_specification_api, "DestinationDefinitionSpecificationApi")
        mock_api_instance = resources.destination_definition_specification_api.DestinationDefinitionSpecificationApi.return_value
        destination = resources.Destination(mock_api_client, "workspace_id", local_configuration, "bar.yaml")
        assert destination.definition_specification == mock_api_instance.get_destination_definition_specification.return_value
        resources.destination_definition_specification_api.DestinationDefinitionSpecificationApi.assert_called_with(mock_api_client)
        expected_payload = DestinationDefinitionIdWithWorkspaceId(
            destination_definition_id=destination.definition_id, workspace_id=destination.workspace_id
        )
        mock_api_instance.get_destination_definition_specification.assert_called_with(expected_payload)


class TestConnection:
    @pytest.fixture
    def connection_configuration(self):
        return {
            "definition_type": "connection",
            "resource_name": "my_connection",
            "source_configuration_path": "my_source_configuration_path",
            "destination_configuration_path": "my_destination_configuration_path",
            "configuration": {
                "namespace_definition": "customformat",
                "namespace_format": "foo",
                "prefix": "foo",
                "sync_catalog": {
                    "streams": [
                        {
                            "stream": {
                                "name": "name_example",
                                "json_schema": {},
                                "supported_sync_modes": ["incremental"],
                                "source_defined_cursor": True,
                                "default_cursor_field": ["default_cursor_field"],
                                "source_defined_primary_key": [["string_example"]],
                                "namespace": "namespace_example",
                            },
                            "config": {
                                "sync_mode": "incremental",
                                "cursor_field": ["cursor_field_example"],
                                "destination_sync_mode": "append_dedup",
                                "primary_key": [["string_example"]],
                                "alias_name": "alias_name_example",
                                "selected": True,
                            },
                        }
                    ]
                },
                "schedule_type": "basic",
                "schedule_data": {"units": 1, "time_unit": "days"},
                "status": "active",
                "resource_requirements": {"cpu_request": "foo", "cpu_limit": "foo", "memory_request": "foo", "memory_limit": "foo"},
            },
        }

    @pytest.fixture
    def connection_configuration_with_manual_schedule(self, connection_configuration):
        connection_configuration_with_manual_schedule = deepcopy(connection_configuration)
        connection_configuration_with_manual_schedule["configuration"]["schedule_type"] = "manual"
        connection_configuration_with_manual_schedule["configuration"]["schedule_data"] = None
        return connection_configuration_with_manual_schedule

    @pytest.fixture
    def connection_configuration_with_normalization(self, connection_configuration):
        connection_configuration_with_normalization = deepcopy(connection_configuration)
        connection_configuration_with_normalization["configuration"]["operations"] = [
            {"name": "Normalization", "operator_configuration": {"normalization": {"option": "basic"}, "operator_type": "normalization"}}
        ]
        return connection_configuration_with_normalization

    @pytest.fixture
    def legacy_connection_configurations(self):
        return [
            {
                "definition_type": "connection",
                "resource_name": "my_connection",
                "source_id": "my_source",
                "destination_id": "my_destination",
                "configuration": {
                    "namespaceDefinition": "customformat",
                    "namespaceFormat": "foo",
                    "prefix": "foo",
                    "syncCatalog": {
                        "streams": [
                            {
                                "stream": {
                                    "name": "name_example",
                                    "json_schema": {},
                                    "supported_sync_modes": ["incremental"],
                                    "source_defined_cursor": True,
                                    "default_cursor_field": ["default_cursor_field"],
                                    "source_defined_primary_key": [["string_example"]],
                                    "namespace": "namespace_example",
                                },
                                "config": {
                                    "sync_mode": "incremental",
                                    "cursor_field": ["cursor_field_example"],
                                    "destination_sync_mode": "append_dedup",
                                    "primary_key": [["string_example"]],
                                    "alias_name": "alias_name_example",
                                    "selected": True,
                                },
                            }
                        ]
                    },
                    "schedule": {"units": 1, "time_unit": "days"},
                    "status": "active",
                    "resourceRequirements": {"cpu_request": "foo", "cpu_limit": "foo", "memory_request": "foo", "memory_limit": "foo"},
                },
            },
            {
                "definition_type": "connection",
                "resource_name": "my_connection",
                "source_id": "my_source",
                "destination_id": "my_destination",
                "configuration": {
                    "namespace_definition": "customformat",
                    "namespace_format": "foo",
                    "prefix": "foo",
                    "sync_catalog": {
                        "streams": [
                            {
                                "stream": {
                                    "name": "name_example",
                                    "jsonSchema": {},
                                    "supportedSyncModes": ["incremental"],
                                    "sourceDefinedCursor": True,
                                    "defaultCursorField": ["default_cursor_field"],
                                    "sourceDefinedPrimary_key": [["string_example"]],
                                    "namespace": "namespace_example",
                                },
                                "config": {
                                    "syncMode": "incremental",
                                    "cursorField": ["cursor_field_example"],
                                    "destinationSyncMode": "append_dedup",
                                    "primaryKey": [["string_example"]],
                                    "aliasName": "alias_name_example",
                                    "selected": True,
                                },
                            }
                        ]
                    },
                    "schedule": {"units": 1, "time_unit": "days"},
                    "status": "active",
                    "resource_requirements": {"cpu_request": "foo", "cpu_limit": "foo", "memory_request": "foo", "memory_limit": "foo"},
                },
            },
            {
                "definition_type": "connection",
                "resource_name": "my_connection",
                "source_id": "my_source",
                "destination_id": "my_destination",
                "configuration": {
                    "namespace_definition": "customformat",
                    "namespace_format": "foo",
                    "prefix": "foo",
                    "sync_catalog": {
                        "streams": [
                            {
                                "stream": {},
                                "config": {},
                            }
                        ]
                    },
                    "schedule": {"units": 1, "time_unit": "days"},
                    "status": "active",
                    "resource_requirements": {"cpu_request": "foo", "cpu_limit": "foo", "memory_request": "foo", "memory_limit": "foo"},
                },
            },
        ]
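    # The legacy fixtures above use camelCase keys (namespaceDefinition,
    # syncCatalog, resourceRequirements, ...) and the deprecated "schedule"
    # field; test__check_for_legacy_connection_configuration_keys further down
    # relies on them to assert such configurations are rejected with
    # InvalidConfigurationError.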
    @pytest.mark.parametrize(
        "state",
        [None, resources.ResourceState("config_path", "workspace_id", "resource_id", 123, "abc")],
    )
    def test_init(self, mocker, mock_api_client, state, connection_configuration):
        assert resources.Connection.__base__ == resources.BaseResource
        mocker.patch.object(resources.Connection, "resource_id", "foo")
        connection = resources.Connection(mock_api_client, "workspace_id", connection_configuration, "bar.yaml")
        mocker.patch.object(connection, "state", state)
        assert connection.api == resources.web_backend_api.WebBackendApi
        assert connection.create_function_name == "web_backend_create_connection"
        assert connection.update_function_name == "web_backend_update_connection"
        assert connection.resource_id_field == "connection_id"
        assert connection.resource_type == "connection"
        assert connection.APPLY_PRIORITY == 1

        assert connection.update_payload == resources.WebBackendConnectionUpdate(
            connection_id=connection.resource_id, **connection.configuration
        )
        if state is None:
            assert connection.get_payload is None
        else:
            assert connection.get_payload == resources.WebBackendConnectionRequestBody(
                connection_id=state.resource_id, with_refreshed_catalog=False
            )

    @pytest.mark.parametrize("file_not_found_error", [False, True])
    def test_source_id(self, mocker, mock_api_client, connection_configuration, file_not_found_error):
        assert resources.Connection.__base__ == resources.BaseResource
        mocker.patch.object(resources.Connection, "resource_id", "foo")
        if file_not_found_error:
            mocker.patch.object(
                resources.ResourceState, "from_configuration_path_and_workspace", mocker.Mock(side_effect=FileNotFoundError())
            )
        else:
            mocker.patch.object(
                resources.ResourceState,
                "from_configuration_path_and_workspace",
                mocker.Mock(return_value=mocker.Mock(resource_id="expected_source_id")),
            )

        connection = resources.Connection(mock_api_client, "workspace_id", connection_configuration, "bar.yaml")
        if file_not_found_error:
            with pytest.raises(resources.MissingStateError):
                connection.source_id
        else:
            source_id = connection.source_id
            assert source_id == "expected_source_id"
            resources.ResourceState.from_configuration_path_and_workspace.assert_called_with(
                connection_configuration["source_configuration_path"], connection.workspace_id
            )

    @pytest.mark.parametrize("file_not_found_error", [False, True])
    def test_destination_id(self, mocker, mock_api_client, connection_configuration, file_not_found_error):
        assert resources.Connection.__base__ == resources.BaseResource
        mocker.patch.object(resources.Connection, "resource_id", "foo")
        if file_not_found_error:
            mocker.patch.object(
                resources.ResourceState, "from_configuration_path_and_workspace", mocker.Mock(side_effect=FileNotFoundError())
            )
        else:
            mocker.patch.object(
                resources.ResourceState,
                "from_configuration_path_and_workspace",
                mocker.Mock(return_value=mocker.Mock(resource_id="expected_destination_id")),
            )

        connection = resources.Connection(mock_api_client, "workspace_id", connection_configuration, "bar.yaml")
        if file_not_found_error:
            with pytest.raises(resources.MissingStateError):
                connection.destination_id
        else:
            destination_id = connection.destination_id
            assert destination_id == "expected_destination_id"
            resources.ResourceState.from_configuration_path_and_workspace.assert_called_with(
                connection_configuration["destination_configuration_path"], connection.workspace_id
            )

    def test_create_payload_no_normalization(self, mocker, mock_api_client, connection_configuration):
        assert resources.Connection.__base__ == resources.BaseResource
        mocker.patch.object(resources.Connection, "resource_id", "foo")
        mocker.patch.object(resources.Connection, "source_id", "source_id")
        mocker.patch.object(resources.Connection, "destination_id", "destination_id")
        connection = resources.Connection(mock_api_client, "workspace_id", connection_configuration, "bar.yaml")
        assert connection.create_payload == resources.WebBackendConnectionCreate(
            name=connection.resource_name,
            source_id=connection.source_id,
            destination_id=connection.destination_id,
            **connection.configuration,
        )
        assert "operations" not in connection.create_payload

    def test_create_payload_with_normalization(self, mocker, mock_api_client, connection_configuration_with_normalization):
        assert resources.Connection.__base__ == resources.BaseResource
        mocker.patch.object(resources.Connection, "resource_id", "foo")
        mocker.patch.object(resources.Connection, "source_id", "source_id")
        mocker.patch.object(resources.Connection, "destination_id", "destination_id")
        connection = resources.Connection(mock_api_client, "workspace_id", connection_configuration_with_normalization, "bar.yaml")
        assert connection.create_payload == resources.WebBackendConnectionCreate(
            name=connection.resource_name,
            source_id=connection.source_id,
            destination_id=connection.destination_id,
            **connection.configuration,
        )
        assert isinstance(connection.create_payload["operations"][0], OperationCreate)
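    # create_payload is built from the resolved source_id/destination_id plus
    # the deserialized configuration; when "operations" are configured they are
    # deserialized into OperationCreate models, otherwise the key is absent
    # from the payload entirely.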
    def test_update_payload_no_normalization(self, mocker, mock_api_client, connection_configuration):
        assert resources.Connection.__base__ == resources.BaseResource
        mocker.patch.object(resources.Connection, "resource_id", "foo")
        mocker.patch.object(resources.Connection, "source_id", "source_id")
        mocker.patch.object(resources.Connection, "destination_id", "destination_id")
        connection = resources.Connection(mock_api_client, "workspace_id", connection_configuration, "bar.yaml")
        assert connection.update_payload == resources.WebBackendConnectionUpdate(
            connection_id=connection.resource_id,
            **connection.configuration,
        )
        assert "operations" not in connection.update_payload

    def test_update_payload_with_normalization(self, mocker, mock_api_client, connection_configuration_with_normalization):
        assert resources.Connection.__base__ == resources.BaseResource
        mocker.patch.object(resources.Connection, "resource_id", "foo")
        mocker.patch.object(resources.Connection, "source_id", "source_id")
        mocker.patch.object(resources.Connection, "destination_id", "destination_id")
        connection = resources.Connection(mock_api_client, "workspace_id", connection_configuration_with_normalization, "bar.yaml")
        assert connection.update_payload == resources.WebBackendConnectionUpdate(
            connection_id=connection.resource_id,
            **connection.configuration,
        )
        assert isinstance(connection.update_payload["operations"][0], WebBackendOperationCreateOrUpdate)

    @pytest.mark.parametrize(
        "remote_resource",
        [
            {
                "name": "foo",
                "source_id": "bar",
                "destination_id": "fooo",
                "connection_id": "baar",
                "operation_ids": "foooo",
                "foo": "bar",
            },
            {
                "name": "foo",
                "source_id": "bar",
                "destination_id": "fooo",
                "connection_id": "baar",
                "operation_ids": "foooo",
                "foo": "bar",
                "operations": [],
            },
            {
                "name": "foo",
                "source_id": "bar",
                "destination_id": "fooo",
                "connection_id": "baar",
                "operation_ids": "foooo",
                "foo": "bar",
                "operations": [{"workspace_id": "foo", "operation_id": "foo", "operator_configuration": {"normalization": "foo"}}],
            },
            {
                "name": "foo",
                "source_id": "bar",
                "destination_id": "fooo",
                "connection_id": "baar",
                "operation_ids": "foooo",
                "foo": "bar",
                "operations": [{"workspace_id": "foo", "operation_id": "foo", "operator_configuration": {"dbt": "foo"}}],
            },
        ],
    )
    def test_get_remote_comparable_configuration(self, mocker, mock_api_client, connection_configuration, remote_resource):
        mocker.patch.object(
            resources.Connection,
            "remote_resource",
            mocker.Mock(to_dict=mocker.Mock(return_value=remote_resource)),
        )
        resource = resources.Connection(mock_api_client, "workspace_id", connection_configuration, "bar.yaml")
        comparable = resource._get_remote_comparable_configuration()
        resource.remote_resource.to_dict.assert_called_once()

        assert isinstance(comparable, dict)
        assert all([k not in comparable for k in resource.remote_root_level_keys_to_filter_out_for_comparison])
        if "operations" in remote_resource and "operations" in comparable:
            assert all([k not in comparable["operations"][0] for k in resource.remote_operation_level_keys_to_filter_out])
            if remote_resource["operations"][0]["operator_configuration"].get("normalization") is not None:
                assert "dbt" not in remote_resource["operations"][0]["operator_configuration"]
            if remote_resource["operations"][0]["operator_configuration"].get("dbt") is not None:
                assert "normalization" not in remote_resource["operations"][0]["operator_configuration"]
        if "operations" in remote_resource and len(remote_resource["operations"]) == 0:
            assert "operations" not in comparable

    def test_create(self, mocker, mock_api_client, connection_configuration):
        mocker.patch.object(resources.Connection, "_create_or_update")
        mocker.patch.object(resources.Connection, "source_id", "source_id")
        mocker.patch.object(resources.Connection, "destination_id", "destination_id")
        resource = resources.Connection(mock_api_client, "workspace_id", connection_configuration, "bar.yaml")
        create_result = resource.create()
        assert create_result == resource._create_or_update.return_value
        resource._create_or_update.assert_called_with(resource._create_fn, resource.create_payload)

    def test_update(self, mocker, mock_api_client, connection_configuration):
        mocker.patch.object(resources.Connection, "_create_or_update")
        resource = resources.Connection(mock_api_client, "workspace_id", connection_configuration, "bar.yaml")
        resource.state = mocker.Mock(resource_id="foo")
        update_result = resource.update()
        assert update_result == resource._create_or_update.return_value
        resource._create_or_update.assert_called_with(resource._update_fn, resource.update_payload)

    def test__deserialize_raw_configuration(self, mock_api_client, connection_configuration, connection_configuration_with_manual_schedule):
        resource = resources.Connection(mock_api_client, "workspace_id", connection_configuration, "bar.yaml")
        configuration = resource._deserialize_raw_configuration()
        assert isinstance(configuration["sync_catalog"], AirbyteCatalog)
        assert configuration["namespace_definition"] == NamespaceDefinitionType(
            connection_configuration["configuration"]["namespace_definition"]
        )
        assert configuration["schedule_type"] == ConnectionScheduleType(connection_configuration["configuration"]["schedule_type"])
        assert (
            configuration["schedule_data"].to_dict()
            == ConnectionScheduleDataBasicSchedule(**connection_configuration["configuration"]["schedule_data"]).to_dict()
        )
        assert configuration["resource_requirements"] == ResourceRequirements(
            **connection_configuration["configuration"]["resource_requirements"]
        )
        assert configuration["status"] == ConnectionStatus(connection_configuration["configuration"]["status"])
        assert list(configuration.keys()) == [
            "namespace_definition",
            "namespace_format",
            "prefix",
            "sync_catalog",
            "schedule_type",
            "schedule_data",
            "status",
            "resource_requirements",
            "non_breaking_changes_preference",
            "geography",
        ]

        resource = resources.Connection(mock_api_client, "workspace_id", connection_configuration_with_manual_schedule, "bar.yaml")
        configuration = resource._deserialize_raw_configuration()
        assert configuration["schedule_type"] == ConnectionScheduleType(
            connection_configuration_with_manual_schedule["configuration"]["schedule_type"]
        )
        assert configuration["schedule_data"] is None

    def test__deserialize_operations(self, mock_api_client, connection_configuration):
        resource = resources.Connection(mock_api_client, "workspace_id", connection_configuration, "bar.yaml")
        operations = [
            {
                "operator_configuration": {"operator_type": "normalization", "normalization": {"option": "basic"}},
                "name": "operation-with-normalization",
            },
            {
                "operator_configuration": {
                    "operator_type": "dbt",
                    "dbt": {
                        "dbt_arguments": "run",
                        "docker_image": "fishtownanalytics/dbt:0.19.1",
                        "git_repo_branch": "my-branch-name",
                        "git_repo_url": "https://github.com/airbytehq/airbyte",
                    },
                },
                "name": "operation-with-custom_dbt",
            },
        ]
        deserialized_operations = resource._deserialize_operations(operations, OperationCreate)
        assert len(deserialized_operations) == 2
        assert all([isinstance(o, OperationCreate) for o in deserialized_operations])
        assert "normalization" in deserialized_operations[0]["operator_configuration"] and deserialized_operations[0][
            "operator_configuration"
        ]["operator_type"] == OperatorType("normalization")
        assert "dbt" in deserialized_operations[1]["operator_configuration"]
        assert deserialized_operations[1]["operator_configuration"]["operator_type"] == OperatorType("dbt")

        with pytest.raises(ValueError):
            resource._deserialize_operations(
                [
                    {
                        "operator_configuration": {"operator_type": "not-supported", "normalization": {"option": "basic"}},
                        "name": "operation-not-supported",
                    },
                ],
                OperationCreate,
            )
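    # _deserialize_operations dispatches on operator_type: "normalization" and
    # "dbt" map onto the given OperationCreate-like model, while anything else
    # raises ValueError before a request is ever built.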
    def test__create_configured_catalog(self, mock_api_client, connection_configuration):
        resource = resources.Connection(mock_api_client, "workspace_id", connection_configuration, "bar.yaml")
        created_catalog = resource._create_configured_catalog(connection_configuration["configuration"]["sync_catalog"])
        stream, config = (
            connection_configuration["configuration"]["sync_catalog"]["streams"][0]["stream"],
            connection_configuration["configuration"]["sync_catalog"]["streams"][0]["config"],
        )

        assert len(created_catalog.streams) == len(connection_configuration["configuration"]["sync_catalog"]["streams"])
        assert created_catalog.streams[0].stream.name == stream["name"]
        assert created_catalog.streams[0].stream.json_schema == stream["json_schema"]
        assert created_catalog.streams[0].stream.supported_sync_modes == stream["supported_sync_modes"]
        assert created_catalog.streams[0].stream.source_defined_cursor == stream["source_defined_cursor"]
        assert created_catalog.streams[0].stream.namespace == stream["namespace"]
        assert created_catalog.streams[0].stream.source_defined_primary_key == stream["source_defined_primary_key"]
        assert created_catalog.streams[0].stream.default_cursor_field == stream["default_cursor_field"]

        assert created_catalog.streams[0].config.sync_mode == config["sync_mode"]
        assert created_catalog.streams[0].config.cursor_field == config["cursor_field"]
        assert created_catalog.streams[0].config.destination_sync_mode == config["destination_sync_mode"]
        assert created_catalog.streams[0].config.primary_key == config["primary_key"]
        assert created_catalog.streams[0].config.alias_name == config["alias_name"]
        assert created_catalog.streams[0].config.selected == config["selected"]

    def test__check_for_legacy_connection_configuration_keys(
        self, mock_api_client, connection_configuration, legacy_connection_configurations
    ):
        resource = resources.Connection(mock_api_client, "workspace_id", connection_configuration, "bar.yaml")
        assert resource._check_for_legacy_connection_configuration_keys(connection_configuration["configuration"]) is None
        for legacy_configuration in legacy_connection_configurations:
            with pytest.raises(resources.InvalidConfigurationError):
                resource._check_for_legacy_connection_configuration_keys(legacy_configuration["configuration"])


@pytest.mark.parametrize(
    "local_configuration,resource_to_mock,expected_error",
    [
        ({"definition_type": "source"}, "Source", None),
        ({"definition_type": "destination"}, "Destination", None),
        ({"definition_type": "connection"}, "Connection", None),
        ({"definition_type": "not_existing"}, None, NotImplementedError),
    ],
)
def test_factory(mocker, mock_api_client, local_configuration, resource_to_mock, expected_error):
    mocker.patch.object(resources, "yaml")
    if resource_to_mock is not None:
        mocker.patch.object(resources, resource_to_mock)
    resources.yaml.load.return_value = local_configuration
    with patch("builtins.open", mock_open(read_data="data")) as mock_file:
        if not expected_error:
            resource = resources.factory(mock_api_client, "workspace_id", "my_config.yaml")
            resources.yaml.load.assert_called_with(mock_file.return_value, yaml_loaders.EnvVarLoader)
            assert resource == getattr(resources, resource_to_mock).return_value
            mock_file.assert_called_with("my_config.yaml", "r")
        else:
            with pytest.raises(expected_error):
                resources.factory(mock_api_client, "workspace_id", "my_config.yaml")
            mock_file.assert_called_with("my_config.yaml", "r")
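# factory is the single entry point used by the apply command: it loads the
# YAML with EnvVarLoader (so ${ENV_VAR} references are expanded), routes on
# definition_type to Source, Destination or Connection, and raises
# NotImplementedError for unknown types.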
@@ -1,37 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import os

import pytest
import yaml
from octavia_cli.apply import yaml_loaders


def test_env_var_replacer(mocker):
    mocker.patch.object(yaml_loaders, "os")
    mock_node = mocker.Mock()
    assert yaml_loaders.env_var_replacer(mocker.Mock(), mock_node) == yaml_loaders.os.path.expandvars.return_value
    yaml_loaders.os.path.expandvars.assert_called_with(mock_node.value)


@pytest.fixture
def test_env_vars():
    old_environ = dict(os.environ)
    secret_env_vars = {"MY_SECRET_PASSWORD": "🤫", "ANOTHER_SECRET_VALUE": "🔒"}
    os.environ.update(secret_env_vars)
    yield secret_env_vars
    os.environ.clear()
    os.environ.update(old_environ)


def test_env_var_loader(test_env_vars):
    assert yaml_loaders.EnvVarLoader.yaml_implicit_resolvers[None] == [("!environment_variable", yaml_loaders.ENV_VAR_MATCHER_PATTERN)]
    assert yaml_loaders.EnvVarLoader.yaml_constructors["!environment_variable"] == yaml_loaders.env_var_replacer
    test_yaml = "my_secret_password: ${MY_SECRET_PASSWORD}\nanother_secret_value: ${ANOTHER_SECRET_VALUE}"
    deserialized = yaml.load(test_yaml, yaml_loaders.EnvVarLoader)
    assert deserialized == {
        "my_secret_password": test_env_vars["MY_SECRET_PASSWORD"],
        "another_secret_value": test_env_vars["ANOTHER_SECRET_VALUE"],
    }
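# A minimal usage sketch of the loader under test (assuming MY_SECRET_PASSWORD
# is exported in the environment), which is how octavia keeps credentials out
# of versioned YAML:
#   import yaml
#   from octavia_cli.apply import yaml_loaders
#   config = yaml.load("password: ${MY_SECRET_PASSWORD}", yaml_loaders.EnvVarLoader)
#   # -> {"password": "<value of $MY_SECRET_PASSWORD>"}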
@@ -1,57 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import click
import pytest
from octavia_cli import base_commands


class TestOctaviaCommand:
    @pytest.fixture
    def octavia_command(self):
        octavia_command = base_commands.OctaviaCommand("test_command")
        assert isinstance(octavia_command, click.Command)
        return octavia_command

    def test_make_context(self, mocker, octavia_command):
        mock_parent_ctx = mocker.Mock()
        parent_make_context = mocker.Mock()
        mocker.patch.object(click.Command, "make_context", parent_make_context)
        made_context = octavia_command.make_context("my_info_name", ["arg1", "arg2"], parent=mock_parent_ctx, foo="foo", bar="bar")
        parent_make_context.assert_called_with("my_info_name", ["arg1", "arg2"], mock_parent_ctx, foo="foo", bar="bar")
        assert made_context == parent_make_context.return_value

    @pytest.mark.parametrize("error", [Exception(), click.exceptions.Exit(0), click.exceptions.Exit(1)])
    def test_make_context_error(self, mocker, octavia_command, mock_telemetry_client, error):
        mock_parent_ctx = mocker.Mock(obj={"TELEMETRY_CLIENT": mock_telemetry_client})
        parent_make_context = mocker.Mock(side_effect=error)
        mocker.patch.object(click.Command, "make_context", parent_make_context)
        with pytest.raises(type(error)):
            octavia_command.make_context("my_info_name", ["arg1", "arg2"], parent=mock_parent_ctx, foo="foo", bar="bar")
        if isinstance(error, click.exceptions.Exit) and error.exit_code == 0:
            mock_telemetry_client.send_command_telemetry.assert_called_with(
                mock_parent_ctx, extra_info_name="my_info_name", is_help=True
            )
        else:
            mock_telemetry_client.send_command_telemetry.assert_called_with(
                mock_parent_ctx, error=error, extra_info_name="my_info_name"
            )

    def test_invoke(self, mocker, octavia_command, mock_telemetry_client):
        mock_ctx = mocker.Mock(obj={"TELEMETRY_CLIENT": mock_telemetry_client})
        parent_invoke = mocker.Mock()
        mocker.patch.object(click.Command, "invoke", parent_invoke)
        result = octavia_command.invoke(mock_ctx)
        parent_invoke.assert_called_with(mock_ctx)
        mock_telemetry_client.send_command_telemetry.assert_called_with(mock_ctx)
        assert result == parent_invoke.return_value

    def test_invoke_error(self, mocker, octavia_command, mock_telemetry_client):
        mock_ctx = mocker.Mock(obj={"TELEMETRY_CLIENT": mock_telemetry_client})
        error = Exception()
        parent_invoke = mocker.Mock(side_effect=error)
        mocker.patch.object(click.Command, "invoke", parent_invoke)
        with pytest.raises(Exception):
            octavia_command.invoke(mock_ctx)
        mock_telemetry_client.send_command_telemetry.assert_called_with(mock_ctx, error=error)
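For context on what these tests exercise: `OctaviaCommand` is a `click.Command` subclass that reports command outcomes to a telemetry client stored on the click context. A hedged sketch consistent with the assertions above, not the removed implementation verbatim:

```python
import click


class OctaviaCommand(click.Command):
    """Sketch only: a click.Command that reports successes and failures to telemetry."""

    def make_context(self, info_name, args, parent=None, **extra):
        telemetry_client = parent.obj["TELEMETRY_CLIENT"]
        try:
            return super().make_context(info_name, args, parent, **extra)
        except click.exceptions.Exit as e:
            if e.exit_code == 0:
                # A zero-code Exit during context creation is a --help invocation.
                telemetry_client.send_command_telemetry(parent, extra_info_name=info_name, is_help=True)
            else:
                telemetry_client.send_command_telemetry(parent, error=e, extra_info_name=info_name)
            raise e
        except Exception as e:
            telemetry_client.send_command_telemetry(parent, error=e, extra_info_name=info_name)
            raise e

    def invoke(self, ctx):
        telemetry_client = ctx.obj["TELEMETRY_CLIENT"]
        try:
            result = super().invoke(ctx)
        except Exception as e:
            telemetry_client.send_command_telemetry(ctx, error=e)
            raise e
        telemetry_client.send_command_telemetry(ctx)
        return result
```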
@@ -1,85 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import os
import shutil
import tempfile
from pathlib import Path

import airbyte_api_client
import pytest
from airbyte_api_client.model.workspace_id_request_body import WorkspaceIdRequestBody
from octavia_cli import check_context
from urllib3.exceptions import MaxRetryError


@pytest.fixture
def mock_api_client(mocker):
    return mocker.Mock()


def test_api_check_health_available(mock_api_client, mocker):
    mocker.patch.object(check_context, "health_api")
    mock_api_response = mocker.Mock(available=True)
    check_context.health_api.HealthApi.return_value.get_health_check.return_value = mock_api_response

    assert check_context.check_api_health(mock_api_client) is None
    check_context.health_api.HealthApi.assert_called_with(mock_api_client)
    api_instance = check_context.health_api.HealthApi.return_value
    api_instance.get_health_check.assert_called()


def test_api_check_health_unavailable(mock_api_client, mocker):
    mocker.patch.object(check_context, "health_api")
    mock_api_response = mocker.Mock(available=False)
    check_context.health_api.HealthApi.return_value.get_health_check.return_value = mock_api_response
    with pytest.raises(check_context.UnhealthyApiError):
        check_context.check_api_health(mock_api_client)


def test_api_check_health_unreachable_api_exception(mock_api_client, mocker):
    mocker.patch.object(check_context, "health_api")
    check_context.health_api.HealthApi.return_value.get_health_check.side_effect = airbyte_api_client.ApiException()
    with pytest.raises(check_context.UnreachableAirbyteInstanceError):
        check_context.check_api_health(mock_api_client)


def test_api_check_health_unreachable_max_retry_error(mock_api_client, mocker):
    mocker.patch.object(check_context, "health_api")
    check_context.health_api.HealthApi.return_value.get_health_check.side_effect = MaxRetryError("foo", "bar")
    with pytest.raises(check_context.UnreachableAirbyteInstanceError):
        check_context.check_api_health(mock_api_client)


def test_check_workspace_exists(mock_api_client, mocker):
    mocker.patch.object(check_context, "workspace_api")
    mock_api_instance = mocker.Mock()
    check_context.workspace_api.WorkspaceApi.return_value = mock_api_instance
    assert check_context.check_workspace_exists(mock_api_client, "foo") is None
    check_context.workspace_api.WorkspaceApi.assert_called_with(mock_api_client)
    mock_api_instance.get_workspace.assert_called_with(WorkspaceIdRequestBody("foo"), _check_return_type=False)


def test_check_workspace_exists_error(mock_api_client, mocker):
    mocker.patch.object(check_context, "workspace_api")
    check_context.workspace_api.WorkspaceApi.return_value.get_workspace.side_effect = airbyte_api_client.ApiException()
    with pytest.raises(check_context.WorkspaceIdError):
        check_context.check_workspace_exists(mock_api_client, "foo")


@pytest.fixture
def project_directories():
    dirpath = tempfile.mkdtemp()
    yield str(Path(dirpath).parent.absolute()), [os.path.basename(dirpath)]
    shutil.rmtree(dirpath)


def test_check_is_initialized(mocker, project_directories):
    project_directory, sub_directories = project_directories
    mocker.patch.object(check_context, "REQUIRED_PROJECT_DIRECTORIES", sub_directories)
    assert check_context.check_is_initialized(project_directory)


def test_check_not_initialized():
    assert not check_context.check_is_initialized(".")
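The behavior pinned down above: `check_api_health` wraps the health endpoint and maps transport failures onto CLI-level errors. A minimal sketch under those assumptions; the exception messages are illustrative, only the control flow is implied by the tests:

```python
import airbyte_api_client
from airbyte_api_client.api import health_api
from urllib3.exceptions import MaxRetryError


class UnhealthyApiError(Exception):
    pass


class UnreachableAirbyteInstanceError(Exception):
    pass


def check_api_health(api_client: airbyte_api_client.ApiClient) -> None:
    api_instance = health_api.HealthApi(api_client)
    try:
        api_response = api_instance.get_health_check()
    except (airbyte_api_client.ApiException, MaxRetryError) as e:
        # Both API-level and connection-level failures surface the same way.
        raise UnreachableAirbyteInstanceError("Could not reach your Airbyte instance.") from e
    if not api_response.available:
        raise UnhealthyApiError("Your Airbyte instance is not ready to receive requests.")
```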
@@ -1,243 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

from typing import List, Optional

import click
import pkg_resources
import pytest
from airbyte_api_client.model.workspace_id_request_body import WorkspaceIdRequestBody
from click.testing import CliRunner
from octavia_cli import entrypoint
from octavia_cli.api_http_headers import ApiHttpHeader


@click.command()
@click.pass_context
def dumb(ctx):
    pass


@pytest.mark.parametrize(
    "option_based_api_http_headers, api_http_headers_file_path",
    [
        ([("foo", "bar")], "api_http_headers_file_path"),
        ([], None),
        (None, None),
    ],
)
def test_set_context_object(mocker, option_based_api_http_headers, api_http_headers_file_path):
    mocker.patch.object(entrypoint, "TelemetryClient")
    mocker.patch.object(entrypoint, "build_user_agent")
    mocker.patch.object(entrypoint, "merge_api_headers")
    mocker.patch.object(entrypoint, "get_api_client")
    mocker.patch.object(entrypoint, "get_workspace_id")
    mocker.patch.object(entrypoint, "check_is_initialized")
    mocker.patch.object(entrypoint, "get_anonymous_data_collection")
    mock_ctx = mocker.Mock(obj={})
    built_context = entrypoint.set_context_object(
        mock_ctx,
        "my_airbyte_url",
        "my_airbyte_username",
        "my_airbyte_password",
        "my_workspace_id",
        "enable_telemetry",
        option_based_api_http_headers,
        api_http_headers_file_path,
    )
    entrypoint.TelemetryClient.assert_called_with("enable_telemetry")
    mock_ctx.ensure_object.assert_called_with(dict)
    assert built_context.obj == {
        "OCTAVIA_VERSION": pkg_resources.require("octavia-cli")[0].version,
        "TELEMETRY_CLIENT": entrypoint.TelemetryClient.return_value,
        "WORKSPACE_ID": entrypoint.get_workspace_id.return_value,
        "API_CLIENT": entrypoint.get_api_client.return_value,
        "PROJECT_IS_INITIALIZED": entrypoint.check_is_initialized.return_value,
        "ANONYMOUS_DATA_COLLECTION": entrypoint.get_anonymous_data_collection.return_value,
    }
    entrypoint.build_user_agent.assert_called_with(built_context.obj["OCTAVIA_VERSION"])
    entrypoint.merge_api_headers.assert_called_with(option_based_api_http_headers, api_http_headers_file_path)
    entrypoint.get_api_client.assert_called_with(
        "my_airbyte_url",
        "my_airbyte_username",
        "my_airbyte_password",
        entrypoint.build_user_agent.return_value,
        entrypoint.merge_api_headers.return_value,
    )


def test_set_context_object_error(mocker):
    mocker.patch.object(entrypoint, "TelemetryClient")
    mock_ctx = mocker.Mock(obj={})
    mock_ctx.ensure_object.side_effect = NotImplementedError()
    with pytest.raises(NotImplementedError):
        entrypoint.set_context_object(
            mock_ctx,
            "my_airbyte_url",
            "my_airbyte_username",
            "my_airbyte_password",
            "my_workspace_id",
            "enable_telemetry",
            [("foo", "bar")],
            "api_http_headers_file_path",
        )
    entrypoint.TelemetryClient.return_value.send_command_telemetry.assert_called_with(
        mock_ctx, error=mock_ctx.ensure_object.side_effect
    )


@pytest.mark.parametrize(
    "options, expected_exit_code",
    [
        (["--airbyte-url", "test-airbyte-url"], 0),
        (["--airbyte-url", "test-airbyte-url", "--enable-telemetry"], 0),
        (["--airbyte-url", "test-airbyte-url", "--enable-telemetry foo"], 2),
        (["--airbyte-url", "test-airbyte-url", "--disable-telemetry"], 0),
        (["--airbyte-url", "test-airbyte-url", "--api-http-headers-file-path", "path-does-not-exist"], 2),
        (["--airbyte-url", "test-airbyte-url", "--api-http-headers-file-path", "path-exists"], 0),
        (["--airbyte-url", "test-airbyte-url", "--api-http-header", "Content-Type", "application/json"], 0),
        (
            [
                "--airbyte-url",
                "test-airbyte-url",
                "--api-http-header",
                "Content-Type",
                "application/json",
                "--api-http-header",
                "Authorization",
                "'Bearer XXX'",
            ],
            0,
        ),
        (
            [
                "--airbyte-url",
                "test-airbyte-url",
                "--api-http-header",
                "Content-Type",
                "--api-http-header",
                "Authorization",
                "'Bearer XXX'",
            ],
            2,
        ),
    ],
)
def test_octavia(tmp_path, mocker, options, expected_exit_code):
    if "path-exists" in options:
        tmp_file = tmp_path / "path_exists.yaml"
        tmp_file.write_text("foobar")
        options[options.index("path-exists")] = tmp_file

    mocker.patch.object(entrypoint, "click")
    mocker.patch.object(
        entrypoint,
        "set_context_object",
        mocker.Mock(return_value=mocker.Mock(obj={"WORKSPACE_ID": "api-defined-workspace-id", "PROJECT_IS_INITIALIZED": True})),
    )
    entrypoint.octavia.add_command(dumb)
    runner = CliRunner()
    result = runner.invoke(entrypoint.octavia, options + ["dumb"], obj={})
    expected_message = "🐙 - Octavia is targetting your Airbyte instance running at test-airbyte-url on workspace api-defined-workspace-id."
    assert result.exit_code == expected_exit_code
    if expected_exit_code == 0:
        entrypoint.click.style.assert_called_with(expected_message, fg="green")
        entrypoint.click.echo.assert_called_with(entrypoint.click.style.return_value)


def test_octavia_not_initialized(mocker):
    mocker.patch.object(entrypoint, "click")
    mocker.patch.object(
        entrypoint,
        "set_context_object",
        mocker.Mock(return_value=mocker.Mock(obj={"WORKSPACE_ID": "api-defined-workspace-id", "PROJECT_IS_INITIALIZED": False})),
    )
    entrypoint.octavia.add_command(dumb)
    runner = CliRunner()
    result = runner.invoke(entrypoint.octavia, ["--airbyte-url", "test-airbyte-url", "dumb"], obj={})
    entrypoint.click.style.assert_called_with("🐙 - Project is not yet initialized.", fg="red", bold=True)
    entrypoint.click.echo.assert_called_with(entrypoint.click.style.return_value)
    assert result.exit_code == 0


@pytest.mark.parametrize(
    "api_http_headers",
    [
        None,
        [],
        [ApiHttpHeader(name="Authorization", value="Basic dXNlcjE6cGFzc3dvcmQ=")],
        [ApiHttpHeader(name="Authorization", value="Basic dXNlcjE6cGFzc3dvcmQ="), ApiHttpHeader(name="Header", value="header_value")],
    ],
)
def test_get_api_client(mocker, api_http_headers: Optional[List[str]]):
    mocker.patch.object(entrypoint, "airbyte_api_client")
    entrypoint.airbyte_api_client.Configuration.return_value.get_basic_auth_token.return_value = "my_basic_auth_token"
    mocker.patch.object(entrypoint, "check_api_health")
    mocker.patch.object(entrypoint, "set_api_headers_on_api_client")
    api_client = entrypoint.get_api_client("test-url", "test-username", "test-password", "test-user-agent", api_http_headers)
    entrypoint.airbyte_api_client.Configuration.assert_called_with(host="test-url/api", username="test-username", password="test-password")
    entrypoint.airbyte_api_client.ApiClient.assert_called_with(entrypoint.airbyte_api_client.Configuration.return_value)
    assert entrypoint.airbyte_api_client.ApiClient.return_value.user_agent == "test-user-agent"
    if api_http_headers:
        entrypoint.set_api_headers_on_api_client.assert_called_with(entrypoint.airbyte_api_client.ApiClient.return_value, api_http_headers)
    entrypoint.check_api_health.assert_called_with(entrypoint.airbyte_api_client.ApiClient.return_value)
    assert api_client == entrypoint.airbyte_api_client.ApiClient.return_value


def test_get_workspace_id_user_defined(mocker):
    mock_api_client = mocker.Mock()
    mocker.patch.object(entrypoint, "check_workspace_exists")
    mocker.patch.object(entrypoint, "workspace_api")
    assert entrypoint.get_workspace_id(mock_api_client, "user-defined-workspace-id") == "user-defined-workspace-id"
    entrypoint.check_workspace_exists.assert_called_with(mock_api_client, "user-defined-workspace-id")


def test_get_workspace_id_api_defined(mocker):
    mock_api_client = mocker.Mock()
    mocker.patch.object(entrypoint, "check_workspace_exists")
    mocker.patch.object(entrypoint, "workspace_api")
    mock_api_instance = entrypoint.workspace_api.WorkspaceApi.return_value
    mock_api_instance.list_workspaces.return_value = mocker.Mock(workspaces=[{"workspaceId": "api-defined-workspace-id"}])
    assert entrypoint.get_workspace_id(mock_api_client, None) == "api-defined-workspace-id"
    entrypoint.workspace_api.WorkspaceApi.assert_called_with(mock_api_client)
    mock_api_instance.list_workspaces.assert_called_with(_check_return_type=False)


def test_get_anonymous_data_collection(mocker, mock_api_client):
    mocker.patch.object(entrypoint, "workspace_api")
    mock_api_instance = entrypoint.workspace_api.WorkspaceApi.return_value
    assert (
        entrypoint.get_anonymous_data_collection(mock_api_client, "my_workspace_id")
        == mock_api_instance.get_workspace.return_value.get.return_value
    )
    entrypoint.workspace_api.WorkspaceApi.assert_called_with(mock_api_client)
    mock_api_instance.get_workspace.assert_called_with(WorkspaceIdRequestBody("my_workspace_id"), _check_return_type=False)


def test_commands_in_octavia_group():
    octavia_commands = entrypoint.octavia.commands.values()
    for command in entrypoint.AVAILABLE_COMMANDS:
        assert command in octavia_commands


@pytest.mark.parametrize(
    "command",
    [entrypoint.delete],
)
def test_not_implemented_commands(command):
    runner = CliRunner()
    result = runner.invoke(command)
    assert result.exit_code == 1
    assert result.output.endswith("not yet implemented.\n")


def test_available_commands():
    assert entrypoint.AVAILABLE_COMMANDS == [
        entrypoint.list_commands._list,
        entrypoint.get_commands.get,
        entrypoint.import_commands._import,
        entrypoint.init_commands.init,
        entrypoint.generate_commands.generate,
        entrypoint.apply_commands.apply,
    ]
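`get_workspace_id` carries the main branching logic here: an explicit workspace id is validated, otherwise the first workspace the API lists is used. A sketch consistent with the two workspace tests; the helper names are the ones the tests patch, and the import paths are inferred, not confirmed by this diff:

```python
from airbyte_api_client.api import workspace_api
from octavia_cli.check_context import check_workspace_exists


def get_workspace_id(api_client, user_defined_workspace_id):
    # Honor an explicit workspace id after checking it exists; otherwise
    # fall back to the first workspace the API returns.
    if user_defined_workspace_id:
        check_workspace_exists(api_client, user_defined_workspace_id)
        return user_defined_workspace_id
    api_instance = workspace_api.WorkspaceApi(api_client)
    api_response = api_instance.list_workspaces(_check_return_type=False)
    return api_response.workspaces[0]["workspaceId"]
```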
@@ -1,3 +0,0 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#
@@ -1,116 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import pytest
from click.testing import CliRunner
from octavia_cli.apply.resources import NonExistingResourceError
from octavia_cli.generate import commands


@pytest.fixture
def context_object(mock_api_client, mock_telemetry_client):
    return {"PROJECT_IS_INITIALIZED": True, "API_CLIENT": mock_api_client, "WORKSPACE_ID": "foo", "TELEMETRY_CLIENT": mock_telemetry_client}


def test_generate_initialized(mocker, context_object):
    runner = CliRunner()
    mocker.patch.object(commands, "definitions")
    mocker.patch.object(commands, "ConnectorSpecificationRenderer", mocker.Mock())
    mock_renderer = commands.ConnectorSpecificationRenderer.return_value
    mock_renderer.write_yaml.return_value = "expected_output_path"
    result = runner.invoke(commands.generate, ["source", "uuid", "my_source"], obj=context_object)
    assert result.exit_code == 0


def test_generate_not_initialized(context_object):
    runner = CliRunner()
    context_object["PROJECT_IS_INITIALIZED"] = False
    result = runner.invoke(commands.generate, ["source", "uuid", "my_source"], obj=context_object)
    assert result.exit_code == 1
    assert result.output == "Error: Your octavia project is not initialized, please run 'octavia init' before running this command.\n"


def test_invalid_definition_type(context_object):
    runner = CliRunner()
    result = runner.invoke(commands.generate, ["random_definition", "uuid", "my_source"], obj=context_object)
    assert result.exit_code == 2


@pytest.mark.parametrize(
    "command,resource_name,definition_type",
    [
        (commands.source, "my_source", "source"),
        (commands.destination, "my_destination", "destination"),
    ],
)
def test_generate_source_or_destination(mocker, context_object, command, resource_name, definition_type):
    runner = CliRunner()
    mocker.patch.object(commands, "definitions")
    mocker.patch.object(commands, "ConnectorSpecificationRenderer", mocker.Mock())
    mock_renderer = commands.ConnectorSpecificationRenderer.return_value
    mock_renderer.write_yaml.return_value = "expected_output_path"
    result = runner.invoke(command, ["uuid", resource_name], obj=context_object)
    assert result.exit_code == 0
    assert result.output == f"✅ - Created the {definition_type} template for {resource_name} in expected_output_path.\n"
    commands.definitions.factory.assert_called_with(definition_type, context_object["API_CLIENT"], context_object["WORKSPACE_ID"], "uuid")
    commands.ConnectorSpecificationRenderer.assert_called_with(resource_name, commands.definitions.factory.return_value)
    mock_renderer.write_yaml.assert_called_with(project_path=".")


@pytest.fixture
def tmp_source_path(tmp_path):
    source_path = tmp_path / "my_source.yaml"
    source_path.write_text("foo")
    return source_path


@pytest.fixture
def tmp_destination_path(tmp_path):
    destination_path = tmp_path / "my_destination.yaml"
    destination_path.write_text("foo")
    return destination_path


@pytest.mark.parametrize(
    "source_created,destination_created",
    [(True, True), (False, True), (True, False), (False, False)],
)
def test_generate_connection(mocker, context_object, tmp_source_path, tmp_destination_path, source_created, destination_created):
    runner = CliRunner()
    mock_source = mocker.Mock(was_created=source_created)
    mock_destination = mocker.Mock(was_created=destination_created)

    mock_resource_factory = mocker.Mock(side_effect=[mock_source, mock_destination])
    mocker.patch.object(
        commands, "resources", mocker.Mock(factory=mock_resource_factory, NonExistingResourceError=NonExistingResourceError)
    )
    mocker.patch.object(commands, "ConnectionRenderer", mocker.Mock())
    mock_renderer = commands.ConnectionRenderer.return_value
    mock_renderer.write_yaml.return_value = "expected_output_path"
    cli_input = ["my_new_connection", "--source", tmp_source_path, "--destination", tmp_destination_path]
    result = runner.invoke(commands.connection, cli_input, obj=context_object)
    if source_created and destination_created:
        assert result.exit_code == 0
        assert result.output == "✅ - Created the connection template for my_new_connection in expected_output_path.\n"
        commands.resources.factory.assert_has_calls(
            [
                mocker.call(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], tmp_source_path),
                mocker.call(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], tmp_destination_path),
            ]
        )
        commands.ConnectionRenderer.assert_called_with("my_new_connection", mock_source, mock_destination)
        mock_renderer.write_yaml.assert_called_with(project_path=".")
    elif not source_created:
        assert (
            result.output
            == f"Error: The source defined at {tmp_source_path} does not exists. Please run octavia apply before creating this connection.\n"
        )
        assert result.exit_code == 1
    elif not destination_created:
        assert (
            result.output
            == f"Error: The destination defined at {tmp_destination_path} does not exists. Please run octavia apply before creating this connection.\n"
        )
        assert result.exit_code == 1
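How a generated command plugs these pieces together, sketched from the assertions in `test_generate_source_or_destination`. The decorator stack, import paths, and message text are assumptions; only the factory and renderer call shapes are implied by the tests:

```python
import click

from octavia_cli.generate import definitions
from octavia_cli.generate.renderers import ConnectorSpecificationRenderer


@click.command()
@click.argument("definition_id", type=click.STRING)
@click.argument("resource_name", type=click.STRING)
@click.pass_context
def source(ctx: click.Context, definition_id: str, resource_name: str):
    # Resolve the connector definition, render its spec to YAML, then confirm.
    definition = definitions.factory("source", ctx.obj["API_CLIENT"], ctx.obj["WORKSPACE_ID"], definition_id)
    renderer = ConnectorSpecificationRenderer(resource_name, definition)
    output_path = renderer.write_yaml(project_path=".")
    message = f"✅ - Created the source template for {resource_name} in {output_path}."
    click.echo(click.style(message, fg="green"))
```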
@@ -1,5 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

#
@@ -1,122 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import pytest
from airbyte_api_client.api import (
    destination_definition_api,
    destination_definition_specification_api,
    source_definition_api,
    source_definition_specification_api,
)
from airbyte_api_client.exceptions import ApiException
from airbyte_api_client.model.destination_definition_id_request_body import DestinationDefinitionIdRequestBody
from airbyte_api_client.model.source_definition_id_request_body import SourceDefinitionIdRequestBody
from octavia_cli.generate.definitions import (
    BaseDefinition,
    DefinitionNotFoundError,
    DefinitionSpecification,
    DestinationDefinition,
    DestinationDefinitionSpecification,
    SourceDefinition,
    SourceDefinitionSpecification,
    factory,
)


class TestBaseDefinition:
    @pytest.fixture
    def patch_base_class(self, mocker):
        # Mock abstract methods to enable instantiating abstract class
        mocker.patch.object(BaseDefinition, "__abstractmethods__", set())
        mocker.patch.object(BaseDefinition, "api", mocker.Mock())
        mocker.patch.object(BaseDefinition, "get_function_name", "foo")

    def test_init(self, patch_base_class, mock_api_client, mocker):
        mocker.patch.object(BaseDefinition, "_read", mocker.Mock())
        base_definition = BaseDefinition(mock_api_client, "my_definition_id")
        assert base_definition.specification is None
        assert base_definition.id == "my_definition_id"
        assert base_definition.api_instance == base_definition.api.return_value
        assert base_definition._api_data == base_definition._read.return_value
        base_definition.api.assert_called_with(mock_api_client)
        assert base_definition._get_fn_kwargs == {}
        assert base_definition._get_fn == getattr(base_definition.api, base_definition.get_function_name)

    def test_get_attr(self, patch_base_class, mock_api_client):
        base_definition = BaseDefinition(mock_api_client, "my_definition_id")
        base_definition._api_data = {"foo": "bar"}
        assert base_definition.foo == "bar"
        with pytest.raises(AttributeError):
            base_definition.not_existing

    def test_read_success(self, patch_base_class, mock_api_client, mocker):
        mocker.patch.object(BaseDefinition, "_get_fn", mocker.Mock())
        base_definition = BaseDefinition(mock_api_client, "my_definition_id")
        read_output = base_definition._read()
        assert read_output == base_definition._get_fn.return_value
        base_definition._get_fn.assert_called_with(base_definition.api_instance, **base_definition._get_fn_kwargs, _check_return_type=False)

    @pytest.mark.parametrize("status_code", [404, 422])
    def test_read_error_not_found(self, status_code, patch_base_class, mock_api_client, mocker):
        mocker.patch.object(BaseDefinition, "_get_fn", mocker.Mock(side_effect=ApiException(status=status_code)))
        with pytest.raises(DefinitionNotFoundError):
            BaseDefinition(mock_api_client, "my_definition_id")

    def test_read_error_other(self, patch_base_class, mock_api_client, mocker):
        expected_error = ApiException(status=42)
        mocker.patch.object(BaseDefinition, "_get_fn", mocker.Mock(side_effect=expected_error))
        with pytest.raises(ApiException) as e:
            BaseDefinition(mock_api_client, "my_definition_id")
        assert e.value == expected_error


class TestSourceDefinition:
    def test_init(self, mock_api_client):
        assert SourceDefinition.__base__ == BaseDefinition
        source_definition = SourceDefinition(mock_api_client, "source_id")
        assert source_definition.api == source_definition_api.SourceDefinitionApi
        assert source_definition.type == "source"
        assert source_definition.get_function_name == "get_source_definition"
        assert source_definition._get_fn_kwargs == {"source_definition_id_request_body": SourceDefinitionIdRequestBody("source_id")}


class TestDestinationDefinition:
    def test_init(self, mock_api_client):
        assert DestinationDefinition.__base__ == BaseDefinition
        destination_definition = DestinationDefinition(mock_api_client, "source_id")
        assert destination_definition.api == destination_definition_api.DestinationDefinitionApi
        assert destination_definition.type == "destination"
        assert destination_definition.get_function_name == "get_destination_definition"
        assert destination_definition._get_fn_kwargs == {
            "destination_definition_id_request_body": DestinationDefinitionIdRequestBody("source_id")
        }


class TestSourceDefinitionSpecification:
    def test_init(self, mock_api_client):
        assert SourceDefinitionSpecification.__base__ == DefinitionSpecification
        source_specification = SourceDefinitionSpecification(mock_api_client, "workspace_id", "source_id")
        assert source_specification.api == source_definition_specification_api.SourceDefinitionSpecificationApi
        assert source_specification.get_function_name == "get_source_definition_specification"


class TestDestinationDefinitionSpecification:
    def test_init(self, mock_api_client):
        assert DestinationDefinitionSpecification.__base__ == DefinitionSpecification
        destination_specification = DestinationDefinitionSpecification(mock_api_client, "workspace_id", "source_id")
        assert destination_specification.api == destination_definition_specification_api.DestinationDefinitionSpecificationApi
        assert destination_specification.get_function_name == "get_destination_definition_specification"


def test_factory(mock_api_client):
    source_definition = factory("source", mock_api_client, "workspace_id", "source_definition_id")
    assert isinstance(source_definition, SourceDefinition)
    assert isinstance(source_definition.specification, SourceDefinitionSpecification)

    destination_definition = factory("destination", mock_api_client, "workspace_id", "destination_definition_id")
    assert isinstance(destination_definition, DestinationDefinition)
    assert isinstance(destination_definition.specification, DestinationDefinitionSpecification)

    with pytest.raises(ValueError):
        factory("random", mock_api_client, "workspace_id", "random_definition_id")
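`test_factory` fixes the factory's contract: pair each definition with its specification and reject unknown types. A sketch that satisfies it, relying on the same classes the test file imports from `octavia_cli.generate.definitions` (the error message is an assumption):

```python
from octavia_cli.generate.definitions import (
    DestinationDefinition,
    DestinationDefinitionSpecification,
    SourceDefinition,
    SourceDefinitionSpecification,
)


def factory(definition_type, api_client, workspace_id, definition_id):
    # Build the definition and attach its specification; unknown types fail fast.
    if definition_type == "source":
        definition = SourceDefinition(api_client, definition_id)
        definition.specification = SourceDefinitionSpecification(api_client, workspace_id, definition_id)
    elif definition_type == "destination":
        definition = DestinationDefinition(api_client, definition_id)
        definition.specification = DestinationDefinitionSpecification(api_client, workspace_id, definition_id)
    else:
        raise ValueError(f"{definition_type} is not a valid definition type.")
    return definition
```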
@@ -1,415 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

from pathlib import Path
from unittest.mock import mock_open, patch

import pytest
import yaml
from airbyte_api_client.model.airbyte_catalog import AirbyteCatalog
from airbyte_api_client.model.airbyte_stream import AirbyteStream
from airbyte_api_client.model.airbyte_stream_and_configuration import AirbyteStreamAndConfiguration
from airbyte_api_client.model.airbyte_stream_configuration import AirbyteStreamConfiguration
from airbyte_api_client.model.destination_sync_mode import DestinationSyncMode
from airbyte_api_client.model.sync_mode import SyncMode
from octavia_cli.generate import renderers, yaml_dumpers


class TestFieldToRender:
    def test_init(self, mocker):
        mocker.patch.object(renderers.FieldToRender, "_get_one_of_values")
        mocker.patch.object(renderers, "get_object_fields")
        mocker.patch.object(renderers.FieldToRender, "_get_array_items")
        mocker.patch.object(renderers.FieldToRender, "_build_comment")
        mocker.patch.object(renderers.FieldToRender, "_get_default")

        field_metadata = mocker.Mock()
        field_to_render = renderers.FieldToRender("field_name", True, field_metadata)
        assert field_to_render.name == "field_name"
        assert field_to_render.required
        assert field_to_render.field_metadata == field_metadata
        assert field_to_render.one_of_values == field_to_render._get_one_of_values.return_value
        assert field_to_render.object_properties == renderers.get_object_fields.return_value
        assert field_to_render.array_items == field_to_render._get_array_items.return_value
        assert field_to_render.comment == field_to_render._build_comment.return_value
        assert field_to_render.default == field_to_render._get_default.return_value
        field_to_render._build_comment.assert_called_with(
            [
                field_to_render._get_secret_comment,
                field_to_render._get_required_comment,
                field_to_render._get_type_comment,
                field_to_render._get_description_comment,
                field_to_render._get_example_comment,
            ]
        )

    def test_get_attr(self):
        field_to_render = renderers.FieldToRender("field_name", True, {"foo": "bar"})
        assert field_to_render.foo == "bar"
        assert field_to_render.not_existing is None

    def test_is_array_of_objects(self):
        field_to_render = renderers.FieldToRender("field_name", True, {"foo": "bar"})
        field_to_render.type = "array"
        field_to_render.items = {"type": "object"}
        assert field_to_render.is_array_of_objects
        field_to_render.type = "array"
        field_to_render.items = {"type": "int"}
        assert not field_to_render.is_array_of_objects

    def test__get_one_of_values(self, mocker):
        field_to_render = renderers.FieldToRender("field_name", True, {"foo": "bar"})
        field_to_render.oneOf = False
        assert field_to_render._get_one_of_values() == []

        mocker.patch.object(renderers, "get_object_fields")
        one_of_value = mocker.Mock()
        field_to_render.oneOf = [one_of_value]
        one_of_values = field_to_render._get_one_of_values()
        renderers.get_object_fields.assert_called_once_with(one_of_value)
        assert one_of_values == [renderers.get_object_fields.return_value]

    def test__get_array_items(self, mocker):
        mocker.patch.object(renderers, "parse_fields")
        mocker.patch.object(renderers.FieldToRender, "is_array_of_objects", False)

        field_to_render = renderers.FieldToRender("field_name", True, {"foo": "bar"})
        assert field_to_render._get_array_items() == []
        field_to_render.items = {"required": [], "properties": []}
        mocker.patch.object(renderers.FieldToRender, "is_array_of_objects", True)
        assert field_to_render._get_array_items() == renderers.parse_fields.return_value
        renderers.parse_fields.assert_called_with([], [])

    def test__get_required_comment(self):
        field_to_render = renderers.FieldToRender("field_name", True, {"foo": "bar"})
        field_to_render.required = True
        assert field_to_render._get_required_comment() == "REQUIRED"
        field_to_render.required = False
        assert field_to_render._get_required_comment() == "OPTIONAL"

    @pytest.mark.parametrize(
        "_type,expected_comment",
        [("string", "string"), (["string", "null"], "string, null"), (None, None)],
    )
    def test__get_type_comment(self, _type, expected_comment):
        field_to_render = renderers.FieldToRender("field_name", True, {"foo": "bar"})
        field_to_render.type = _type
        assert field_to_render._get_type_comment() == expected_comment

    def test__get_secret_comment(self):
        field_to_render = renderers.FieldToRender("field_name", True, {"foo": "bar"})
        field_to_render.airbyte_secret = True
        assert field_to_render._get_secret_comment() == "SECRET (please store in environment variables)"
        field_to_render.airbyte_secret = False
        assert field_to_render._get_secret_comment() is None

    def test__get_description_comment(self):
        field_to_render = renderers.FieldToRender("field_name", True, {"foo": "bar"})
        field_to_render.description = "foo"
        assert field_to_render._get_description_comment() == "foo"
        field_to_render.description = None
        assert field_to_render._get_description_comment() is None

    @pytest.mark.parametrize(
        "examples_value,expected_output",
        [
            (["foo", "bar"], "Examples: foo, bar"),
            (["foo"], "Example: foo"),
            ("foo", "Example: foo"),
            ([5432], "Example: 5432"),
            (None, None),
        ],
    )
    def test__get_example_comment(self, examples_value, expected_output):
        field_to_render = renderers.FieldToRender("field_name", True, {"foo": "bar"})
        field_to_render.examples = examples_value
        assert field_to_render._get_example_comment() == expected_output

    @pytest.mark.parametrize(
        "field_metadata,expected_default",
        [
            ({"const": "foo", "default": "bar"}, "foo"),
            ({"default": "bar"}, "bar"),
            ({"airbyte_secret": True}, "${FIELD_NAME}"),
            ({}, None),
        ],
    )
    def test__get_default(self, field_metadata, expected_default):
        field_to_render = renderers.FieldToRender("field_name", True, field_metadata)
        assert field_to_render.default == expected_default

    def test__build_comment(self, mocker):
        comment_functions = [mocker.Mock(return_value="foo"), mocker.Mock(return_value=None), mocker.Mock(return_value="bar")]
        comment = renderers.FieldToRender._build_comment(comment_functions)
        assert comment == "foo | bar"


def test_parse_fields():
    required_fields = ["foo"]
    properties = {"foo": {}, "bar": {}}
    fields_to_render = renderers.parse_fields(required_fields, properties)
    assert fields_to_render[0].name == "foo"
    assert fields_to_render[0].required
    assert fields_to_render[1].name == "bar"
    assert not fields_to_render[1].required


def test_get_object_fields(mocker):
    mocker.patch.object(renderers, "parse_fields")
    field_metadata = {"properties": {"foo": {}, "bar": {}}, "required": ["foo"]}
    object_properties = renderers.get_object_fields(field_metadata)
    assert object_properties == renderers.parse_fields.return_value
    renderers.parse_fields.assert_called_with(["foo"], field_metadata["properties"])
    field_metadata = {}
    assert renderers.get_object_fields(field_metadata) == []


class TestBaseRenderer:
    @pytest.fixture
    def patch_base_class(self, mocker):
        # Mock abstract methods to enable instantiating abstract class
        mocker.patch.object(renderers.BaseRenderer, "__abstractmethods__", set())

    def test_init(self, patch_base_class):
        base = renderers.BaseRenderer("resource_name")
        assert base.resource_name == "resource_name"

    def test_get_output_path(self, patch_base_class, mocker):
        mocker.patch.object(renderers, "os")
        mocker.patch.object(renderers, "slugify")
        renderers.os.path.exists.return_value = False
        spec_renderer = renderers.BaseRenderer("my_resource_name")
        renderers.os.path.join.side_effect = [
            "./my_definition_types/my_resource_name",
            "./my_definition_types/my_resource_name/configuration.yaml",
        ]
        output_path = spec_renderer.get_output_path(".", "my_definition_type", "my_resource_name")
        renderers.os.makedirs.assert_called_once()
        renderers.slugify.assert_called_with("my_resource_name", separator="_")
        renderers.os.path.join.assert_has_calls(
            [
                mocker.call(".", "my_definition_types", renderers.slugify.return_value),
                mocker.call("./my_definition_types/my_resource_name", "configuration.yaml"),
            ]
        )
        assert output_path == Path("./my_definition_types/my_resource_name/configuration.yaml")

    @pytest.mark.parametrize("file_exists, confirmed_overwrite", [(True, True), (False, None), (True, False)])
    def test__confirm_overwrite(self, mocker, file_exists, confirmed_overwrite):
        mock_output_path = mocker.Mock(is_file=mocker.Mock(return_value=file_exists))
        mocker.patch.object(renderers.click, "confirm", mocker.Mock(return_value=confirmed_overwrite))
        overwrite = renderers.BaseRenderer._confirm_overwrite(mock_output_path)
        if file_exists:
            assert overwrite == confirmed_overwrite
        else:
            assert overwrite is True

    @pytest.mark.parametrize("confirmed_overwrite", [True, False])
    def test_import_configuration(self, mocker, patch_base_class, confirmed_overwrite):
        configuration = {"foo": "bar"}
        mocker.patch.object(renderers.BaseRenderer, "_render")
        mocker.patch.object(renderers.BaseRenderer, "get_output_path")
        mocker.patch.object(renderers.yaml, "safe_load", mocker.Mock(return_value={}))
        mocker.patch.object(renderers.yaml, "safe_dump")
        mocker.patch.object(renderers.BaseRenderer, "_confirm_overwrite", mocker.Mock(return_value=confirmed_overwrite))
        spec_renderer = renderers.BaseRenderer("my_resource_name")
        spec_renderer.definition = mocker.Mock(type="my_definition")
        expected_output_path = renderers.BaseRenderer.get_output_path.return_value
        with patch("builtins.open", mock_open()) as mock_file:
            output_path = spec_renderer.import_configuration(project_path=".", configuration=configuration)
        spec_renderer._render.assert_called_once()
        renderers.yaml.safe_load.assert_called_with(spec_renderer._render.return_value)
        assert renderers.yaml.safe_load.return_value["configuration"] == configuration
        spec_renderer.get_output_path.assert_called_with(".", spec_renderer.definition.type, spec_renderer.resource_name)
        spec_renderer._confirm_overwrite.assert_called_with(expected_output_path)
        if confirmed_overwrite:
            mock_file.assert_called_with(expected_output_path, "wb")
            renderers.yaml.safe_dump.assert_called_with(
                renderers.yaml.safe_load.return_value,
                mock_file.return_value,
                default_flow_style=False,
                sort_keys=False,
                allow_unicode=True,
                encoding="utf-8",
            )
        assert output_path == renderers.BaseRenderer.get_output_path.return_value


class TestConnectorSpecificationRenderer:
    def test_init(self, mocker):
        assert renderers.ConnectorSpecificationRenderer.TEMPLATE == renderers.JINJA_ENV.get_template("source_or_destination.yaml.j2")
        definition = mocker.Mock()
        spec_renderer = renderers.ConnectorSpecificationRenderer("my_resource_name", definition)
        assert spec_renderer.resource_name == "my_resource_name"
        assert spec_renderer.definition == definition

    def test__parse_connection_specification(self, mocker):
        mocker.patch.object(renderers, "parse_fields")
        schema = {"required": ["foo"], "properties": {"foo": "bar"}}
        definition = mocker.Mock()
        spec_renderer = renderers.ConnectorSpecificationRenderer("my_resource_name", definition)
        parsed_schema = spec_renderer._parse_connection_specification(schema)
        assert renderers.parse_fields.call_count == 1
        assert parsed_schema[0] == renderers.parse_fields.return_value
        renderers.parse_fields.assert_called_with(["foo"], {"foo": "bar"})

    def test__parse_connection_specification_one_of(self, mocker):
        mocker.patch.object(renderers, "parse_fields")
        schema = {"oneOf": [{"required": ["foo"], "properties": {"foo": "bar"}}, {"required": ["free"], "properties": {"free": "beer"}}]}
        spec_renderer = renderers.ConnectorSpecificationRenderer("my_resource_name", mocker.Mock())
        parsed_schema = spec_renderer._parse_connection_specification(schema)
        assert renderers.parse_fields.call_count == 2
        assert parsed_schema[0] == renderers.parse_fields.return_value
        assert parsed_schema[1] == renderers.parse_fields.return_value
        assert len(parsed_schema) == len(schema["oneOf"])
        renderers.parse_fields.assert_called_with(["free"], {"free": "beer"})

    @pytest.mark.parametrize("overwrite", [True, False])
    def test_write_yaml(self, mocker, overwrite):
        mocker.patch.object(renderers.ConnectorSpecificationRenderer, "get_output_path")
        mocker.patch.object(renderers.ConnectorSpecificationRenderer, "_parse_connection_specification")
        mocker.patch.object(
            renderers.ConnectorSpecificationRenderer, "TEMPLATE", mocker.Mock(render=mocker.Mock(return_value="rendered_string"))
        )
        mocker.patch.object(renderers.ConnectorSpecificationRenderer, "_confirm_overwrite", mocker.Mock(return_value=overwrite))

        spec_renderer = renderers.ConnectorSpecificationRenderer("my_resource_name", mocker.Mock(type="source"))
        if overwrite:
            with patch("builtins.open", mock_open()) as mock_file:
                output_path = spec_renderer.write_yaml(".")
            spec_renderer.TEMPLATE.render.assert_called_with(
                {
                    "resource_name": "my_resource_name",
                    "definition": spec_renderer.definition,
                    "configuration_fields": spec_renderer._parse_connection_specification.return_value,
                }
            )
            mock_file.assert_called_with(output_path, "w")
        else:
            output_path = spec_renderer.write_yaml(".")
        assert output_path == spec_renderer.get_output_path.return_value

    def test__render(self, mocker):
        mocker.patch.object(renderers.ConnectorSpecificationRenderer, "_parse_connection_specification")
        mocker.patch.object(renderers.ConnectorSpecificationRenderer, "TEMPLATE")
        spec_renderer = renderers.ConnectorSpecificationRenderer("my_resource_name", mocker.Mock())
        rendered = spec_renderer._render()
        spec_renderer._parse_connection_specification.assert_called_with(spec_renderer.definition.specification.connection_specification)
        spec_renderer.TEMPLATE.render.assert_called_with(
            {
                "resource_name": spec_renderer.resource_name,
                "definition": spec_renderer.definition,
                "configuration_fields": spec_renderer._parse_connection_specification.return_value,
            }
        )
        assert rendered == spec_renderer.TEMPLATE.render.return_value


class TestConnectionRenderer:
    @pytest.fixture
    def mock_source(self, mocker):
        return mocker.Mock()

    @pytest.fixture
    def mock_destination(self, mocker):
        return mocker.Mock()

    def test_init(self, mock_source, mock_destination):
        assert renderers.ConnectionRenderer.TEMPLATE == renderers.JINJA_ENV.get_template("connection.yaml.j2")
        connection_renderer = renderers.ConnectionRenderer("my_resource_name", mock_source, mock_destination)
        assert connection_renderer.resource_name == "my_resource_name"
        assert connection_renderer.source == mock_source
        assert connection_renderer.destination == mock_destination

    def test_catalog_to_yaml(self, mocker):
        stream = AirbyteStream(
            default_cursor_field=["foo"], json_schema={}, name="my_stream", supported_sync_modes=[SyncMode("full_refresh")]
        )
        config = AirbyteStreamConfiguration(
            alias_name="pokemon", selected=True, destination_sync_mode=DestinationSyncMode("append"), sync_mode=SyncMode("full_refresh")
        )
        catalog = AirbyteCatalog([AirbyteStreamAndConfiguration(stream=stream, config=config)])
        yaml_catalog = renderers.ConnectionRenderer.catalog_to_yaml(catalog)
        assert yaml_catalog == yaml.dump(catalog.to_dict(), Dumper=yaml_dumpers.CatalogDumper, default_flow_style=False)

    @pytest.mark.parametrize("overwrite", [True, False])
    def test_write_yaml(self, mocker, mock_source, mock_destination, overwrite):
        mocker.patch.object(renderers.ConnectionRenderer, "get_output_path")
        mocker.patch.object(renderers.ConnectionRenderer, "catalog_to_yaml")
        mocker.patch.object(renderers.ConnectionRenderer, "TEMPLATE")
        mocker.patch.object(renderers.ConnectionRenderer, "_confirm_overwrite", mocker.Mock(return_value=overwrite))

        connection_renderer = renderers.ConnectionRenderer("my_resource_name", mock_source, mock_destination)
        if overwrite:
            with patch("builtins.open", mock_open()) as mock_file:
                output_path = connection_renderer.write_yaml(".")
            connection_renderer.get_output_path.assert_called_with(".", renderers.ConnectionDefinition.type, "my_resource_name")
            connection_renderer.catalog_to_yaml.assert_called_with(mock_source.catalog)
            mock_file.assert_called_with(output_path, "w")
            mock_file.return_value.write.assert_called_with(connection_renderer.TEMPLATE.render.return_value)
            connection_renderer.TEMPLATE.render.assert_called_with(
                {
                    "connection_name": connection_renderer.resource_name,
                    "source_configuration_path": mock_source.configuration_path,
                    "destination_configuration_path": mock_destination.configuration_path,
                    "catalog": connection_renderer.catalog_to_yaml.return_value,
                    "supports_normalization": connection_renderer.destination.definition.normalization_config.supported,
                    "supports_dbt": connection_renderer.destination.definition.supports_dbt,
                }
            )
        else:
            output_path = connection_renderer.write_yaml(".")
        assert output_path == connection_renderer.get_output_path.return_value

    def test__render(self, mocker):
        mocker.patch.object(renderers.ConnectionRenderer, "catalog_to_yaml")
        mocker.patch.object(renderers.ConnectionRenderer, "TEMPLATE")
        connection_renderer = renderers.ConnectionRenderer("my_connection_name", mocker.Mock(), mocker.Mock())
        rendered = connection_renderer._render()
        connection_renderer.catalog_to_yaml.assert_called_with(connection_renderer.source.catalog)
        connection_renderer.TEMPLATE.render.assert_called_with(
            {
                "connection_name": connection_renderer.resource_name,
                "source_configuration_path": connection_renderer.source.configuration_path,
                "destination_configuration_path": connection_renderer.destination.configuration_path,
                "catalog": connection_renderer.catalog_to_yaml.return_value,
                "supports_normalization": connection_renderer.destination.definition.normalization_config.supported,
                "supports_dbt": connection_renderer.destination.definition.supports_dbt,
            }
        )
        assert rendered == connection_renderer.TEMPLATE.render.return_value

    @pytest.mark.parametrize("confirmed_overwrite, operations", [(True, []), (False, []), (True, [{}]), (False, [{}])])
    def test_import_configuration(self, mocker, confirmed_overwrite, operations):
        configuration = {"foo": "bar", "bar": "foo", "operations": operations}
        mocker.patch.object(renderers.ConnectionRenderer, "KEYS_TO_REMOVE_FROM_REMOTE_CONFIGURATION", ["bar"])
        mocker.patch.object(renderers.ConnectionRenderer, "_render")
        mocker.patch.object(renderers.ConnectionRenderer, "get_output_path")
        mocker.patch.object(renderers.yaml, "safe_load", mocker.Mock(return_value={}))
        mocker.patch.object(renderers.yaml, "safe_dump")
        mocker.patch.object(renderers.ConnectionRenderer, "_confirm_overwrite", mocker.Mock(return_value=confirmed_overwrite))
        spec_renderer = renderers.ConnectionRenderer("my_resource_name", mocker.Mock(), mocker.Mock())
        expected_output_path = renderers.ConnectionRenderer.get_output_path.return_value
        with patch("builtins.open", mock_open()) as mock_file:
            output_path = spec_renderer.import_configuration(project_path=".", configuration=configuration)
        spec_renderer._render.assert_called_once()
        renderers.yaml.safe_load.assert_called_with(spec_renderer._render.return_value)
        if operations:
            assert renderers.yaml.safe_load.return_value["configuration"] == {"foo": "bar", "operations": operations}
        else:
            assert renderers.yaml.safe_load.return_value["configuration"] == {"foo": "bar"}
        spec_renderer.get_output_path.assert_called_with(".", spec_renderer.definition.type, spec_renderer.resource_name)
        spec_renderer._confirm_overwrite.assert_called_with(expected_output_path)
        if confirmed_overwrite:
            mock_file.assert_called_with(expected_output_path, "wb")
            renderers.yaml.safe_dump.assert_called_with(
                renderers.yaml.safe_load.return_value,
                mock_file.return_value,
                default_flow_style=False,
                sort_keys=False,
                allow_unicode=True,
                encoding="utf-8",
            )
        assert output_path == renderers.ConnectionRenderer.get_output_path.return_value
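Two small helpers carry most of the rendering logic pinned down above: `_build_comment` joins the non-empty comment fragments, and `_confirm_overwrite` only prompts when a file would be clobbered. A hedged sketch of both, written as a standalone fragment rather than the removed `BaseRenderer` itself:

```python
import click


class BaseRendererHelpers:
    """Hypothetical fragment covering test__build_comment and test__confirm_overwrite."""

    @staticmethod
    def _build_comment(comment_functions) -> str:
        # Call each comment builder and join the non-None results with " | ".
        comments = [build() for build in comment_functions]
        return " | ".join(comment for comment in comments if comment is not None)

    @staticmethod
    def _confirm_overwrite(output_path) -> bool:
        # Only ask for confirmation when writing would replace an existing file.
        if output_path.is_file():
            return click.confirm(f"The file {output_path} already exists, do you want to replace it?")
        return True
```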
@@ -1,3 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#
@@ -1,102 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import pytest
from click.testing import CliRunner
from octavia_cli.get import commands


def test_commands_in_get_group():
    get_commands = commands.get.commands.values()
    for command in commands.AVAILABLE_COMMANDS:
        assert command in get_commands


@pytest.fixture
def context_object(mock_api_client, mock_telemetry_client):
    return {
        "API_CLIENT": mock_api_client,
        "WORKSPACE_ID": "my_workspace_id",
        "resource_id": "my_resource_id",
        "TELEMETRY_CLIENT": mock_telemetry_client,
    }


def test_available_commands():
    assert commands.AVAILABLE_COMMANDS == [commands.source, commands.destination, commands.connection]


def test_build_help_message():
    assert commands.build_help_message("fake_resource_type") == "Get a JSON representation of a remote fake_resource_type."


def test_get_resource_id_or_name():
    resource_id, resource_name = commands.get_resource_id_or_name("resource_name")
    assert resource_id is None and resource_name == "resource_name"
    resource_id, resource_name = commands.get_resource_id_or_name("8c2e8369-3b81-471a-9945-32a3c67c31b7")
    assert resource_id == "8c2e8369-3b81-471a-9945-32a3c67c31b7" and resource_name is None


def test_get_json_representation(mocker, context_object):
    mock_cls = mocker.Mock()
    mocker.patch.object(commands.click, "echo")
    mock_resource_id = mocker.Mock()
    mock_resource_name = mocker.Mock()
    mocker.patch.object(commands, "get_resource_id_or_name", mocker.Mock(return_value=(mock_resource_id, mock_resource_name)))
    json_repr = commands.get_json_representation(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], mock_cls, "resource_to_get")
    commands.get_resource_id_or_name.assert_called_with("resource_to_get")
    mock_cls.assert_called_with(
        context_object["API_CLIENT"], context_object["WORKSPACE_ID"], resource_id=mock_resource_id, resource_name=mock_resource_name
    )
    assert json_repr == mock_cls.return_value.to_json.return_value


@pytest.mark.parametrize(
    "command, resource_cls, resource",
    [
        (commands.source, commands.Source, "my_resource_id"),
        (commands.destination, commands.Destination, "my_resource_id"),
        (commands.connection, commands.Connection, "my_resource_id"),
    ],
)
def test_commands(context_object, mocker, command, resource_cls, resource):
    mocker.patch.object(commands, "get_json_representation", mocker.Mock(return_value='{"foo": "bar"}'))
    runner = CliRunner()
    result = runner.invoke(command, [resource], obj=context_object)
    commands.get_json_representation.assert_called_once_with(
        context_object["API_CLIENT"], context_object["WORKSPACE_ID"], resource_cls, resource
    )
    assert result.exit_code == 0


# @pytest.mark.parametrize(
#     "command,resource_id",
#     [
#         (commands.destination, "my_resource_id"),
#     ],
# )
# def test_destination(mocker, context_object, command, resource_id):
#     runner = CliRunner()
#     mocker.patch.object(commands, "Destination", mocker.Mock())
#     mock_renderer = commands.Destination.return_value
#     mock_renderer.get_remote_resource.return_value = '{"hello": "world"}'
#     result = runner.invoke(command, [resource_id], obj=context_object)
#     assert result.exit_code == 0
#     commands.Destination.assert_called_with(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], resource_id)


# @pytest.mark.parametrize(
#     "command,resource_id",
#     [
#         (commands.connection, "my_resource_id"),
#     ],
# )
# def test_connection(mocker, context_object, command, resource_id):
#     runner = CliRunner()
#     mocker.patch.object(commands, "Connection", mocker.Mock())
#     mock_renderer = commands.Connection.return_value
#     mock_renderer.get_remote_resource.return_value = '{"hello": "world"}'
#     result = runner.invoke(command, [resource_id], obj=context_object)
#     assert result.exit_code == 0
#     commands.Connection.assert_called_with(context_object["API_CLIENT"], context_object["WORKSPACE_ID"], resource_id)
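`test_get_resource_id_or_name` implies a simple UUID sniff on the positional argument: anything that parses as a UUID is treated as an id, everything else as a name. A sketch that satisfies the test:

```python
import uuid
from typing import Optional, Tuple


def get_resource_id_or_name(resource_to_get: str) -> Tuple[Optional[str], Optional[str]]:
    # Returns (resource_id, resource_name); exactly one side is populated.
    try:
        uuid.UUID(resource_to_get)
        return resource_to_get, None
    except ValueError:
        return None, resource_to_get
```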
@@ -1,137 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import pytest
from airbyte_api_client.api import destination_api, source_api, web_backend_api
from airbyte_api_client.model.destination_id_request_body import DestinationIdRequestBody
from airbyte_api_client.model.source_id_request_body import SourceIdRequestBody
from airbyte_api_client.model.web_backend_connection_request_body import WebBackendConnectionRequestBody
from octavia_cli.get.resources import BaseResource, Connection, Destination, DuplicateResourceError, ResourceNotFoundError, Source


class TestBaseResource:
    @pytest.fixture
    def patch_base_class(self, mocker):
        # Mock abstract methods to enable instantiating abstract class
        mocker.patch.object(BaseResource, "__abstractmethods__", set())
        mocker.patch.object(BaseResource, "api", mocker.Mock())
        mocker.patch.object(BaseResource, "get_function_name", "get_function_name")
        mocker.patch.object(BaseResource, "get_payload", "get_payload")
        mocker.patch.object(BaseResource, "list_for_workspace_function_name", "list_for_workspace_function_name")
        mocker.patch.object(BaseResource, "name", "fake_resource")

    @pytest.mark.parametrize(
        "resource_id, resource_name, expected_error, expected_error_message",
        [
            ("my_resource_id", None, None, None),
            (None, "my_resource_name", None, None),
            (None, None, ValueError, "resource_id and resource_name keyword arguments can't be both None."),
            ("my_resource_id", "my_resource_name", ValueError, "resource_id and resource_name keyword arguments can't be both set."),
        ],
    )
    def test_init(self, patch_base_class, mock_api_client, resource_id, resource_name, expected_error, expected_error_message):
        if expected_error:
            with pytest.raises(expected_error, match=expected_error_message):
                base_resource = BaseResource(mock_api_client, "workspace_id", resource_id=resource_id, resource_name=resource_name)
        else:
            base_resource = BaseResource(mock_api_client, "workspace_id", resource_id=resource_id, resource_name=resource_name)
            base_resource.api.assert_called_with(mock_api_client)
            assert base_resource.api_instance == base_resource.api.return_value
            assert base_resource.workspace_id == "workspace_id"
            assert base_resource._get_fn == getattr(base_resource.api, base_resource.get_function_name)
            assert base_resource._list_for_workspace_fn == getattr(base_resource.api, base_resource.list_for_workspace_function_name)
            assert base_resource.resource_id == resource_id
            assert base_resource.resource_name == resource_name

    @pytest.mark.parametrize(
        "resource_name, api_response_resources_names, expected_error, expected_error_message",
        [
            ("foo", ["foo", "bar"], None, None),
            ("foo", ["bar", "fooo"], ResourceNotFoundError, "The fake_resource foo was not found in your current Airbyte workspace."),
            (
                "foo",
                ["foo", "foo"],
                DuplicateResourceError,
                "2 fake_resources with the name foo were found in your current Airbyte workspace.",
            ),
        ],
    )
    def test__find_by_resource_name(
        self, mocker, patch_base_class, mock_api_client, resource_name, api_response_resources_names, expected_error, expected_error_message
    ):
        mock_api_response_records = []
        for fake_resource_name in api_response_resources_names:
            mock_api_response_record = mocker.Mock()  # We can't set the mock name on creation as it's a reserved attribute
            mock_api_response_record.name = fake_resource_name
            mock_api_response_records.append(mock_api_response_record)

        mocker.patch.object(
            BaseResource, "_list_for_workspace_fn", mocker.Mock(return_value=mocker.Mock(fake_resources=mock_api_response_records))
        )
        base_resource = BaseResource(mock_api_client, "workspace_id", resource_id=None, resource_name=resource_name)
        if not expected_error:
            found_resource = base_resource._find_by_resource_name()
            assert found_resource.name == resource_name
        if expected_error:
            with pytest.raises(expected_error, match=expected_error_message):
                base_resource._find_by_resource_name()

    def test__find_by_id(self, mocker, patch_base_class, mock_api_client):
        mocker.patch.object(BaseResource, "_get_fn")
        base_resource = BaseResource(mock_api_client, "workspace_id", resource_id="my_resource_id")
        base_resource._find_by_resource_id()
        base_resource._get_fn.assert_called_with(base_resource.api_instance, base_resource.get_payload)

    @pytest.mark.parametrize("resource_id, resource_name", [("my_resource_id", None), (None, "my_resource_name")])
    def test_get_remote_resource(self, mocker, patch_base_class, mock_api_client, resource_id, resource_name):
        mocker.patch.object(BaseResource, "_find_by_resource_id")
        mocker.patch.object(BaseResource, "_find_by_resource_name")
        base_resource = BaseResource(mock_api_client, "workspace_id", resource_id=resource_id, resource_name=resource_name)
        remote_resource = base_resource.get_remote_resource()
        if resource_id is not None:
            base_resource._find_by_resource_id.assert_called_once()
            base_resource._find_by_resource_name.assert_not_called()
            assert remote_resource == base_resource._find_by_resource_id.return_value
        if resource_name is not None:
            base_resource._find_by_resource_id.assert_not_called()
            base_resource._find_by_resource_name.assert_called_once()
            assert remote_resource == base_resource._find_by_resource_name.return_value

    def test_to_json(self, mocker, patch_base_class, mock_api_client):
        mocker.patch.object(
            BaseResource, "get_remote_resource", mocker.Mock(return_value=mocker.Mock(to_dict=mocker.Mock(return_value={"foo": "bar"})))
        )
        base_resource = BaseResource(mock_api_client, "workspace_id", resource_id="my_resource_id")
        json_repr = base_resource.to_json()
        assert json_repr == '{"foo": "bar"}'


class TestSource:
    def test_init(self, mock_api_client):
        assert Source.__base__ == BaseResource
        source = Source(mock_api_client, "workspace_id", "resource_id")
        assert source.api == source_api.SourceApi
        assert source.get_function_name == "get_source"
        assert source.list_for_workspace_function_name == "list_sources_for_workspace"
        assert source.get_payload == SourceIdRequestBody("resource_id")


class TestDestination:
    def test_init(self, mock_api_client):
        assert Destination.__base__ == BaseResource
        destination = Destination(mock_api_client, "workspace_id", "resource_id")
        assert destination.api == destination_api.DestinationApi
        assert destination.get_function_name == "get_destination"
        assert destination.list_for_workspace_function_name == "list_destinations_for_workspace"
        assert destination.get_payload == DestinationIdRequestBody("resource_id")


class TestConnection:
    def test_init(self, mock_api_client):
        assert Connection.__base__ == BaseResource
        connection = Connection(mock_api_client, "workspace_id", "resource_id")
        assert connection.api == web_backend_api.WebBackendApi
        assert connection.get_function_name == "web_backend_get_connection"
        assert connection.list_for_workspace_function_name == "web_backend_list_connections_for_workspace"
        assert connection.get_payload == WebBackendConnectionRequestBody(with_refreshed_catalog=False, connection_id=connection.resource_id)
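This hunk only removes the tests; the `octavia_cli.get.resources` module they exercise is deleted elsewhere in the commit. From the assertions above, the removed `BaseResource` presumably looked roughly like the sketch below. The validation rules, lookup dispatch, and error messages come from the tests; the method bodies themselves are assumptions:

```python
import abc
import json


class ResourceNotFoundError(Exception):
    pass


class DuplicateResourceError(Exception):
    pass


class BaseResource(abc.ABC):
    # Subclasses declare: api, name, get_function_name, get_payload, list_for_workspace_function_name.

    def __init__(self, api_client, workspace_id, resource_id=None, resource_name=None):
        if resource_id is None and resource_name is None:
            raise ValueError("resource_id and resource_name keyword arguments can't be both None.")
        if resource_id is not None and resource_name is not None:
            raise ValueError("resource_id and resource_name keyword arguments can't be both set.")
        self.workspace_id = workspace_id
        self.resource_id = resource_id
        self.resource_name = resource_name
        self.api_instance = self.api(api_client)
        self._get_fn = getattr(self.api, self.get_function_name)
        self._list_for_workspace_fn = getattr(self.api, self.list_for_workspace_function_name)

    def _find_by_resource_id(self):
        return self._get_fn(self.api_instance, self.get_payload)

    def _find_by_resource_name(self):
        # List every resource of this type in the workspace and match on name.
        resources = getattr(self._list_for_workspace_fn(self.api_instance), f"{self.name}s")
        matches = [r for r in resources if r.name == self.resource_name]
        if not matches:
            raise ResourceNotFoundError(f"The {self.name} {self.resource_name} was not found in your current Airbyte workspace.")
        if len(matches) > 1:
            raise DuplicateResourceError(
                f"{len(matches)} {self.name}s with the name {self.resource_name} were found in your current Airbyte workspace."
            )
        return matches[0]

    def get_remote_resource(self):
        # Prefer the direct id lookup, fall back to a name search in the workspace.
        return self._find_by_resource_id() if self.resource_id is not None else self._find_by_resource_name()

    def to_json(self):
        return json.dumps(self.get_remote_resource().to_dict())
```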
@@ -1,3 +0,0 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#
@@ -1,90 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

from unittest.mock import mock_open, patch

import pytest
from click.testing import CliRunner
from octavia_cli.init import commands
from octavia_cli.init.commands import create_api_headers_configuration_file


def test_directories_to_create():
    assert commands.DIRECTORIES_TO_CREATE == {"connections", "destinations", "sources"}


@pytest.fixture
def context_object(mock_telemetry_client):
    return {"TELEMETRY_CLIENT": mock_telemetry_client}


@pytest.mark.parametrize(
    "directories_to_create,mkdir_side_effects,expected_created_directories,expected_not_created_directories",
    [
        (["dir_a", "dir_b"], None, ["dir_a", "dir_b"], []),
        (["dir_a", "dir_b"], FileExistsError(), [], ["dir_a", "dir_b"]),
        (["dir_a", "dir_b"], [None, FileExistsError()], ["dir_a"], ["dir_b"]),
    ],
)
def test_create_directories(
    mocker, directories_to_create, mkdir_side_effects, expected_created_directories, expected_not_created_directories
):
    mocker.patch.object(commands, "os", mocker.Mock(mkdir=mocker.Mock(side_effect=mkdir_side_effects)))
    created_directories, not_created_directories = commands.create_directories(directories_to_create)
    assert created_directories == expected_created_directories
    assert not_created_directories == expected_not_created_directories
    commands.os.mkdir.assert_has_calls([mocker.call(d) for d in directories_to_create])


def test_init(mocker, context_object):
    runner = CliRunner()
    mocker.patch.object(commands, "create_directories", mocker.Mock(return_value=(["dir_a", "dir_b"], [])))
    mocker.patch.object(commands, "create_api_headers_configuration_file", mocker.Mock(return_value=True))
    result = runner.invoke(commands.init, obj=context_object)
    assert result.exit_code == 0
    assert (
        result.output
        == "🔨 - Initializing the project.\n✅ - Created the following directories: dir_a, dir_b.\n"
        + f"✅ - Created API HTTP headers file in {commands.API_HTTP_HEADERS_TARGET_PATH}\n"
    )


def test_init_some_existing_directories(mocker, context_object):
    runner = CliRunner()
    mocker.patch.object(commands, "create_directories", mocker.Mock(return_value=(["dir_a"], ["dir_b"])))
    mocker.patch.object(commands, "create_api_headers_configuration_file", mocker.Mock(return_value=False))
    result = runner.invoke(commands.init, obj=context_object)
    assert result.exit_code == 0
    assert "Already existing directories: dir_b.\n" in result.output


def test_init_all_existing_directories(mocker, context_object):
    runner = CliRunner()
    mocker.patch.object(commands, "create_directories", mocker.Mock(return_value=([], ["dir_a", "dir_b"])))
    mocker.patch.object(commands, "create_api_headers_configuration_file", mocker.Mock(return_value=False))
    result = runner.invoke(commands.init, obj=context_object)
    assert result.exit_code == 0
    assert "Already existing directories: dir_a, dir_b.\n" in result.output


def test_init_when_api_headers_configuration_file_exists(mocker, context_object):
    runner = CliRunner()
    mocker.patch.object(commands, "create_directories", mocker.Mock(return_value=([], ["dir_a", "dir_b"])))
    mocker.patch.object(commands, "create_api_headers_configuration_file", mocker.Mock(return_value=False))
    result = runner.invoke(commands.init, obj=context_object)
    assert result.exit_code == 0
    assert "API HTTP headers file already exists, skipping." in result.output


@pytest.mark.parametrize("api_http_headers_file_exist", [False, True])
def test_create_init_configuration(mocker, api_http_headers_file_exist):
    mock_path = mocker.Mock(is_file=mocker.Mock(return_value=api_http_headers_file_exist))
    mocker.patch.object(commands, "API_HTTP_HEADERS_TARGET_PATH", mock_path)
    if not api_http_headers_file_exist:
        with patch("builtins.open", mock_open()) as mock_file:
            assert create_api_headers_configuration_file()
            mock_file.assert_called_with(commands.API_HTTP_HEADERS_TARGET_PATH, "w")
            mock_file.return_value.write.assert_called_with(commands.DEFAULT_API_HEADERS_FILE_CONTENT)
    else:
        assert not create_api_headers_configuration_file()
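The two helpers these tests exercise live in the removed `octavia_cli.init.commands` module, so their implementation is not shown in this diff, but their control flow is pinned down by the assertions above. A minimal sketch, with assumed placeholder values for the module constants `API_HTTP_HEADERS_TARGET_PATH` and `DEFAULT_API_HEADERS_FILE_CONTENT`:

```python
import os
from pathlib import Path
from typing import List, Tuple

# Placeholder values: the real constants live in the removed octavia_cli.init.commands module.
API_HTTP_HEADERS_TARGET_PATH = Path("api_http_headers.yaml")
DEFAULT_API_HEADERS_FILE_CONTENT = "headers: {}"


def create_directories(directories_to_create: List[str]) -> Tuple[List[str], List[str]]:
    """Create each directory, tolerating the ones that already exist."""
    created, not_created = [], []
    for directory in directories_to_create:
        try:
            os.mkdir(directory)
            created.append(directory)
        except FileExistsError:
            not_created.append(directory)
    return created, not_created


def create_api_headers_configuration_file() -> bool:
    """Write the default API headers file; return False if it already exists."""
    if API_HTTP_HEADERS_TARGET_PATH.is_file():
        return False
    with open(API_HTTP_HEADERS_TARGET_PATH, "w") as f:
        f.write(DEFAULT_API_HEADERS_FILE_CONTENT)
    return True
```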
@@ -1,3 +0,0 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#
@@ -1,62 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import pytest
from click.testing import CliRunner
from octavia_cli.list import commands


@pytest.fixture
def context_object(mock_api_client, mock_telemetry_client):
    return {"API_CLIENT": mock_api_client, "WORKSPACE_ID": "my_workspace_id", "TELEMETRY_CLIENT": mock_telemetry_client}


def test_available_commands():
    assert commands.AVAILABLE_COMMANDS == [commands.connectors, commands.workspace]


def test_commands_in_list_group():
    list_commands = commands._list.commands.values()
    for command in commands.AVAILABLE_COMMANDS:
        assert command in list_commands


def test_connectors_sources(mocker, context_object):
    mocker.patch.object(commands, "SourceConnectorsDefinitions", mocker.Mock(return_value="SourceConnectorsDefinitionsRepr"))
    runner = CliRunner()
    result = runner.invoke(commands.sources_connectors, obj=context_object)
    commands.SourceConnectorsDefinitions.assert_called_with(context_object["API_CLIENT"])
    assert result.output == "SourceConnectorsDefinitionsRepr\n"


def test_connectors_destinations(mocker, context_object):
    mocker.patch.object(commands, "DestinationConnectorsDefinitions", mocker.Mock(return_value="DestinationConnectorsDefinitionsRepr"))
    runner = CliRunner()
    result = runner.invoke(commands.destinations_connectors, obj=context_object)
    commands.DestinationConnectorsDefinitions.assert_called_with(context_object["API_CLIENT"])
    assert result.output == "DestinationConnectorsDefinitionsRepr\n"


def test_sources(mocker, context_object):
    mocker.patch.object(commands, "Sources", mocker.Mock(return_value="SourcesRepr"))
    runner = CliRunner()
    result = runner.invoke(commands.sources, obj=context_object)
    commands.Sources.assert_called_with(context_object["API_CLIENT"], context_object["WORKSPACE_ID"])
    assert result.output == "SourcesRepr\n"


def test_destinations(mocker, context_object):
    mocker.patch.object(commands, "Destinations", mocker.Mock(return_value="DestinationsRepr"))
    runner = CliRunner()
    result = runner.invoke(commands.destinations, obj=context_object)
    commands.Destinations.assert_called_with(context_object["API_CLIENT"], context_object["WORKSPACE_ID"])
    assert result.output == "DestinationsRepr\n"


def test_connections(mocker, context_object):
    mocker.patch.object(commands, "Connections", mocker.Mock(return_value="ConnectionsRepr"))
    runner = CliRunner()
    result = runner.invoke(commands.connections, obj=context_object)
    commands.Connections.assert_called_with(context_object["API_CLIENT"], context_object["WORKSPACE_ID"])
    assert result.output == "ConnectionsRepr\n"
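Each of these command tests pins down the same pattern: the subcommand instantiates a listing class with the API client (plus the workspace id where relevant) and echoes its repr. A hedged sketch of that pattern, using a stand-in listing class since the removed module is shown only in a later hunk; the click wiring is an assumption:

```python
import click


class Sources:
    """Stand-in for the removed octavia_cli.list.listings.Sources (see the listings hunk below)."""

    def __init__(self, api_client, workspace_id):
        self.api_client, self.workspace_id = api_client, workspace_id

    def __repr__(self):
        return "SourcesRepr"


@click.command()
@click.pass_context
def sources(ctx: click.Context) -> None:
    """List the sources of the current workspace by echoing the listing's repr."""
    click.echo(Sources(ctx.obj["API_CLIENT"], ctx.obj["WORKSPACE_ID"]))
```

Invoked through `CliRunner().invoke(sources, obj=context_object)`, this prints `SourcesRepr\n`, which is exactly what the tests assert.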
@@ -1,45 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import pytest
from octavia_cli.list import formatting

PADDING = 2


@pytest.mark.parametrize(
    "test_data,expected_columns_width",
    [
        ([["a", "___10chars"], ["e", "f"]], [1 + PADDING, 10 + PADDING]),
        ([["a", "___10chars"], ["e", "____11chars"]], [1 + PADDING, 11 + PADDING]),
        ([[""]], [PADDING]),
    ],
)
def test_compute_columns_width(test_data, expected_columns_width):
    columns_width = formatting.compute_columns_width(test_data, PADDING)
    assert columns_width == expected_columns_width


@pytest.mark.parametrize("input_camelcased,expected_output", [("camelCased", "CAMEL CASED"), ("notcamelcased", "NOTCAMELCASED")])
def test_camelcased_to_uppercased_spaced(input_camelcased, expected_output):
    assert formatting.camelcased_to_uppercased_spaced(input_camelcased) == expected_output


@pytest.mark.parametrize(
    "test_data,columns_width,expected_output",
    [
        ([["a", "___10chars"], ["e", "____11chars"]], [1 + PADDING, 11 + PADDING], "a ___10chars \ne ____11chars "),
    ],
)
def test_display_as_table(mocker, test_data, columns_width, expected_output):
    mocker.patch.object(formatting, "compute_columns_width", mocker.Mock(return_value=columns_width))
    assert formatting.display_as_table(test_data) == expected_output


def test_format_column_names():
    columns_to_format = ["camelCased"]
    formatted_columns = formatting.format_column_names(columns_to_format)
    assert len(formatted_columns) == 1
    for i, c in enumerate(formatted_columns):
        assert c == formatting.camelcased_to_uppercased_spaced(columns_to_format[i])
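The `formatting` module itself is removed elsewhere in the commit; from the expected values in these tests its helpers presumably looked roughly like this (a sketch, with inferred function bodies):

```python
import re
from typing import List


def compute_columns_width(data: List[List[str]], padding: int = 2) -> List[int]:
    """Width of each column: the longest cell in that column plus the padding."""
    return [max(len(row[i]) for row in data) + padding for i in range(len(data[0]))]


def camelcased_to_uppercased_spaced(camelcased: str) -> str:
    """'camelCased' -> 'CAMEL CASED'; strings without inner capitals are only upper-cased."""
    return re.sub(r"(?<!^)(?=[A-Z])", " ", camelcased).upper()


def format_column_names(column_names: List[str]) -> List[str]:
    return [camelcased_to_uppercased_spaced(name) for name in column_names]


def display_as_table(data: List[List[str]], padding: int = 2) -> str:
    """Left-justify every cell to its column width and join rows with newlines."""
    widths = compute_columns_width(data, padding)
    return "\n".join("".join(cell.ljust(width) for cell, width in zip(row, widths)) for row in data)
```

With these definitions, `display_as_table([["NAME", "SOURCE ID"], ["my_source", "123"]])` renders two left-justified columns, matching the padding behavior the tests assert.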
@@ -1,154 +0,0 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import pytest
from airbyte_api_client.api import connection_api, destination_api, destination_definition_api, source_api, source_definition_api
from octavia_cli.list import listings
from octavia_cli.list.listings import (
    BaseListing,
    Connections,
    DestinationConnectorsDefinitions,
    Destinations,
    SourceConnectorsDefinitions,
    Sources,
    WorkspaceListing,
)


class TestBaseListing:
    @pytest.fixture
    def patch_base_class(self, mocker):
        # Mock abstract methods to enable instantiating abstract class
        mocker.patch.object(BaseListing, "__abstractmethods__", set())
        mocker.patch.object(BaseListing, "list_function_name", "my_list_function_name")
        mocker.patch.object(BaseListing, "api", mocker.Mock(my_list_function_name=mocker.Mock()))

    def test_init(self, patch_base_class, mock_api_client):
        base_listing = BaseListing(mock_api_client)
        assert base_listing._list_fn == BaseListing.api.my_list_function_name
        assert base_listing.list_function_kwargs == {}
        assert base_listing.api_instance == base_listing.api.return_value
        base_listing.api.assert_called_with(mock_api_client)
        assert base_listing.COMMON_LIST_FUNCTION_KWARGS == {"_check_return_type": False}

    def test_abstract_methods(self, mock_api_client):
        assert BaseListing.__abstractmethods__ == {"api", "fields_to_display", "list_field_in_response", "list_function_name"}
        with pytest.raises(TypeError):
            BaseListing(mock_api_client)

    def test_parse_response(self, patch_base_class, mocker, mock_api_client):
        mocker.patch.object(BaseListing, "fields_to_display", ["fieldA", "fieldB"])
        base_listing = BaseListing(mock_api_client)
        api_response = {base_listing.list_field_in_response: []}
        for i in range(5):
            definition = {field: f"{field}_value_{i}" for field in base_listing.fields_to_display}
            definition["discarded_field"] = "discarded_value"
            api_response[base_listing.list_field_in_response].append(definition)
        parsed_listing = base_listing._parse_response(api_response)
        assert len(parsed_listing) == 5
        for i in range(5):
            assert parsed_listing[i] == [f"{field}_value_{i}" for field in base_listing.fields_to_display]
            assert "discarded_value" not in parsed_listing[i]

    def test_get_listing(self, patch_base_class, mocker, mock_api_client):
        mocker.patch.object(BaseListing, "_parse_response")
        mocker.patch.object(BaseListing, "_list_fn")
        base_listing = BaseListing(mock_api_client)
        listing = base_listing.get_listing()
        base_listing._list_fn.assert_called_with(
            base_listing.api_instance, **base_listing.list_function_kwargs, **base_listing.COMMON_LIST_FUNCTION_KWARGS
        )
        base_listing._parse_response.assert_called_with(base_listing._list_fn.return_value)
        assert listing == base_listing._parse_response.return_value

    def test_repr(self, patch_base_class, mocker, mock_api_client):
        headers = ["fieldA", "fieldB", "fieldC"]
        api_response_listing = [["a", "b", "c"]]
        mocker.patch.object(BaseListing, "fields_to_display", headers)
        mocker.patch.object(BaseListing, "get_listing", mocker.Mock(return_value=api_response_listing))
        mocker.patch.object(listings, "formatting")
        base_listing = BaseListing(mock_api_client)
        representation = base_listing.__repr__()
        listings.formatting.display_as_table.assert_called_with(
            [listings.formatting.format_column_names.return_value] + api_response_listing
        )
        assert representation == listings.formatting.display_as_table.return_value


class TestSourceConnectorsDefinitions:
    def test_init(self, mock_api_client):
        assert SourceConnectorsDefinitions.__base__ == BaseListing
        source_connectors_definition = SourceConnectorsDefinitions(mock_api_client)
        assert source_connectors_definition.api == source_definition_api.SourceDefinitionApi
        assert source_connectors_definition.fields_to_display == ["name", "dockerRepository", "dockerImageTag", "sourceDefinitionId"]
        assert source_connectors_definition.list_field_in_response == "source_definitions"
        assert source_connectors_definition.list_function_name == "list_source_definitions"


class TestDestinationConnectorsDefinitions:
    def test_init(self, mock_api_client):
        assert DestinationConnectorsDefinitions.__base__ == BaseListing
        destination_connectors_definition = DestinationConnectorsDefinitions(mock_api_client)
        assert destination_connectors_definition.api == destination_definition_api.DestinationDefinitionApi
        assert destination_connectors_definition.fields_to_display == [
            "name",
            "dockerRepository",
            "dockerImageTag",
            "destinationDefinitionId",
        ]
        assert destination_connectors_definition.list_field_in_response == "destination_definitions"
        assert destination_connectors_definition.list_function_name == "list_destination_definitions"


class TestWorkspaceListing:
    @pytest.fixture
    def patch_base_class(self, mocker):
        # Mock abstract methods to enable instantiating abstract class
        mocker.patch.object(WorkspaceListing, "__abstractmethods__", set())
        mocker.patch.object(WorkspaceListing, "api", mocker.Mock())

    def test_init(self, patch_base_class, mocker, mock_api_client):
        mocker.patch.object(listings, "WorkspaceIdRequestBody")
        mocker.patch.object(BaseListing, "__init__")
        assert WorkspaceListing.__base__ == BaseListing
        sources_and_destinations = WorkspaceListing(mock_api_client, "my_workspace_id")

        assert sources_and_destinations.workspace_id == "my_workspace_id"
        assert sources_and_destinations.list_function_kwargs == {"workspace_id_request_body": listings.WorkspaceIdRequestBody.return_value}
        listings.WorkspaceIdRequestBody.assert_called_with(workspace_id="my_workspace_id")
        BaseListing.__init__.assert_called_with(mock_api_client)

    def test_abstract(self, mock_api_client):
        with pytest.raises(TypeError):
            WorkspaceListing(mock_api_client)


class TestSources:
    def test_init(self, mock_api_client):
        assert Sources.__base__ == WorkspaceListing
        sources = Sources(mock_api_client, "my_workspace_id")
        assert sources.api == source_api.SourceApi
        assert sources.fields_to_display == ["name", "sourceName", "sourceId"]
        assert sources.list_field_in_response == "sources"
        assert sources.list_function_name == "list_sources_for_workspace"


class TestDestinations:
    def test_init(self, mock_api_client):
        assert Destinations.__base__ == WorkspaceListing
        destinations = Destinations(mock_api_client, "my_workspace_id")
        assert destinations.api == destination_api.DestinationApi
        assert destinations.fields_to_display == ["name", "destinationName", "destinationId"]
        assert destinations.list_field_in_response == "destinations"
        assert destinations.list_function_name == "list_destinations_for_workspace"


class TestConnections:
    def test_init(self, mock_api_client):
        assert Connections.__base__ == WorkspaceListing
        connections = Connections(mock_api_client, "my_workspace_id")
        assert connections.api == connection_api.ConnectionApi
        assert connections.fields_to_display == ["name", "connectionId", "status", "sourceId", "destinationId"]
        assert connections.list_field_in_response == "connections"
        assert connections.list_function_name == "list_connections_for_workspace"
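The contract these tests assert suggests the removed `BaseListing` looked roughly like the sketch below. The abstract attribute names and `COMMON_LIST_FUNCTION_KWARGS` come straight from the assertions; the method bodies are assumptions:

```python
import abc


class BaseListing(abc.ABC):
    COMMON_LIST_FUNCTION_KWARGS = {"_check_return_type": False}

    @property
    @abc.abstractmethod
    def api(self):
        ...

    @property
    @abc.abstractmethod
    def fields_to_display(self):
        ...

    @property
    @abc.abstractmethod
    def list_field_in_response(self):
        ...

    @property
    @abc.abstractmethod
    def list_function_name(self):
        ...

    def __init__(self, api_client):
        self.api_instance = self.api(api_client)
        self.list_function_kwargs = {}
        self._list_fn = getattr(self.api, self.list_function_name)

    def _parse_response(self, api_response):
        # Keep only the declared columns, in declaration order, for each returned record.
        return [[record[field] for field in self.fields_to_display] for record in api_response[self.list_field_in_response]]

    def get_listing(self):
        api_response = self._list_fn(self.api_instance, **self.list_function_kwargs, **self.COMMON_LIST_FUNCTION_KWARGS)
        return self._parse_response(api_response)

    def __repr__(self):
        # The real implementation delegated to octavia_cli.list.formatting
        # (format_column_names + display_as_table); a plain join keeps this sketch self-contained.
        rows = [list(self.fields_to_display)] + self.get_listing()
        return "\n".join("  ".join(str(cell) for cell in row) for row in rows)
```

Subclasses such as `Sources` or `Connections` then only declare the API class, the list function name, the response field, and the columns, which is exactly what the `test_init` cases above verify.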
Some files were not shown because too many files have changed in this diff.