
Migrate connectors to use our python base image (Round 2) (#31599)

This commit is contained in:
Augustin
2023-10-19 17:58:45 +02:00
committed by GitHub
parent 3874698ea0
commit a41c4f5b3d
95 changed files with 1778 additions and 1156 deletions

@@ -1,17 +0,0 @@
FROM python:3.9-slim
# Bash is installed for more convenient debugging.
RUN apt-get update && apt-get install -y bash && rm -rf /var/lib/apt/lists/*
WORKDIR /airbyte/integration_code
COPY source_amazon_ads ./source_amazon_ads
COPY main.py ./
COPY setup.py ./
RUN pip install .
ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]
LABEL io.airbyte.version=3.4.0
LABEL io.airbyte.name=airbyte/source-amazon-ads

@@ -56,19 +56,70 @@ python main.py read --config secrets/config.json --catalog integration_tests/con
### Locally running the connector docker image
#### Build
First, make sure you build the latest Docker image:
#### Use `airbyte-ci` to build your connector
The Airbyte way of building this connector is to use our `airbyte-ci` tool.
You can follow the install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1).
Then, running the following command will build your connector:
```bash
airbyte-ci connectors --name source-amazon-ads build
```
docker build . -t airbyte/source-amazon-ads:dev
Once the command is done, you will find your connector image in your local docker registry: `airbyte/source-amazon-ads:dev`.
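You can sanity-check the freshly built image by running the `spec` command against it (the same command reappears in the patching section below):
```bash
docker run --rm airbyte/source-amazon-ads:dev spec
```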
##### Customizing our build process
When contributing to our connector you might need to customize the build process to add a system dependency or set an env var.
You can customize our build process by adding a `build_customization.py` module to your connector.
This module should contain `pre_connector_install` and `post_connector_install` async functions that mutate the base image and the connector container, respectively.
It will be imported at runtime by our build process and the functions will be called if they exist.
Here is an example of a `build_customization.py` module:
```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Feel free to check the dagger documentation for more information on the Container object and its methods.
    # https://dagger-io.readthedocs.io/en/sdk-python-v0.6.4/
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    return await base_image_container.with_env_variable("MY_PRE_BUILD_ENV_VAR", "my_pre_build_env_var_value")


async def post_connector_install(connector_container: Container) -> Container:
    return await connector_container.with_env_variable("MY_POST_BUILD_ENV_VAR", "my_post_build_env_var_value")
```
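For example, if your patch needs a system dependency, a `pre_connector_install` hook along these lines could install it. This is only a sketch: the package name is illustrative, and it assumes the base image is Debian-based (so `apt-get` is available), as the deleted Dockerfile above suggests.
```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    # Illustrative only: install a system package (here libpq-dev) before the connector code is installed.
    return await base_image_container.with_exec(
        ["sh", "-c", "apt-get update && apt-get install -y libpq-dev && rm -rf /var/lib/apt/lists/*"]
    )
```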
You can also build the connector image via Gradle:
```
./gradlew :airbyte-integrations:connectors:source-amazon-ads:airbyteDocker
```
When building via Gradle, the docker image name and tag, respectively, are the values of the `io.airbyte.name` and `io.airbyte.version` `LABEL`s in
the Dockerfile.
#### Build your own connector image
This connector is built using our dynamic build process in `airbyte-ci`.
The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions` section.
The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py).
It does not rely on a Dockerfile.
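For reference, the `connectorBuildOptions` section of this connector's `metadata.yaml` (shown in full in the diff further down) looks roughly like this:
```yaml
connectorBuildOptions:
  baseImage: docker.io/airbyte/python-connector-base:1.1.0@sha256:bd98f6505c6764b1b5f99d3aedc23dfc9e9af631a62533f60eb32b1d3dbab20c
```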
If you would like to patch our connector and build your own, a simple approach would be to:
1. Create your own Dockerfile based on the latest version of the connector image.
```Dockerfile
FROM airbyte/source-amazon-ads:latest
COPY . ./airbyte/integration_code
RUN pip install ./airbyte/integration_code
# The entrypoint and default env vars are already set in the base image
# ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
# ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]
```
Please treat this as an example only; it is not optimized.
2. Build your image:
```bash
docker build -t airbyte/source-amazon-ads:dev .
# Running the spec command against your patched connector
docker run airbyte/source-amazon-ads:dev spec
```
#### Run
Then run any of the connector commands as follows:
```
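# A hedged sketch of typical connector commands against the dev image;
# the paths assume local secrets/ and integration_tests/ folders as used earlier in this README.
docker run --rm airbyte/source-amazon-ads:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amazon-ads:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amazon-ads:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-amazon-ads:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json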
@@ -128,4 +179,4 @@ You've checked out the repo, implemented a million dollar feature, and you're re
1. Bump the connector version in `Dockerfile` -- just increment the value of the `LABEL io.airbyte.version` appropriately (we use [SemVer](https://semver.org/)).
1. Create a Pull Request.
1. Pat yourself on the back for being an awesome contributor.
1. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -1,15 +1,21 @@
data:
  ab_internal:
    ql: 400
    sl: 300
  allowedHosts:
    hosts:
      - api.amazon.com
      - advertising-api.amazon.com
      - advertising-api-eu.amazon.com
      - advertising-api-fe.amazon.com
  connectorBuildOptions:
    baseImage: docker.io/airbyte/python-connector-base:1.1.0@sha256:bd98f6505c6764b1b5f99d3aedc23dfc9e9af631a62533f60eb32b1d3dbab20c
  connectorSubtype: api
  connectorType: source
  definitionId: c6b0a29e-1da9-4512-9002-7bfd0cba2246
  dockerImageTag: 3.4.0
  dockerImageTag: 3.4.1
  dockerRepository: airbyte/source-amazon-ads
  documentationUrl: https://docs.airbyte.com/integrations/sources/amazon-ads
  githubIssueLabel: source-amazon-ads
  icon: amazonads.svg
  license: MIT
@@ -20,6 +26,11 @@ data:
    oss:
      enabled: true
  releaseStage: generally_available
  releases:
    breakingChanges:
      3.0.0:
        message: Attribution report stream schemas fix.
        upgradeDeadline: "2023-07-24"
  suggestedStreams:
    streams:
      - profiles
@@ -27,16 +38,7 @@ data:
      - sponsored_display_report_stream
      - sponsored_brands_report_stream
      - sponsored_products_report_stream
  documentationUrl: https://docs.airbyte.com/integrations/sources/amazon-ads
  supportLevel: certified
  tags:
    - language:python
  releases:
    breakingChanges:
      3.0.0:
        message: "Attribution report stream schemas fix."
        upgradeDeadline: "2023-07-24"
  ab_internal:
    sl: 300
    ql: 400
  supportLevel: certified
  metadataSpecVersion: "1.0"