🎉 New Source: Adjust (#16051)
* Initial version of the adjust source connector
* source-adjust: add bootstrap.md
* adjust-source: add setup guide
* adjust-source: update integration READMEs
* source-adjust: better stream name
* source-adjust: fix sample conf
* source-adjust: add spec order metadata
* source-adjust: improve spec dimension doc
* source-adjust: warn on custom metric cast failure
* source adjust: Update source_definitions.yaml
* source adjust: Update documentation url
@@ -1,3 +1,11 @@
- name: Adjust
  sourceDefinitionId: d3b7fa46-111b-419a-998a-d7f046f6d66d
  dockerRepository: airbyte/source-adjust
  dockerImageTag: 0.1.0
  documentationUrl: https://docs.airbyte.io/integrations/sources/adjust
  icon: adjust.svg
  sourceType: api
  releaseStage: alpha
- name: Airtable
  sourceDefinitionId: 14c6e7ea-97ed-4f5e-a7b5-25e9a80b8212
  dockerRepository: airbyte/source-airtable
@@ -5,6 +5,7 @@
| name | status |
| :--- | :--- |
| 3PL Central | [](https://dnsgjos7lj2fu.cloudfront.net/tests/summary/source-tplcentral) |
| Adjust | [](https://dnsgjos7lj2fu.cloudfront.net/tests/summary/source-adjust) |
| Airtable | [](https://dnsgjos7lj2fu.cloudfront.net/tests/summary/source-airtable) |
| AlloyDB | [](https://dnsgjos7lj2fu.cloudfront.net/tests/summary/source-alloydb) |
| Amazon Seller Partner | [](https://dnsgjos7lj2fu.cloudfront.net/tests/summary/source-amazon-seller-partner) |
@@ -0,0 +1,6 @@
*
!Dockerfile
!main.py
!source_adjust
!setup.py
!secrets
airbyte-integrations/connectors/source-adjust/Dockerfile (new file, 38 lines)
@@ -0,0 +1,38 @@
FROM python:3.9.11-alpine3.15 as base

# build and load all requirements
FROM base as builder
WORKDIR /airbyte/integration_code

# upgrade pip to the latest version
RUN apk --no-cache upgrade \
    && pip install --upgrade pip \
    && apk --no-cache add tzdata build-base


COPY setup.py ./
# install necessary packages to a temporary folder
RUN pip install --prefix=/install .

# build a clean environment
FROM base
WORKDIR /airbyte/integration_code

# copy all loaded and built libraries to a pure basic image
COPY --from=builder /install /usr/local
# add default timezone settings
COPY --from=builder /usr/share/zoneinfo/Etc/UTC /etc/localtime
RUN echo "Etc/UTC" > /etc/timezone

# bash is installed for more convenient debugging.
RUN apk --no-cache add bash

# copy payload code only
COPY main.py ./
COPY source_adjust ./source_adjust

ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]

LABEL io.airbyte.version=0.1.0
LABEL io.airbyte.name=airbyte/source-adjust
airbyte-integrations/connectors/source-adjust/README.md (new file, 124 lines)
@@ -0,0 +1,124 @@
# Adjust Source

This is the repository for the Adjust source connector, written in Python.
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/adjust).

## Local development

### Prerequisites
**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.9.0`

#### Build & Activate Virtual Environment and install dependencies
From this connector directory, create a virtual environment:
```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:
```
source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
```
If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`.
If this is mumbo jumbo to you, don't worry about it; just put your deps in `setup.py` but install using `pip install -r requirements.txt`, and everything
should work as you expect.
#### Building via Gradle
You can also build the connector in Gradle. This is typically used in CI and not needed for your development workflow.

To build using Gradle, from the Airbyte repository root, run:
```
./gradlew :airbyte-integrations:connectors:source-adjust:build
```
#### Create credentials
**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/adjust)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_adjust/spec.yaml` file.
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See `integration_tests/sample_config.json` for a sample config file.

**If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `source adjust test creds`
and place them into `secrets/config.json`.
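For reference, the sample config shipped with this connector (`integration_tests/sample_config.json`, shown in full later in this commit) looks like this; replace the token with a real one:
```
{
  "ingest_start": "2022-07-07",
  "api_token": "token",
  "metrics": ["installs", "clicks"],
  "dimensions": ["app", "network"],
  "additional_metrics": [],
  "until_today": true
}
```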

### Locally running the connector
```
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```
### Locally running the connector docker image

#### Build
First, make sure you build the latest Docker image:
```
docker build . -t airbyte/source-adjust:dev
```

You can also build the connector image via Gradle:
```
./gradlew :airbyte-integrations:connectors:source-adjust:airbyteDocker
```
When building via Gradle, the docker image name and tag, respectively, are the values of the `io.airbyte.name` and `io.airbyte.version` `LABEL`s in
the Dockerfile.

#### Run
Then run any of the connector commands as follows:
```
docker run --rm airbyte/source-adjust:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-adjust:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-adjust:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-adjust:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```
## Testing
Make sure to familiarize yourself with [pytest test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery) to know how your test files and methods should be named.
First install test dependencies into your virtual environment:
```
pip install .[tests]
```
### Unit Tests
To run unit tests locally, from the connector directory run:
```
python -m pytest unit_tests
```

### Integration Tests
There are two types of integration tests: Acceptance Tests (Airbyte's test suite for all source connectors) and custom integration tests (which are specific to this connector).
#### Custom Integration tests
Place custom tests inside the `integration_tests/` folder, then, from the connector root, run
```
python -m pytest integration_tests
```
#### Acceptance Tests
Customize the `acceptance-test-config.yml` file to configure tests. See [Source Acceptance Tests](https://docs.airbyte.io/connector-development/testing-connectors/source-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.
To run your integration tests with acceptance tests, from the connector root, run
```
python -m pytest integration_tests -p integration_tests.acceptance
```
To run your integration tests with Docker, use the `acceptance-test-docker.sh` script included in this connector directory.
### Using gradle to run tests
All commands should be run from the airbyte project root.
To run unit tests:
```
./gradlew :airbyte-integrations:connectors:source-adjust:unitTest
```
To run acceptance and custom integration tests:
```
./gradlew :airbyte-integrations:connectors:source-adjust:integrationTest
```

## Dependency Management
All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:
* required for your connector to work go in the `MAIN_REQUIREMENTS` list.
* required for testing go in the `TEST_REQUIREMENTS` list.
@@ -0,0 +1,24 @@
# See [Source Acceptance Tests](https://docs.airbyte.io/connector-development/testing-connectors/source-acceptance-tests-reference)
# for more information about how to configure these tests
connector_image: airbyte/source-adjust:dev
tests:
  spec:
    - spec_path: "source_adjust/spec.yaml"
  connection:
    - config_path: "secrets/config.json"
      status: "succeed"
    - config_path: "integration_tests/invalid_config.json"
      status: "failed"
  discovery:
    - config_path: "secrets/config.json"
  basic_read:
    - config_path: "secrets/config.json"
      configured_catalog_path: "integration_tests/configured_catalog.json"
      empty_streams: []
  incremental:
    - config_path: "secrets/config.json"
      configured_catalog_path: "integration_tests/configured_catalog.json"
      future_state_path: "integration_tests/abnormal_state.json"
  full_refresh:
    - config_path: "secrets/config.json"
      configured_catalog_path: "integration_tests/configured_catalog.json"
@@ -0,0 +1,16 @@
#!/usr/bin/env sh

# Build latest connector image
docker build . -t $(cat acceptance-test-config.yml | grep "connector_image" | head -n 1 | cut -d: -f2-)

# Pull latest acctest image
docker pull airbyte/source-acceptance-test:latest

# Run
docker run --rm -it \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v /tmp:/tmp \
    -v $(pwd):/test_input \
    airbyte/source-acceptance-test \
    --acceptance-test-config /test_input
airbyte-integrations/connectors/source-adjust/bootstrap.md (new file, 19 lines)
@@ -0,0 +1,19 @@
# Adjust

The Adjust source connector interacts with the Adjust reports API, which
provides aggregated metrics from various Adjust sources: KPI Service
deliverables, KPI Service cohorts, SKAdNetwork, and Ad Spend.

Metrics and dimensions of interest for a time span are requested from a single
HTTP endpoint by using URL query parameters. The time span (also a query
parameter) can be specified in several ways, but the connector simply
requests daily chunks of data.
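
For illustration, a minimal sketch of such a request (the token and the
metric/dimension choices are placeholders; the endpoint, parameter names, and
response shape mirror the connector code in this commit):

```python
import requests

API_TOKEN = "..."  # placeholder; see the authentication article linked below

response = requests.get(
    "https://dash.adjust.com/control-center/reports-service/report",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    params={
        "date_period": "2022-07-07:2022-07-07",  # one daily chunk, inclusive
        "metrics": "installs,clicks",
        "dimensions": "day,app,network",
    },
)
response.raise_for_status()
for row in response.json()["rows"]:
    print(row)
```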

Dimensions allow for a breakdown of metrics into groups, for instance by
country and operating system.

[Authentication](https://help.adjust.com/en/article/report-service-api-authentication)
is handled via a regular `Authorization` HTTP header; the token can be found in the UI.

See the [reports documentation](https://help.adjust.com/en/article/reports-endpoint)
for details on how the API works.
@@ -0,0 +1,9 @@
plugins {
    id 'airbyte-python'
    id 'airbyte-docker'
    id 'airbyte-source-acceptance-test'
}

airbytePython {
    moduleDirectory 'source_adjust'
}
@@ -0,0 +1,3 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#
@@ -0,0 +1,5 @@
{
  "adjust_report_stream": {
    "day": "2050-01-01"
  }
}
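
The far-future `day` value above exercises the acceptance tests' future-state
check; `AdjustReportStream.stream_slices` (later in this commit) clamps such a
cursor back to the current date, roughly:

```python
import datetime

# Sketch of the clamp in AdjustReportStream.stream_slices: a cursor past
# today is reset to today (the stream also logs a warning and checkpoints).
cursor = datetime.date.fromisoformat("2050-01-01")
now = datetime.datetime.utcnow().date()
if cursor > now:
    cursor = now
```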

@@ -0,0 +1,14 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#


import pytest

pytest_plugins = ("source_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """This fixture is a placeholder for external resources that acceptance test might require."""
    yield
@@ -0,0 +1,43 @@
{
  "streams": [
    {
      "name": "adjust_report_stream",
      "supported_sync_modes": ["full_refresh", "incremental"],
      "source_defined_cursor": true,
      "default_cursor_field": "day",
      "json_schema": {
        "$schema": "http://json-schema.org/draft-07/schema#",
        "type": "object",
        "properties": {
          "clicks": {
            "title": "Clicks",
            "description": "Clicks.",
            "type": "integer"
          },
          "installs": {
            "title": "Installs",
            "description": "Installs.",
            "type": "integer"
          },
          "day": {
            "title": "Day",
            "description": "Date.",
            "type": "string",
            "format": "date"
          },
          "app": {
            "title": "App",
            "description": "Name of the app.",
            "type": "string"
          },
          "network": {
            "title": "Network",
            "description": "The name of the advertising network.",
            "type": "string"
          }
        },
        "required": ["day"]
      }
    }
  ]
}
@@ -0,0 +1,49 @@
{
  "streams": [
    {
      "sync_mode": "incremental",
      "destination_sync_mode": "append",
      "stream": {
        "name": "adjust_report_stream",
        "json_schema": {
          "title": "Report",
          "type": "object",
          "properties": {
            "clicks": {
              "title": "Clicks",
              "description": "Clicks",
              "type": "integer"
            },
            "installs": {
              "title": "Installs",
              "description": "Installs",
              "type": "integer"
            },
            "day": {
              "title": "Day",
              "description": "Date.",
              "type": "string",
              "format": "date"
            },
            "app": {
              "title": "App",
              "description": "Name of the app.",
              "type": "string"
            },
            "network": {
              "title": "Network",
              "description": "The name of the advertising network.",
              "type": "string"
            }
          },
          "required": ["day"],
          "$schema": "http://json-schema.org/draft-07/schema#"
        },
        "supported_sync_modes": [
          "full_refresh",
          "incremental"
        ]
      }
    }
  ]
}
@@ -0,0 +1,7 @@
{
  "api_token": "invalid_token",
  "ingest_start": "2022-07-07",
  "metrics": ["installs", "clicks"],
  "dimensions": ["app", "network"],
  "additional_metrics": []
}
@@ -0,0 +1,8 @@
{
  "ingest_start": "2022-07-07",
  "api_token": "token",
  "metrics": ["installs", "clicks"],
  "dimensions": ["app", "network"],
  "additional_metrics": [],
  "until_today": true
}
@@ -0,0 +1,5 @@
{
  "adjust_report_stream": {
    "day": "2022-07-07"
  }
}
airbyte-integrations/connectors/source-adjust/main.py (new file, 13 lines)
@@ -0,0 +1,13 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#


import sys

from airbyte_cdk.entrypoint import launch
from source_adjust import SourceAdjust

if __name__ == "__main__":
    source = SourceAdjust()
    launch(source, sys.argv[1:])
@@ -0,0 +1,2 @@
-e ../../bases/source-acceptance-test
-e .
airbyte-integrations/connectors/source-adjust/setup.py (new file, 29 lines)
@@ -0,0 +1,29 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#


from setuptools import find_packages, setup

MAIN_REQUIREMENTS = [
    "airbyte-cdk~=0.1.56",
]

TEST_REQUIREMENTS = [
    "pytest~=6.1",
    "pytest-mock~=3.6.1",
    "source-acceptance-test",
]

setup(
    name="source_adjust",
    description="Source implementation for Adjust.",
    author="Airbyte",
    author_email="contact@airbyte.io",
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    package_data={"": ["*.json", "*.yaml", "schemas/*.json", "schemas/shared/*.json"]},
    extras_require={
        "tests": TEST_REQUIREMENTS,
    },
)
@@ -0,0 +1,8 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#


from .source import SourceAdjust

__all__ = ["SourceAdjust"]
@@ -0,0 +1,632 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#

# The spec and report schema are generated from this module.
#
# import source_adjust.model, yaml, json
# yaml.dump(yaml.safe_load(source_adjust.model.Spec.schema_json()),
#           stream=open('source_adjust/spec.yaml', 'w'),
#           )

import datetime
import decimal
import typing

import pydantic

BASE_METRICS = typing.Literal[
    "network_cost",
    "network_cost_diff",
    "network_clicks",
    "network_impressions",
    "network_installs",
    "network_installs_diff",
    "network_ecpc",
    "network_ecpi",
    "network_ecpm",
    "arpdau_ad",
    "arpdau",
    "arpdau_iap",
    "ad_impressions",
    "ad_rpm",
    "ad_revenue",
    "cohort_ad_revenue",
    "cost",
    "adjust_cost",
    "all_revenue",
    "cohort_all_revenue",
    "daus",
    "maus",
    "waus",
    "base_sessions",
    "ctr",
    "click_conversion_rate",
    "click_cost",
    "clicks",
    "paid_clicks",
    "deattributions",
    "ecpc",
    "gdpr_forgets",
    "gross_profit",
    "cohort_gross_profit",
    "impression_conversion_rate",
    "impression_cost",
    "impressions",
    "paid_impressions",
    "install_cost",
    "installs",
    "paid_installs",
    "installs_per_mile",
    "limit_ad_tracking_installs",
    "limit_ad_tracking_install_rate",
    "limit_ad_tracking_reattribution_rate",
    "limit_ad_tracking_reattributions",
    "non_organic_installs",
    "organic_installs",
    "roas_ad",
    "roas",
    "roas_iap",
    "reattributions",
    "return_on_investment",
    "revenue",
    "cohort_revenue",
    "revenue_events",
    "revenue_to_cost",
    "sessions",
    "events",
    "ecpi_all",
    "ecpi",
    "ecpm",
]


DIMENSIONS = typing.Literal[
    "os_name",
    "device_type",
    "app",
    "app_token",
    "store_id",
    "store_type",
    "app_network",
    "currency",
    "currency_code",
    "network",
    "campaign",
    "campaign_network",
    "campaign_id_network",
    "adgroup",
    "adgroup_network",
    "adgroup_id_network",
    "source_network",
    "source_id_network",
    "creative",
    "creative_network",
    "creative_id_network",
    "country",
    "country_code",
    "region",
    "partner_name",
    "partner_id",
]


class Spec(pydantic.BaseModel):
    """
    Adjust reporting API connector.
    """

    api_token: str = pydantic.Field(
        ...,
        title="API Token",
        description="Adjust API key, see https://help.adjust.com/en/article/report-service-api-authentication",
        airbyte_secret=True,
        order=0,
    )

    ingest_start: datetime.date = pydantic.Field(
        ...,
        title="Ingest Start Date",
        description="Data ingest start date.",
        order=1,
    )

    metrics: typing.Set[BASE_METRICS] = pydantic.Field(
        ...,
        title="Metrics to ingest",
        description="Select at least one metric to query.",
        min_items=1,
        order=2,
    )

    additional_metrics: typing.List[str] = pydantic.Field(
        None,
        title="Additional metrics for ingestion",
        description="Metrics names that are not pre-defined, such as cohort metrics or app specific metrics.",
        order=3,
    )

    dimensions: typing.Set[DIMENSIONS] = pydantic.Field(
        ...,
        description=(
            "Dimensions allow a user to break down metrics into "
            "groups using one or several parameters. For example, "
            "the number of installs by date, country and network. "
            "See https://help.adjust.com/en/article/reports-endpoint#dimensions "
            "for more information about the dimensions."
        ),
        min_items=1,
        order=4,
    )

    until_today: bool = pydantic.Field(
        False,
        description="Syncs data up until today. Useful when running daily incremental syncs, and duplicates are not desired.",
        order=5,
    )

    class Config:
        title = "Adjust Spec"

        @staticmethod
        def schema_extra(schema: typing.Dict[str, typing.Any]):
            spec = {"$schema": "http://json-schema.org/draft-07/schema#"}
            for key in list(schema.keys()):
                spec[key] = schema.pop(key)

            schema["connectionSpecification"] = spec
            schema["documentationUrl"] = "https://docs.airbyte.io/integrations/sources/adjust"


class Report(pydantic.BaseModel):
    # Base metrics
    network_cost: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Ad Spend (Network).")
    network_cost_diff: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Ad Spend Diff (Network).")
    network_clicks: typing.Optional[int] = pydantic.Field(None, description="Clicks (Network).")
    network_impressions: typing.Optional[int] = pydantic.Field(None, description="Impressions (Network).")
    network_installs: typing.Optional[int] = pydantic.Field(None, description="Installs (Network).")
    network_installs_diff: typing.Optional[int] = pydantic.Field(None, description="Installs Diff (Network).")
    network_ecpc: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="eCPC (Network).")
    network_ecpi: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="eCPI (Network).")
    network_ecpm: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="eCPM (Network).")
    arpdau_ad: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="ARPDAU (Ad).")
    arpdau: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="ARPDAU (All).")
    arpdau_iap: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="ARPDAU (IAP).")
    ad_impressions: typing.Optional[int] = pydantic.Field(None, description="Ad Impressions.")
    ad_rpm: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Ad RPM.")
    ad_revenue: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Ad Revenue.")
    cohort_ad_revenue: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Ad Revenue (Cohort).")
    cost: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Ad Spend.")
    adjust_cost: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Ad Spend (Attribution).")
    all_revenue: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="All Revenue.")
    cohort_all_revenue: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="All Revenue (Cohort).")
    daus: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Avg. DAUs.")
    maus: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Avg. MAUs.")
    waus: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Avg. WAUs.")
    base_sessions: typing.Optional[int] = pydantic.Field(None, description="Base Sessions.")
    ctr: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="CTR.")
    click_conversion_rate: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Click Conversion Rate (CCR).")
    click_cost: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Click Cost.")
    clicks: typing.Optional[int] = pydantic.Field(None, description="Clicks.")
    paid_clicks: typing.Optional[int] = pydantic.Field(None, description="Clicks (paid).")
    deattributions: typing.Optional[int] = pydantic.Field(None, description="Deattributions.")
    ecpc: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Effective Cost per Click (eCPC).")
    gdpr_forgets: typing.Optional[int] = pydantic.Field(None, description="GDPR Forgets.")
    gross_profit: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Gross profit.")
    cohort_gross_profit: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Gross profit (Cohort).")
    impression_conversion_rate: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Impression Conversion Rate (ICR).")
    impression_cost: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Impression Cost.")
    impressions: typing.Optional[int] = pydantic.Field(None, description="Impressions.")
    paid_impressions: typing.Optional[int] = pydantic.Field(None, description="Impressions (paid).")
    install_cost: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Install Cost.")
    installs: typing.Optional[int] = pydantic.Field(None, description="Installs.")
    paid_installs: typing.Optional[int] = pydantic.Field(None, description="Installs (paid).")
    installs_per_mile: typing.Optional[int] = pydantic.Field(None, description="Installs per Mille (IPM).")
    limit_ad_tracking_installs: typing.Optional[int] = pydantic.Field(None, description="Limit Ad Tracking Installs.")
    limit_ad_tracking_install_rate: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Limit Ad Tracking Rate.")
    limit_ad_tracking_reattribution_rate: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Limit Ad Tracking Reattribution Rate.")
    limit_ad_tracking_reattributions: typing.Optional[int] = pydantic.Field(None, description="Limit Ad Tracking Reattributions.")
    non_organic_installs: typing.Optional[int] = pydantic.Field(None, description="Non-Organic Installs.")
    organic_installs: typing.Optional[int] = pydantic.Field(None, description="Organic Installs.")
    roas_ad: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="ROAS (Ad Revenue).")
    roas: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="ROAS (All Revenue).")
    roas_iap: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="ROAS (IAP Revenue).")
    reattributions: typing.Optional[int] = pydantic.Field(None, description="Reattributions.")
    return_on_investment: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Return On Investment (ROI).")
    revenue: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Revenue.")
    cohort_revenue: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Revenue (Cohort).")
    revenue_events: typing.Optional[int] = pydantic.Field(None, description="Revenue Events.")
    revenue_to_cost: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="Revenue To Cost Ratio (RCR).")
    sessions: typing.Optional[int] = pydantic.Field(None, description="Sessions.")
    events: typing.Optional[int] = pydantic.Field(None, description="Total Events.")
    ecpi_all: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="eCPI (All Installs).")
    ecpi: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="eCPI (Paid Installs).")
    ecpm: typing.Optional[decimal.Decimal] = pydantic.Field(None, description="eCPM (Attribution).")

    # Dimensions
    day: datetime.date = pydantic.Field(..., description="Date.")
    os_name: typing.Optional[str] = pydantic.Field(None, description="Operating system.")
    device_type: typing.Optional[str] = pydantic.Field(None, description="Device, e.g., phone or tablet.")
    app: typing.Optional[str] = pydantic.Field(None, description="Name of the app.")
    app_token: typing.Optional[str] = pydantic.Field(None, description="App ID in the Adjust system.")
    store_id: typing.Optional[str] = pydantic.Field(None, description="Store App ID.")
    store_type: typing.Optional[str] = pydantic.Field(None, description="Store from where the app was installed.")
    app_network: typing.Optional[str] = pydantic.Field(None, description="Value with format: <store_type>:<store_id>")
    currency: typing.Optional[str] = pydantic.Field(None, description="Currency name.")
    currency_code: typing.Optional[str] = pydantic.Field(None, description="3-character value ISO 4217.")
    network: typing.Optional[str] = pydantic.Field(None, description="The name of the advertising network.")
    campaign: typing.Optional[str] = pydantic.Field(None, description="Tracker sub-level 1. String value usually contains campaign name and id.")
    campaign_network: typing.Optional[str] = pydantic.Field(None, description="Campaign name from the network.")
    campaign_id_network: typing.Optional[str] = pydantic.Field(None, description="Campaign ID from the network.")
    adgroup: typing.Optional[str] = pydantic.Field(None, description="Tracker sub-level 2. String value usually contains adgroup name and id.")
    adgroup_network: typing.Optional[str] = pydantic.Field(None, description="Adgroup name from the network.")
    adgroup_id_network: typing.Optional[str] = pydantic.Field(None, description="Adgroup ID from the network.")
    source_network: typing.Optional[str] = pydantic.Field(None, description="Optional value dependent on the network. Usually the same as adgroup_network.")
    source_id_network: typing.Optional[str] = pydantic.Field(None, description="Value for source_app.")
    creative: typing.Optional[str] = pydantic.Field(None, description="Tracker sub-level 3. String value usually contains creative name and id.")
    creative_network: typing.Optional[str] = pydantic.Field(None, description="Creative name from the network.")
    creative_id_network: typing.Optional[str] = pydantic.Field(None, description="Creative ID from the network.")
    country: typing.Optional[str] = pydantic.Field(None, description="Country name.")
    country_code: typing.Optional[str] = pydantic.Field(None, description="2-character value ISO 3166.")
    region: typing.Optional[str] = pydantic.Field(None, description="Business region.")
    partner_name: typing.Optional[str] = pydantic.Field(None, description="Partner's name in the Adjust system.")
    partner_id: typing.Optional[str] = pydantic.Field(None, description="Partner's id in the Adjust system.")

    class Config:
        @staticmethod
        def schema_extra(schema: typing.Dict[str, typing.Any]):
            schema["$schema"] = "http://json-schema.org/draft-07/schema#"
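
As the header comment notes, `source_adjust/spec.yaml` is generated from this
module. A minimal sketch of that regeneration, assuming it is run from the
connector directory with the package importable:

```python
import yaml

import source_adjust.model

# Dump the pydantic-generated JSON schema of Spec as spec.yaml.
with open("source_adjust/spec.yaml", "w") as stream:
    yaml.dump(yaml.safe_load(source_adjust.model.Spec.schema_json()), stream)
```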

@@ -0,0 +1,214 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#

import datetime
import decimal
from typing import Any, Iterable, List, Mapping, MutableMapping, Optional, Tuple

import requests
import source_adjust.model
from airbyte_cdk.models import AirbyteMessage, AirbyteStateMessage, SyncMode, Type
from airbyte_cdk.sources import AbstractSource
from airbyte_cdk.sources.streams import IncrementalMixin, Stream
from airbyte_cdk.sources.streams.http import HttpStream
from airbyte_cdk.sources.streams.http.requests_native_auth import TokenAuthenticator


class AdjustReportStream(HttpStream, IncrementalMixin):
    """
    Adjust reports service integration with support for incremental synchronization.
    """

    def __init__(self, connector: "SourceAdjust", config: Mapping[str, Any], **kwargs):
        super().__init__(**kwargs)

        self.connector = connector
        self.config = config
        self._cursor: Optional[datetime.date] = None

    @property
    def url_base(self) -> str:
        return "https://dash.adjust.com/control-center/reports-service/"

    @property
    def state(self):
        if self._cursor is not None:
            cursor = self._cursor.isoformat()
        else:
            cursor = self.config["ingest_start"]

        return {
            self.cursor_field: cursor,
        }

    @state.setter
    def state(self, value: MutableMapping[str, Any]):
        self._cursor = datetime.date.fromisoformat(value[self.cursor_field])

    def read_records(
        self,
        sync_mode: SyncMode,
        cursor_field: Optional[List[str]] = None,
        stream_slice: Optional[Mapping[str, Any]] = None,
        stream_state: Optional[Mapping[str, Any]] = None,
    ) -> Iterable[Mapping[str, Any]]:
        fallback = datetime.date.fromisoformat(self.config["ingest_start"])
        cf: str = self.cursor_field

        for record in super().read_records(sync_mode, cursor_field, stream_slice, stream_state):
            record_stamp = datetime.date.fromisoformat(record[cf])
            self._cursor = max(record_stamp, self._cursor or fallback)
            yield record

    def path(
        self,
        stream_state: Optional[Mapping[str, Any]] = None,
        stream_slice: Optional[Mapping[str, Any]] = None,
        next_page_token: Optional[Mapping[str, Any]] = None,
    ) -> str:
        """
        Report URL path suffix.
        """
        return "report"

    def request_params(
        self,
        stream_state: Mapping[str, Any],
        stream_slice: Optional[Mapping[str, Any]] = None,
        next_page_token: Optional[Mapping[str, Any]] = None,
    ) -> MutableMapping[str, Any]:
        """
        Get query parameter definitions.
        """
        required_dimensions = ["day"]
        dimensions = required_dimensions + self.config["dimensions"]
        metrics = self.config["metrics"] + self.config["additional_metrics"]
        date = stream_slice[self.cursor_field]
        return {
            "date_period": ":".join([date, date]),  # inclusive
            "metrics": ",".join(metrics),
            "dimensions": ",".join(dimensions),
        }

    def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping]:
        def reshape(row: MutableMapping[str, Any]):
            model = source_adjust.model.Report.__dict__["__fields__"]
            row.pop("attr_dependency", None)
            # Unfortunately all fields are returned as strings by the API
            for k, v in list(row.items()):
                if k in model:
                    type_ = model[k].type_
                else:  # Additional user-provided metrics are assumed to be decimal
                    type_ = decimal.Decimal
                if type_ in (int, decimal.Decimal):
                    try:
                        row[k] = type_(v)
                    except TypeError:
                        self.logger.warning("Unable to convert field '%s': %s to %s, leaving '%s' as is", k, v, type_.__name__, k)

            return row

        body = response.json()
        return (reshape(row) for row in body["rows"])

    def stream_slices(self, stream_state: Optional[Mapping[str, Any]] = None, **kwargs) -> Iterable[Optional[Mapping[str, Any]]]:
        cf: str = self.cursor_field
        now = datetime.datetime.utcnow().date()

        if self._cursor and self._cursor > now:
            self.logger.warning("State ingest target date in future, setting cursor to today's date")
            self._cursor = now
            self.connector.checkpoint()
        if stream_state is not None and stream_state.get(cf):
            date = datetime.date.fromisoformat(stream_state[cf])
            if now - date == datetime.timedelta(days=1):
                return
        else:
            date = datetime.date.fromisoformat(self.config["ingest_start"])

        if self.config["until_today"]:
            end_date = now
        else:
            end_date = now + datetime.timedelta(days=1)

        while date < end_date:
            yield {cf: date.isoformat()}
            date += datetime.timedelta(days=1)

    def get_json_schema(self):
        """
        Prune the schema to only include selected fields to synchronize.
        """
        schema = source_adjust.model.Report.schema()
        properties = schema["properties"]

        required = schema["required"]
        selected = self.config["metrics"] + self.config["dimensions"]
        retain = required + selected
        for attr in list(properties.keys()):
            if attr not in retain:
                del properties[attr]

        for attr in self.config["additional_metrics"]:
            properties[attr] = {"type": "number"}

        return schema

    @property
    def cursor_field(self) -> str:
        """
        Name of the field in the API response body used as cursor.
        """
        return "day"

    @property
    def primary_key(self):
        return None

    def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
        return None


class SourceAdjust(AbstractSource):
    check_endpoint = "https://dash.adjust.com/control-center/reports-service/filters_data"

    def check_connection(self, logger, config) -> Tuple[bool, Any]:
        """
        Verify the configuration supplied can be used to connect to the API.

        :param config: config object as per definition in spec.yaml
        :param logger: logger object
        :return: (True, None) on connection to the API successfully,
            (False, error) otherwise.
        """
        requests.get(
            url=self.check_endpoint,
            headers={"Authorization": f'Bearer {config["api_token"]:s}'},
        ).raise_for_status()
        return True, None  # Are we coding in go?

    def streams(self, config: Mapping[str, Any]) -> List[Stream]:
        """
        Stream registry.

        :param config: user input configuration as defined in the connector spec.
        """
        auth = TokenAuthenticator(token=config["api_token"])

        self._streams = [
            AdjustReportStream(connector=self, config=config, authenticator=auth),
        ]
        return self._streams

    def checkpoint(self):
        """
        Checkpoint state.
        """
        state = AirbyteMessage(
            type=Type.STATE,
            state=AirbyteStateMessage(
                data={stream.name: stream.state for stream in self._streams},
            ),
        )
        print(state.json(exclude_unset=True))  # Emit state
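
To make the slicing behaviour above concrete, a standalone sketch of what
`stream_slices` yields (the start date is a placeholder): one `{"day": ...}`
slice per date, from the cursor (or `ingest_start`) up to and including today,
stopping at yesterday when `until_today` is set.

```python
import datetime

# Mimics AdjustReportStream.stream_slices with until_today=False:
# daily slices from `start` through today, inclusive.
start = datetime.date.fromisoformat("2022-07-01")
end = datetime.datetime.utcnow().date() + datetime.timedelta(days=1)

date = start
while date < end:
    print({"day": date.isoformat()})  # e.g. {"day": "2022-07-01"}
    date += datetime.timedelta(days=1)
```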

@@ -0,0 +1,150 @@
connectionSpecification:
  $schema: http://json-schema.org/draft-07/schema#
  description: Adjust reporting API connector.
  properties:
    additional_metrics:
      description: Metrics names that are not pre-defined, such as cohort metrics
        or app specific metrics.
      items:
        type: string
      order: 3
      title: Additional metrics for ingestion
      type: array
    api_token:
      airbyte_secret: true
      description: Adjust API key, see https://help.adjust.com/en/article/report-service-api-authentication
      order: 0
      title: API Token
      type: string
    dimensions:
      description: Dimensions allow a user to break down metrics into groups using
        one or several parameters. For example, the number of installs by date, country
        and network. See https://help.adjust.com/en/article/reports-endpoint#dimensions
        for more information about the dimensions.
      items:
        enum:
          - os_name
          - device_type
          - app
          - app_token
          - store_id
          - store_type
          - app_network
          - currency
          - currency_code
          - network
          - campaign
          - campaign_network
          - campaign_id_network
          - adgroup
          - adgroup_network
          - adgroup_id_network
          - source_network
          - source_id_network
          - creative
          - creative_network
          - creative_id_network
          - country
          - country_code
          - region
          - partner_name
          - partner_id
        type: string
      minItems: 1
      order: 4
      title: Dimensions
      type: array
      uniqueItems: true
    ingest_start:
      description: Data ingest start date.
      format: date
      order: 1
      title: Ingest Start Date
      type: string
    metrics:
      description: Select at least one metric to query.
      items:
        enum:
          - network_cost
          - network_cost_diff
          - network_clicks
          - network_impressions
          - network_installs
          - network_installs_diff
          - network_ecpc
          - network_ecpi
          - network_ecpm
          - arpdau_ad
          - arpdau
          - arpdau_iap
          - ad_impressions
          - ad_rpm
          - ad_revenue
          - cohort_ad_revenue
          - cost
          - adjust_cost
          - all_revenue
          - cohort_all_revenue
          - daus
          - maus
          - waus
          - base_sessions
          - ctr
          - click_conversion_rate
          - click_cost
          - clicks
          - paid_clicks
          - deattributions
          - ecpc
          - gdpr_forgets
          - gross_profit
          - cohort_gross_profit
          - impression_conversion_rate
          - impression_cost
          - impressions
          - paid_impressions
          - install_cost
          - installs
          - paid_installs
          - installs_per_mile
          - limit_ad_tracking_installs
          - limit_ad_tracking_install_rate
          - limit_ad_tracking_reattribution_rate
          - limit_ad_tracking_reattributions
          - non_organic_installs
          - organic_installs
          - roas_ad
          - roas
          - roas_iap
          - reattributions
          - return_on_investment
          - revenue
          - cohort_revenue
          - revenue_events
          - revenue_to_cost
          - sessions
          - events
          - ecpi_all
          - ecpi
          - ecpm
        type: string
      minItems: 1
      order: 2
      title: Metrics to ingest
      type: array
      uniqueItems: true
    until_today:
      default: false
      description: Syncs data up until today. Useful when running daily incremental
        syncs, and duplicates are not desired.
      order: 5
      title: Until Today
      type: boolean
  required:
    - api_token
    - ingest_start
    - metrics
    - dimensions
  title: Adjust Spec
  type: object
documentationUrl: https://docs.airbyte.io/integrations/sources/adjust
@@ -0,0 +1,3 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#
@@ -0,0 +1,43 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#

import datetime

from airbyte_cdk.models import SyncMode
from source_adjust.source import AdjustReportStream

CONFIG = {
    "ingest_start": "2022-07-01",
    "until_today": True,
}


def test_cursor_field():
    stream = AdjustReportStream(connector=None, config=CONFIG)
    expected_cursor_field = "day"
    assert stream.cursor_field == expected_cursor_field


def test_stream_slices():
    stream = AdjustReportStream(connector=None, config=CONFIG)
    period = 5
    start = datetime.date.today() - datetime.timedelta(days=period)
    inputs = {"sync_mode": SyncMode.incremental, "cursor_field": "day", "stream_state": {"day": start.isoformat()}}
    assert list(stream.stream_slices(**inputs)) == [{"day": (start + datetime.timedelta(days=d)).isoformat()} for d in range(period)]


def test_supports_incremental():
    stream = AdjustReportStream(connector=None, config=CONFIG)
    assert stream.supports_incremental


def test_source_defined_cursor():
    stream = AdjustReportStream(connector=None, config=CONFIG)
    assert stream.source_defined_cursor


def test_stream_checkpoint_interval():
    stream = AdjustReportStream(connector=None, config=CONFIG)
    expected_checkpoint_interval = None
    assert stream.state_checkpoint_interval == expected_checkpoint_interval
@@ -0,0 +1,26 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#

from unittest import mock

from source_adjust.source import SourceAdjust

API_KEY = "API KEY"
CONFIG = {
    "api_token": API_KEY,
}


def test_check_connection():
    source = SourceAdjust()
    with mock.patch("requests.get") as http_get:
        assert source.check_connection(mock.MagicMock(), CONFIG) == (True, None)
        assert http_get.call_count == 1


def test_streams():
    source = SourceAdjust()
    streams = source.streams(CONFIG)
    expected_streams_number = 1
    assert len(streams) == expected_streams_number
@@ -0,0 +1,96 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#

from http import HTTPStatus
from unittest.mock import MagicMock

import pytest
from source_adjust.source import AdjustReportStream


@pytest.fixture
def stream():
    return AdjustReportStream(
        connector=None,
        config={
            "ingest_start": "2022-07-01",
            "dimensions": [],
            "metrics": [],
            "additional_metrics": [],
            "api_token": "",
        },
    )


def test_request_params(stream):
    some_date = "2022-07-07"
    inputs = {
        "stream_slice": {"day": some_date},
        "stream_state": None,
        "next_page_token": None,
    }
    expected_params = {
        "date_period": f"{some_date}:{some_date}",
        "dimensions": "day",
        "metrics": "",
    }
    assert stream.request_params(**inputs) == expected_params


def test_next_page_token(stream):
    expected_token = None
    assert (
        stream.next_page_token(
            response=None,
        )
        == expected_token
    )


def test_parse_response(stream):
    body = {
        "rows": [
            {
                "device_type": "phone",
                "app_token": "some id",
            }
        ],
    }
    inputs = {
        "response": MagicMock(json=lambda: body),
        "stream_state": {},
    }
    assert next(stream.parse_response(**inputs)) == body["rows"][0]


def test_request_headers(stream):
    inputs = {"stream_slice": None, "stream_state": None, "next_page_token": None}
    expected_headers = {}
    assert stream.request_headers(**inputs) == expected_headers


def test_http_method(stream):
    expected_method = "GET"
    assert stream.http_method == expected_method


@pytest.mark.parametrize(
    ("http_status", "should_retry"),
    [
        (HTTPStatus.OK, False),
        (HTTPStatus.BAD_REQUEST, False),
        (HTTPStatus.TOO_MANY_REQUESTS, True),
        (HTTPStatus.INTERNAL_SERVER_ERROR, True),
    ],
)
def test_should_retry(stream, http_status, should_retry):
    response_mock = MagicMock()
    response_mock.status_code = http_status
    assert stream.should_retry(response_mock) == should_retry


def test_backoff_time(stream):
    response_mock = MagicMock()
    expected_backoff_time = None
    assert stream.backoff_time(response_mock) == expected_backoff_time
@@ -17,6 +17,7 @@ For more information about the grading system, see [Product Release Stages](http
| Connector | Product Release Stage| Available in Cloud? |
|:--------------------------------------------------------------------------------------------| :------------------- | :------------------ |
| [3PL Central](sources/tplcentral.md) | Alpha | No |
| [Adjust](sources/adjust.md) | Alpha | No |
| [Airtable](sources/airtable.md) | Alpha | No |
| [AlloyDB](sources/alloydb.md) | Generally Available | Yes |
| [Amazon Ads](sources/amazon-ads.md) | Beta | Yes |
docs/integrations/sources/adjust.md (new file, 41 lines)
@@ -0,0 +1,41 @@
# Airbyte Source Connector for Adjust

This is a setup guide for the Adjust source connector, which ingests data from the reports API.

## Prerequisites

An API token is required to fetch reports from the Adjust reporting API. See the [Adjust API authentication help article](https://help.adjust.com/en/article/report-service-api-authentication) on how to obtain a key.

Because Adjust allows you to set up custom events and other app-specific measures, only a subset of the available metrics is pre-selectable. To list all metrics that are available, query the filters data endpoint. Information about available metrics is available in the [Datascape metrics glossary](https://help.adjust.com/en/article/datascape-metrics-glossary).

### Full Metrics Listing
Take a look at the [filters data endpoint documentation](https://help.adjust.com/en/article/filters-data-endpoint) to see available filters. The example below shows how to obtain the events that are defined for your apps (replace `API_KEY` with the key obtained in the previous step):

```sh
curl --header 'Authorization: Bearer API_KEY' 'https://dash.adjust.com/control-center/reports-service/filters_data?required_filters=event_metrics' | jq
```

## Set up the Adjust source connector

1. Click **Sources** and then click **+ New source**.
2. On the Set up the source page, select **Adjust** from the Source type dropdown.
3. Enter a name for your new source.
4. For **API Token**, enter your API key obtained in the previous step.
5. For **Ingestion Start Date**, enter a date in YYYY-MM-DD format (UTC timezone is assumed). Data starting from this date will be replicated.
6. In the **Metrics to Ingest** field, select the metrics of interest to query.
7. Enter any additional custom metrics to query in the **Additional Metrics** box. Available metrics can be listed as described in the Prerequisites section. These metrics are assumed to be decimal values.
8. In the **Dimensions** field, select the dimensions to group metrics by.
9. Click **Set up source**.

## Supported sync modes

The source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes):

- Full Refresh
- Incremental

## Changelog

| Version | Date       | Pull Request                                             | Description      |
|---------|------------|----------------------------------------------------------|------------------|
| 0.1.0   | 2022-08-26 | [16051](https://github.com/airbytehq/airbyte/pull/16051) | Initial version. |