🎉 New source: Babelforce (#12700)
* Solve deleted file
* Set Babelforce version to `0.1.0`
* Update Babelforce source documentation with the PR link
* Remove `.python-version` file in Babelforce source from Git
* Fix typo in Babelforce `spec` description field
* Remove redundant `title` field from JSON schema in Babelforce source
* Add examples for the `date_created_from` and `date_created_to` Babelforce source spec
* Remove redundant `static` `authenticator` method in Babelforce source
* Check if `date_created_from` is less than `date_created_to` in Babelforce source
* Add parsing details to `docstring` for Babelforce source
* Fix fetching next page token in Babelforce source
* Remove parent class `docstring` in Babelforce source
* Use `pendulum` to parse date/time in Babelforce source
* Add Babelforce to source index
* Remove icon
* Format files
* Auto-bump connector version

Co-authored-by: marcosmarxm <marcosmarxm@gmail.com>
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
Co-authored-by: Marcos Marx <marcosmarxm@users.noreply.github.com>
@@ -143,6 +143,13 @@
  icon: azureblobstorage.svg
  sourceType: database
  releaseStage: alpha
- name: Babelforce
  sourceDefinitionId: 971c3e1e-78a5-411e-ad56-c4052b50876b
  dockerRepository: airbyte/source-babelforce
  dockerImageTag: 0.1.0
  documentationUrl: https://docs.airbyte.com/integrations/sources/babelforce
  sourceType: api
  releaseStage: alpha
- name: BambooHR
  sourceDefinitionId: 90916976-a132-4ce9-8bce-82a03dd58788
  dockerRepository: airbyte/source-bamboo-hr
@@ -1607,6 +1607,60 @@
  supportsNormalization: false
  supportsDBT: false
  supported_destination_sync_modes: []
- dockerImage: "airbyte/source-babelforce:0.1.0"
  spec:
    documentationUrl: "https://docsurl.com"
    connectionSpecification:
      $schema: "http://json-schema.org/draft-07/schema#"
      title: "Babelforce Spec"
      type: "object"
      required:
        - "region"
        - "access_key_id"
        - "access_token"
      additionalProperties: false
      properties:
        region:
          type: "string"
          title: "Region"
          default: "services"
          description: "Babelforce region"
          enum:
            - "services"
            - "us-east"
            - "ap-southeast"
          order: 1
        access_key_id:
          type: "string"
          title: "Access Key ID"
          description: "The Babelforce access key ID"
          airbyte_secret: true
          order: 2
        access_token:
          type: "string"
          title: "Access Token"
          description: "The Babelforce access token"
          airbyte_secret: true
          order: 3
        date_created_from:
          type: "integer"
          title: "Date Created from"
          description: "Timestamp in Unix the replication from Babelforce API will\
            \ start from. For example 1651363200 which corresponds to 2022-05-01 00:00:00."
          examples:
            - 1651363200
          order: 4
        date_created_to:
          type: "integer"
          title: "Date Created to"
          description: "Timestamp in Unix the replication from Babelforce will be\
            \ up to. For example 1651363200 which corresponds to 2022-05-01 00:00:00."
          examples:
            - 1651363200
          order: 5
  supportsNormalization: false
  supportsDBT: false
  supported_destination_sync_modes: []
- dockerImage: "airbyte/source-bamboo-hr:0.2.2"
  spec:
    documentationUrl: "https://docs.airbyte.com/integrations/sources/bamboo-hr"
@@ -0,0 +1,6 @@
*
!Dockerfile
!main.py
!source_babelforce
!setup.py
!secrets
airbyte-integrations/connectors/source-babelforce/Dockerfile (new file, 38 lines)
@@ -0,0 +1,38 @@
FROM python:3.9.11-alpine3.15 as base

# build and load all requirements
FROM base as builder
WORKDIR /airbyte/integration_code

# upgrade pip to the latest version
RUN apk --no-cache upgrade \
    && pip install --upgrade pip \
    && apk --no-cache add tzdata build-base

COPY setup.py ./
# install necessary packages to a temporary folder
RUN pip install --prefix=/install .

# build a clean environment
FROM base
WORKDIR /airbyte/integration_code

# copy all loaded and built libraries to a pure basic image
COPY --from=builder /install /usr/local
# add default timezone settings
COPY --from=builder /usr/share/zoneinfo/Etc/UTC /etc/localtime
RUN echo "Etc/UTC" > /etc/timezone

# bash is installed for more convenient debugging.
RUN apk --no-cache add bash

# copy payload code only
COPY main.py ./
COPY source_babelforce ./source_babelforce

ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]

LABEL io.airbyte.version=0.1.0
LABEL io.airbyte.name=airbyte/source-babelforce
airbyte-integrations/connectors/source-babelforce/README.md (new file, 132 lines)
@@ -0,0 +1,132 @@
# Babelforce Source

This is the repository for the Babelforce source connector, written in Python.
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/babelforce).

## Local development

### Prerequisites
**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

#### Build & Activate Virtual Environment and install dependencies
From this connector directory, create a virtual environment:
```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:
```
source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
```
If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`.
If this is mumbo jumbo to you, don't worry about it, just put your deps in `setup.py` but install using `pip install -r requirements.txt` and everything
should work as you expect.

#### Building via Gradle
You can also build the connector in Gradle. This is typically used in CI and not needed for your development workflow.

To build using Gradle, from the Airbyte repository root, run:
```
./gradlew :airbyte-integrations:connectors:source-babelforce:build
```

#### Create credentials
**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/babelforce)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_babelforce/spec.json` file.
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See `integration_tests/sample_config.json` for a sample config file.

**If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `source babelforce test creds`
and place them into `secrets/config.json`.
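
For reference, a minimal `secrets/config.json` could look like the sketch below (field names follow `source_babelforce/spec.json`; all values are placeholders, and the two `date_created_*` fields are optional):
```
{
  "region": "services",
  "access_key_id": "<your Babelforce access key ID>",
  "access_token": "<your Babelforce access token>",
  "date_created_from": 1651363200
}
```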

### Locally running the connector
```
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

### Locally running the connector docker image

#### Build
First, make sure you build the latest Docker image:
```
docker build . -t airbyte/source-babelforce:dev
```

You can also build the connector image via Gradle:
```
./gradlew :airbyte-integrations:connectors:source-babelforce:airbyteDocker
```
When building via Gradle, the docker image name and tag, respectively, are the values of the `io.airbyte.name` and `io.airbyte.version` `LABEL`s in
the Dockerfile.

#### Run
Then run any of the connector commands as follows:
```
docker run --rm airbyte/source-babelforce:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-babelforce:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-babelforce:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-babelforce:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing
Make sure to familiarize yourself with [pytest test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery) to know how your test files and methods should be named.
First install test dependencies into your virtual environment:
```
pip install .[tests]
```

### Unit Tests
To run unit tests locally, from the connector directory run:
```
python -m pytest unit_tests
```

### Integration Tests
There are two types of integration tests: Acceptance Tests (Airbyte's test suite for all source connectors) and custom integration tests (which are specific to this connector).

#### Custom Integration tests
Place custom tests inside the `integration_tests/` folder, then, from the connector root, run
```
python -m pytest integration_tests
```

#### Acceptance Tests
Customize the `acceptance-test-config.yml` file to configure tests. See [Source Acceptance Tests](https://docs.airbyte.io/connector-development/testing-connectors/source-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.
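A sketch of such a fixture (the stock `integration_tests/acceptance.py` in this connector only yields; the scratch directory below is a hypothetical stand-in for a real resource):
```
import shutil
import tempfile

import pytest

pytest_plugins = ("source_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    # Hypothetical setup: provision whatever the acceptance tests need
    # (a scratch directory stands in for a real resource here).
    workdir = tempfile.mkdtemp(prefix="source-babelforce-acceptance-")
    yield
    # Teardown runs after the acceptance-test session finishes.
    shutil.rmtree(workdir, ignore_errors=True)
```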
To run your integration tests with acceptance tests, from the connector root, run
```
python -m pytest integration_tests -p integration_tests.acceptance
```
To run your integration tests with Docker, use the acceptance-test shell script bundled with this connector, which builds the connector image and runs the acceptance test image against it.

### Using gradle to run tests
All commands should be run from the airbyte project root.
To run unit tests:
```
./gradlew :airbyte-integrations:connectors:source-babelforce:unitTest
```
To run acceptance and custom integration tests:
```
./gradlew :airbyte-integrations:connectors:source-babelforce:integrationTest
```

## Dependency Management
All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:
* dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
* dependencies required for testing go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing unit and integration tests.
1. Bump the connector version in `Dockerfile` -- just increment the value of the `LABEL io.airbyte.version` appropriately (we use [SemVer](https://semver.org/)).
1. Create a Pull Request.
1. Pat yourself on the back for being an awesome contributor.
1. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -0,0 +1,23 @@
# See [Source Acceptance Tests](https://docs.airbyte.io/connector-development/testing-connectors/source-acceptance-tests-reference)
# for more information about how to configure these tests
connector_image: airbyte/source-babelforce:dev
tests:
  spec:
    - spec_path: "source_babelforce/spec.json"
  connection:
    - config_path: "secrets/config.json"
      status: "succeed"
    - config_path: "integration_tests/invalid_config.json"
      status: "failed"
  discovery:
    - config_path: "secrets/config.json"
  basic_read:
    - config_path: "secrets/config.json"
      configured_catalog_path: "integration_tests/configured_catalog.json"
      empty_streams: []
  incremental:
    - config_path: "secrets/config.json"
      configured_catalog_path: "integration_tests/configured_catalog.json"
  full_refresh:
    - config_path: "secrets/config.json"
      configured_catalog_path: "integration_tests/configured_catalog.json"
@@ -0,0 +1,16 @@
#!/usr/bin/env sh

# Build latest connector image
docker build . -t $(cat acceptance-test-config.yml | grep "connector_image" | head -n 1 | cut -d: -f2-)

# Pull latest acctest image
docker pull airbyte/source-acceptance-test:latest

# Run
docker run --rm -it \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v /tmp:/tmp \
    -v $(pwd):/test_input \
    airbyte/source-acceptance-test \
    --acceptance-test-config /test_input
@@ -0,0 +1,9 @@
plugins {
    id 'airbyte-python'
    id 'airbyte-docker'
    id 'airbyte-source-acceptance-test'
}

airbytePython {
    moduleDirectory 'source_babelforce'
}
@@ -0,0 +1,3 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#
@@ -0,0 +1,13 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#


import pytest

pytest_plugins = ("source_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    yield
@@ -0,0 +1,16 @@
{
  "streams": [
    {
      "stream": {
        "name": "calls",
        "supported_sync_modes": ["full_refresh", "incremental"],
        "source_defined_cursor": true,
        "default_cursor_field": ["dateCreated"],
        "source_defined_primary_key": [["id"]],
        "json_schema": {}
      },
      "sync_mode": "incremental",
      "destination_sync_mode": "overwrite"
    }
  ]
}
@@ -0,0 +1,5 @@
{
  "region": "services",
  "access_key_id": "wrong",
  "access_token": "wrong"
}
airbyte-integrations/connectors/source-babelforce/main.py (new file, 13 lines)
@@ -0,0 +1,13 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#


import sys

from airbyte_cdk.entrypoint import launch
from source_babelforce import SourceBabelforce

if __name__ == "__main__":
    source = SourceBabelforce()
    launch(source, sys.argv[1:])
@@ -0,0 +1,2 @@
-e ../../bases/source-acceptance-test
-e .
airbyte-integrations/connectors/source-babelforce/setup.py (new file, 27 lines)
@@ -0,0 +1,27 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#


from setuptools import find_packages, setup

MAIN_REQUIREMENTS = ["airbyte-cdk~=0.1", "python-dateutil==2.8.2"]

TEST_REQUIREMENTS = [
    "pytest~=6.1",
    "pytest-mock~=3.6.1",
    "source-acceptance-test",
]

setup(
    name="source_babelforce",
    description="Source implementation for Babelforce.",
    author="Airbyte",
    author_email="contact@airbyte.io",
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    package_data={"": ["*.json", "schemas/*.json", "schemas/shared/*.json"]},
    extras_require={
        "tests": TEST_REQUIREMENTS,
    },
)
@@ -0,0 +1,8 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#


from .source import SourceBabelforce

__all__ = ["SourceBabelforce"]
@@ -0,0 +1,203 @@
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "properties": {
    "id": {
      "maxLength": 32,
      "type": "string"
    },
    "parentId": {
      "maxLength": 32,
      "type": ["string", "null"]
    },
    "sessionId": {
      "maxLength": 32,
      "type": "string"
    },
    "conversationId": {
      "maxLength": 32,
      "type": "string"
    },
    "dateCreated": {
      "type": "string",
      "format": "date-time"
    },
    "dateEstablished": {
      "format": "date-time",
      "type": ["string", "null"]
    },
    "dateFinished": {
      "format": "date-time",
      "type": ["string", "null"]
    },
    "lastUpdated": {
      "type": "string",
      "format": "date-time"
    },
    "state": {
      "maxLength": 256,
      "type": "string"
    },
    "finishReason": {
      "maxLength": 256,
      "type": "string"
    },
    "from": {
      "type": ["string", "null"]
    },
    "to": {
      "type": ["string", "null"]
    },
    "type": {
      "maxLength": 256,
      "type": "string"
    },
    "source": {
      "maxLength": 256,
      "type": "string"
    },
    "domain": {
      "maxLength": 256,
      "type": "string"
    },
    "duration": {
      "type": ["integer", "null"]
    },
    "anonymous": {
      "type": ["boolean", "null"]
    },
    "recordings": {
      "type": ["array", "null"],
      "items": {
        "type": "object",
        "properties": {
          "id": {
            "maxLength": 32,
            "type": "string"
          },
          "dateCreated": {
            "type": "string",
            "format": "date-time"
          },
          "lastUpdated": {
            "type": "string",
            "format": "date-time"
          },
          "duration": {
            "type": "integer"
          },
          "url": {
            "type": "string"
          },
          "tags": {
            "type": "array",
            "items": {
              "type": "string"
            }
          },
          "file": {
            "type": ["object", "null"],
            "properties": {
              "id": {
                "maxLength": 32,
                "type": "string"
              },
              "state": {
                "type": ["string", "null"]
              },
              "name": {
                "type": ["string", "null"]
              },
              "size": {
                "type": ["integer", "null"]
              },
              "contentType": {
                "type": ["string", "null"]
              }
            }
          },
          "state": {
            "maxLength": 256,
            "type": "string"
          },
          "agent": {
            "type": "object",
            "properties": {
              "id": {
                "type": ["string", "null"]
              },
              "name": {
                "type": ["string", "null"]
              },
              "number": {
                "type": ["string", "null"]
              }
            }
          }
        }
      }
    },
    "bridged": {
      "type": ["object", "null"],
      "properties": {
        "id": {
          "type": ["string", "null"]
        },
        "name": {
          "type": ["string", "null"]
        },
        "number": {
          "type": ["string", "null"]
        }
      }
    }
  }
}
@@ -0,0 +1,150 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#

import operator
from abc import ABC
from datetime import datetime
from time import mktime
from typing import Any, Iterable, List, Mapping, MutableMapping, Optional, Tuple

import pendulum
import requests
from airbyte_cdk.sources import AbstractSource
from airbyte_cdk.sources.streams import Stream
from airbyte_cdk.sources.streams.http import HttpStream
from airbyte_cdk.sources.streams.http.auth import HttpAuthenticator
from dateutil.tz import tzutc

DEFAULT_CURSOR = "dateCreated"


class InvalidStartAndEndDateException(Exception):
    pass


# Basic full refresh stream
class BabelforceStream(HttpStream, ABC):
    page_size = 100

    def __init__(self, region: str, **args):
        super().__init__(**args)

        self.region = region

    @property
    def url_base(self) -> str:
        return f"https://{self.region}.babelforce.com/api/v2/"

    def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
        """
        :param response: the most recent response from the API
        :return If there is another page in the result, a mapping (e.g: dict) containing information needed to query the next page in the response.
                If there are no more pages in the result, return None.
        """
        pagination = response.json().get("pagination")

        if pagination.get("current"):
            return {"page": pagination.get("current", 0) + 1}
        else:
            return None

    def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping]:
        """
        Babelforce calls are sorted in reverse order. To process the calls in ascending order an in-memory sort is performed.

        :return an iterable containing each record in the response
        """
        items = response.json().get("items")
        items.sort(key=operator.itemgetter("dateCreated"))
        keys = self.get_json_schema().get("properties").keys()

        for item in items:
            yield {key: val for key, val in item.items() if key in keys}


# Basic incremental stream
class IncrementalBabelforceStream(BabelforceStream, ABC):
    cursor_field = DEFAULT_CURSOR

    def get_updated_state(self, current_stream_state: MutableMapping[str, Any], latest_record: Mapping[str, Any]) -> Mapping[str, Any]:
        if current_stream_state and current_stream_state.get(self.cursor_field):
            current_updated_at = pendulum.parse(current_stream_state.get(self.cursor_field))
        else:
            current_updated_at = datetime(1970, 1, 1)

        current_updated_at = current_updated_at.replace(tzinfo=tzutc())
        latest_record_updated_at = pendulum.parse(latest_record.get(self.cursor_field)).replace(tzinfo=tzutc())

        return {self.cursor_field: max(latest_record_updated_at, current_updated_at).isoformat(timespec="seconds")}


class Calls(IncrementalBabelforceStream):
    primary_key = "id"

    def __init__(self, date_created_from: int = None, date_created_to: int = None, **args):
        super(Calls, self).__init__(**args)

        self.date_created_from = date_created_from
        self.date_created_to = date_created_to

    def path(
        self, stream_state: Mapping[str, Any] = None, stream_slice: Mapping[str, Any] = None, next_page_token: Mapping[str, Any] = None
    ) -> str:
        return "calls/reporting"

    def request_params(
        self, stream_state: Mapping[str, Any], stream_slice: Mapping[str, any] = None, next_page_token: Mapping[str, Any] = None
    ) -> MutableMapping[str, Any]:
        page = next_page_token.get("page", 1) if next_page_token else 1

        params = {"page": page, "max": self.page_size}

        if stream_state:
            cursor_value = pendulum.parse(stream_state[self.cursor_field])
            self.date_created_from = int(mktime(cursor_value.timetuple()))

        if self.date_created_from and self.date_created_to and self.date_created_from > self.date_created_to:
            raise InvalidStartAndEndDateException("`date_created_from` should be less than or equal to `date_created_to`")

        if self.date_created_from:
            params.update({"filters.dateCreated.from": self.date_created_from})

        if self.date_created_to:
            params.update({"filters.dateCreated.to": self.date_created_to})

        return params


class SourceBabelforce(AbstractSource):
    def check_connection(self, logger, config) -> Tuple[bool, any]:
        try:
            authenticator = BabelforceAuthenticator(access_key_id=config.get("access_key_id"), access_token=config.get("access_token"))
            calls = Calls(region=config.get("region"), authenticator=authenticator)

            test_url = f"{calls.url_base}{calls.path()}?max=1"
            response = requests.request("GET", url=test_url, headers=authenticator.get_auth_header())

            if response.ok:
                return True, None
            else:
                response.raise_for_status()
        except Exception as exception:
            return False, exception

    def streams(self, config: Mapping[str, Any]) -> List[Stream]:
        date_created_from = config.get("date_created_from")
        date_created_to = config.get("date_created_to")
        region = config.get("region")

        auth = BabelforceAuthenticator(access_key_id=config.get("access_key_id"), access_token=config.get("access_token"))
        return [Calls(authenticator=auth, region=region, date_created_from=date_created_from, date_created_to=date_created_to)]


class BabelforceAuthenticator(HttpAuthenticator):
    def __init__(self, access_key_id: str, access_token: str):
        self.access_key_id = access_key_id
        self.access_token = access_token

    def get_auth_header(self) -> Mapping[str, Any]:
        return {"X-Auth-Access-ID": self.access_key_id, "X-Auth-Access-Token": self.access_token}
@@ -0,0 +1,48 @@
{
  "documentationUrl": "https://docsurl.com",
  "connectionSpecification": {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "title": "Babelforce Spec",
    "type": "object",
    "required": ["region", "access_key_id", "access_token"],
    "additionalProperties": false,
    "properties": {
      "region": {
        "type": "string",
        "title": "Region",
        "default": "services",
        "description": "Babelforce region",
        "enum": ["services", "us-east", "ap-southeast"],
        "order": 1
      },
      "access_key_id": {
        "type": "string",
        "title": "Access Key ID",
        "description": "The Babelforce access key ID",
        "airbyte_secret": true,
        "order": 2
      },
      "access_token": {
        "type": "string",
        "title": "Access Token",
        "description": "The Babelforce access token",
        "airbyte_secret": true,
        "order": 3
      },
      "date_created_from": {
        "type": "integer",
        "title": "Date Created from",
        "description": "Timestamp in Unix the replication from Babelforce API will start from. For example 1651363200 which corresponds to 2022-05-01 00:00:00.",
        "examples": [1651363200],
        "order": 4
      },
      "date_created_to": {
        "type": "integer",
        "title": "Date Created to",
        "description": "Timestamp in Unix the replication from Babelforce will be up to. For example 1651363200 which corresponds to 2022-05-01 00:00:00.",
        "examples": [1651363200],
        "order": 5
      }
    }
  }
}
@@ -0,0 +1,3 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#
@@ -0,0 +1,75 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#

from datetime import datetime
from unittest.mock import MagicMock

import pytest
from source_babelforce.source import Calls, InvalidStartAndEndDateException


@pytest.fixture
def patch_base_class(mocker):
    # Mock abstract methods to enable instantiating abstract class
    mocker.patch.object(Calls, "path", "v0/example_endpoint")
    mocker.patch.object(Calls, "primary_key", "test_primary_key")
    mocker.patch.object(Calls, "__abstractmethods__", set())


def test_request_params(patch_base_class):
    stream = Calls(region="services")
    inputs = {"stream_slice": None, "stream_state": None, "next_page_token": {"page": 1}}
    expected_params = {"max": 100, "page": 1}
    assert stream.request_params(**inputs) == expected_params


def test_date_created_from_less_than_date_created_to_not_raise_exception(patch_base_class):
    try:
        stream = Calls(region="services", date_created_from=1, date_created_to=2)
        inputs = {"stream_slice": None, "stream_state": None, "next_page_token": {"page": 1}}
        stream.request_params(**inputs)
    except InvalidStartAndEndDateException as exception:
        assert False, exception


def test_date_created_from_less_than_date_created_to_raise_exception(patch_base_class):
    with pytest.raises(InvalidStartAndEndDateException):
        stream = Calls(region="services", date_created_from=2, date_created_to=1)
        inputs = {"stream_slice": None, "stream_state": None, "next_page_token": {"page": 1}}
        stream.request_params(**inputs)


def test_parse_response(patch_base_class):
    stream = Calls(region="services")

    fake_date_str = "2022-04-27T00:00:00"
    fake_date = datetime.strptime(fake_date_str, "%Y-%m-%dT%H:%M:%S")

    fake_item = {
        "id": "abc",
        "parentId": "abc",
        "sessionId": "abc",
        "conversationId": "abc",
        "dateCreated": fake_date,
        "dateEstablished": None,
        "dateFinished": fake_date,
        "lastUpdated": fake_date,
        "state": "completed",
        "finishReason": "unreachable",
        "from": "123",
        "to": "123",
        "type": "outbound",
        "source": "queue",
        "domain": "internal",
        "duration": 0,
        "anonymous": False,
        "recordings": [],
        "bridged": {}
    }

    fake_call_json = {
        "items": [fake_item]
    }
    inputs = {"response": MagicMock(json=MagicMock(return_value=fake_call_json))}
    assert next(stream.parse_response(**inputs)) == fake_item
@@ -0,0 +1,41 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#

from dateutil.parser import parse
from dateutil.tz import tzutc
from pytest import fixture
from source_babelforce.source import DEFAULT_CURSOR, IncrementalBabelforceStream


@fixture
def patch_incremental_base_class(mocker):
    # Mock abstract methods to enable instantiating abstract class
    mocker.patch.object(IncrementalBabelforceStream, "path", "v0/example_endpoint")
    mocker.patch.object(IncrementalBabelforceStream, "primary_key", "test_primary_key")
    mocker.patch.object(IncrementalBabelforceStream, "__abstractmethods__", set())


def test_cursor_field(patch_incremental_base_class):
    stream = IncrementalBabelforceStream(region="services")
    expected_cursor_field = DEFAULT_CURSOR
    assert stream.cursor_field == expected_cursor_field


def test_get_updated_state(patch_incremental_base_class):
    stream = IncrementalBabelforceStream(region="services")
    fake_date = "2022-02-01T00:00:00"
    inputs = {"current_stream_state": None, "latest_record": {DEFAULT_CURSOR: fake_date}}
    expected_state = {DEFAULT_CURSOR: parse(fake_date).replace(tzinfo=tzutc()).isoformat(timespec="seconds")}
    assert stream.get_updated_state(**inputs) == expected_state


def test_supports_incremental(patch_incremental_base_class, mocker):
    mocker.patch.object(IncrementalBabelforceStream, "cursor_field", "dummy_field")
    stream = IncrementalBabelforceStream(region="services")
    assert stream.supports_incremental


def test_source_defined_cursor(patch_incremental_base_class):
    stream = IncrementalBabelforceStream(region="services")
    assert stream.source_defined_cursor
@@ -0,0 +1,23 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#

from unittest.mock import MagicMock

import requests
from source_babelforce.source import SourceBabelforce


def test_check_connection(mocker):
    source = SourceBabelforce()
    logger_mock, config_mock = MagicMock(), MagicMock()
    mocker.patch.object(requests, "request", return_value=MagicMock(ok=True))
    assert source.check_connection(logger_mock, config_mock) == (True, None)


def test_streams(mocker):
    source = SourceBabelforce()
    config_mock = MagicMock()
    streams = source.streams(config_mock)
    expected_streams_number = 1
    assert len(streams) == expected_streams_number
@@ -0,0 +1,64 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#

from http import HTTPStatus
from unittest.mock import MagicMock

import pytest
from source_babelforce.source import BabelforceStream


@pytest.fixture
def patch_base_class(mocker):
    # Mock abstract methods to enable instantiating abstract class
    mocker.patch.object(BabelforceStream, "path", "v0/example_endpoint")
    mocker.patch.object(BabelforceStream, "primary_key", "test_primary_key")
    mocker.patch.object(BabelforceStream, "__abstractmethods__", set())


def test_next_page_token(patch_base_class):
    stream = BabelforceStream(region="services")

    json_response_mock = MagicMock()
    json_response_mock.json.return_value = {"pagination": {"current": 1}}

    inputs = {"response": json_response_mock}
    expected_token = {"page": 2}
    assert stream.next_page_token(**inputs) == expected_token


def test_request_headers(patch_base_class):
    stream = BabelforceStream(region="services")
    inputs = {"stream_slice": None, "stream_state": None, "next_page_token": None}
    expected_headers = {}
    assert stream.request_headers(**inputs) == expected_headers


def test_http_method(patch_base_class):
    stream = BabelforceStream(region="services")
    expected_method = "GET"
    assert stream.http_method == expected_method


@pytest.mark.parametrize(
    ("http_status", "should_retry"),
    [
        (HTTPStatus.OK, False),
        (HTTPStatus.BAD_REQUEST, False),
        (HTTPStatus.TOO_MANY_REQUESTS, True),
        (HTTPStatus.INTERNAL_SERVER_ERROR, True),
    ],
)
def test_should_retry(patch_base_class, http_status, should_retry):
    response_mock = MagicMock()
    response_mock.status_code = http_status
    stream = BabelforceStream(region="services")
    assert stream.should_retry(response_mock) == should_retry


def test_backoff_time(patch_base_class):
    response_mock = MagicMock()
    stream = BabelforceStream(region="services")
    expected_backoff_time = None
    assert stream.backoff_time(response_mock) == expected_backoff_time
docs/integrations/sources/babelforce.md (new file, 50 lines)
@@ -0,0 +1,50 @@
# Babelforce

## Overview

The Babelforce source supports _Full Refresh_ as well as _Incremental_ syncs.

_Full Refresh_ sync means every time a sync is run, Airbyte will copy all rows in the tables and columns you set up for replication into the destination in a new table.
_Incremental_ sync means only changed resources are copied from Babelforce. For the first run, it will be a Full Refresh sync.
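The cursor is the call's `dateCreated` field: after each sync the connector stores a state object such as `{"dateCreated": "2022-05-01T00:00:00+00:00"}` and, on the next run, only requests calls created from that timestamp onward (see the `Calls` stream implementation).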

### Output schema

Several output streams are available from this source:

* [Calls](https://api.babelforce.com/#af7a6b6e-b262-487f-aabd-c59e6fe7ba41)

If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose)

### Features

| Feature | Supported? |
| :--- | :--- |
| Full Refresh Sync | Yes |
| Incremental Sync | Yes |
| Replicate Incremental Deletes | Coming soon |
| SSL connection | Yes |
| Namespaces | No |

### Performance considerations

There are no performance considerations in the current version.

## Getting started

### Requirements

* Region/environment as listed in the `Regions & environments` section [here](https://api.babelforce.com/#intro)
* Babelforce access key ID
* Babelforce access token
* (Optional) A start date for the import, as an epoch Unix timestamp (see the snippet below)
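
A timestamp in that format can be computed in Python, for example (this reproduces the `1651363200` example used in the connector spec):

```python
from datetime import datetime, timezone

# 2022-05-01 00:00:00 UTC expressed as an epoch Unix timestamp -> 1651363200
start_date = int(datetime(2022, 5, 1, tzinfo=timezone.utc).timestamp())
print(start_date)
```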

### Setup guide

Generate an API access key ID and token using the [Babelforce documentation](https://help.babelforce.com/hc/en-us/articles/360035622132-API-documentation-and-endpoints-an-introduction).
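
To sanity-check the credentials outside Airbyte, you can mirror the connector's own connection check, which requests a single record from the `calls/reporting` endpoint using the `X-Auth-Access-ID` and `X-Auth-Access-Token` headers. A sketch with `requests` (region, key, and token are placeholders):

```python
import requests

region = "services"  # placeholder: services, us-east or ap-southeast
headers = {
    "X-Auth-Access-ID": "<access key ID>",
    "X-Auth-Access-Token": "<access token>",
}

# Equivalent to the request issued by SourceBabelforce.check_connection
response = requests.get(f"https://{region}.babelforce.com/api/v2/calls/reporting", params={"max": 1}, headers=headers)
response.raise_for_status()
print(response.json().get("pagination"))
```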

## CHANGELOG

| Version | Date       | Pull Request                                             | Subject                     |
|:--------|:-----------|:---------------------------------------------------------|:----------------------------|
| 0.1.0   | 2022-05-09 | [12700](https://github.com/airbytehq/airbyte/pull/12700) | Introduce Babelforce source |