* add source onesignal
* update PR number in change log
* change how the source defines cursor and sync mode
* get correct max cursor time across stream slices
* code improvements per code review advice
* format code
* remove unused code
* remove TODOs
Co-authored-by: Maksym Pavlenok <antixar@gmail.com>
* create a new connector folder
* add base classes
* add schemas
* add ads/ad_groups streams
* update tests and docs
* add a bootstrap file
* update the base domain name for API
* update Dockerfile libs
* remove unused comments
* Update airbyte-integrations/connectors/source-tiktok-marketing/bootstrap.md
Co-authored-by: George Claireaux <george@claireaux.co.uk>
* Update airbyte-integrations/connectors/source-tiktok-marketing/bootstrap.md
Co-authored-by: George Claireaux <george@claireaux.co.uk>
* fix UI error with input parameters
* fix problem with updated state
* Update airbyte-integrations/connectors/source-tiktok-marketing/source_tiktok_marketing/spec.py
Co-authored-by: Davin Chia <davinchia@gmail.com>
* add a unit test
* update README.md
* bump version
Co-authored-by: Maksym Pavlenok <maksym.pavlenok@globallogic.com>
Co-authored-by: George Claireaux <george@claireaux.co.uk>
Co-authored-by: Davin Chia <davinchia@gmail.com>
Implement new destination connector for databricks delta lake.
Resolves #2075.
Co-authored-by: George Claireaux <george@claireaux.co.uk>
Co-authored-by: Sherif A. Nada <snadalive@gmail.com>
* New abstraction for NoSql database sources
* New MongoDbSource: partial impl
* Added MongoDataType
* Improved MongoDatabase and fixed read method
* code review changes
* merge cleanup
* Renamed NoSqlDatabase to AbstractDatabase
* formatter changes
* code review changes: changed mongodb-new to mongodb-v2; left only new connector info in all docs
* updated spec.json and toDatabaseConfig() method
* updated doc according to spec.json changes
Co-authored-by: Iryna Kruk <iryna.o.kruk@globallogic.com>
* new mongo db destination
* fix remarks
* updated documentation and added loggers
* updated documentation
* added hashCode field to mongodb document and fixed minor remarks
* fix code style
* updated mongodb data hash from integer to UUID string
* Add Google Analytics v4 implementation
* Add docs and connector index
* Fix a broken link to Airbyte CDK
* Fix a broken link to source acceptance tests docs
* Add condition for a full refresh or incremental stream
* Add unit tests
* Fix formatting to flake8
* Updated per review
* Added logger for custom reports validation
* Add comments to code
* Updated formatting
* Initial version of Apify Dataset source connector
* Add apify dataset to source definition
* Make sure clean is False by default
* Remove need for user id and token since it is not needed for reading dataset
* Add comment
* Update README
* Add docs to summary
* Add changelog to readme
* Add link to README
* Add PR link
* Address comments
* Add newline
* added secrets stuff
* added environment more-secrets
* added more-secrets environment
* removed environment more-secrets to add in separate PR
* Docs nits
* Make sure that dataset items come in the correct order
* lint
* Use partial function
* lint
* Address comments:
* newline
* format fix
* format
* bump version for formatting fix
Co-authored-by: Matej Hamas <matej.hamas@gmail.com>
* Prepare Chargebee connector for publishing
* Update docs
Update `docs/SUMMARY.md` file.
Update `docs/integrations/README.md` file.
* Update changelog
* Implement change request
* Remove `name` field from streams
* Rename env var for Chargebee
Rename from `CHARGEBEE_TEST_CREDS` to `CHARGEBEE_INTEGRATION_TEST_CREDS`.
* Revert "Rename env var for Chargebee"
This reverts commit 7ddc6e0cb1.
* Revert "Revert "Rename env var for Chargebee""
This reverts commit 6df6751034.
* Add custom backoff handler
* Implement change request
* Add comment about why `order` is an empty stream
* Bump connector version
* unfinished jdbcsource separation
* create AbstractRelation
* Migrate StateManager to new abstract level (JdbcSource -> RelationalSource)
* fix imports
* move configs to Database level + fix MySql source
* bring the jdbc source in line with the new impl
* Fix ScaffoldJavaJdbcSource template
* rename `AbstractField` to `CommonField`. Now it's not an abstract class.
+ add default implementation for `AbstractRelationalDbSource.getFullyQualifiedTableName`
* format
* rename generated files in line with their location
* bonus renaming
* move utility methods specific for jdbc source to a proper module
* internal review update
* BigQueryDatabase impl without row transformation
* add static factory method for BigQueryDatabase instantiation
* remove data type parameter limitation + rename class parameters
* Move DataTypeUtils from jdbc to common + impl basic types BigQueryUtils
* bring DB2 in line with the new relational abstract classes
* add missing import
* cover all bigquery classes + add type transformation method from StandardSQLTypeName to JsonSchemaPrimitive
* close unused connections
* add table list extract method
* bigquery source connector
* return all tables for a whole project instead of a dataset
* impl incremental fetch
* bigquery source connector
* bigquery source connector
* remove unnecessary databaseid
* add primitive type filtering
* add temporary workaround for test database.
* add dataset location
* fix table info retrieval
* handle dataset config
* Add working comprehensive test without data cases
* minor changes in the source processing
* acceptance tests; discover method fix
* discover method fix
* first comprehensive test
* Comprehensive tests for the BigQuery source + database timeout config
* bigquery acceptance tests fix; formatting
* fix incremental sync using date, datetime, time and timestamp types
* Implement source checks: basic and dataset
* format
* revert: airbyte_protocol.py
* internal review update
* Add possibility to get list of comprehensive tests in a Markdown table format.
* Update airbyte-integrations/connectors/source-bigquery/src/main/resources/spec.json
Co-authored-by: Sherif A. Nada <snadalive@gmail.com>
* review update
* Implement processing for arrays and structures
* format
* added bigquery secrets
* added bigquery secrets
* spec fix
* test configs fix
* extend mapping for Arrays and Structs
* Process nested arrays
* handle arrays of records properly.
* format
* BigQuery source docs
* docs readme update
* hide evidence
* fix changelog order
* Add bigquery to source_definitions.yaml
Co-authored-by: heade <danildubinin2@gmail.com>
Co-authored-by: Sherif A. Nada <snadalive@gmail.com>
* Adding Google Cloud Storage as destination
* Removed a few comments and amended the version
* Added documentation in docs/integrations/destinations/gcs.md
* Amended gcs.md with the right pull id
* Implemented all the fixes requested by tuliren as per https://github.com/airbytehq/airbyte/pull/4329
* Renaming all the files
* Branch aligned to S3 0.1.7 (with Avro and Jsonl). Removed redundant file by making S3 a dependency for GCS
* Removed some additional duplicates between GCS and S3
* Revert changes in the root files
* Revert jdbc files
* Fix package names
* Refactor gcs config
* Format code
* Fix gcs connection
* Format code
* Add acceptance tests
* Fix parquet acceptance test
* Add ci credentials
* Register the connector and update documentation
* Fix typo
* Format code
* Add unit test
* Add comments
* Update readme
Co-authored-by: Sherif A. Nada <snadalive@gmail.com>
Co-authored-by: Marco Fontana <marco.fontana@sohohouse.com>
Co-authored-by: marcofontana.ing@gmail.com <marcofontana.ing@gmail.com>
Co-authored-by: Marco Fontana <MaxwellJK@users.noreply.github.com>
Co-authored-by: Sherif A. Nada <snadalive@gmail.com>
* pre-PR
* add git config
* format
* Update airbyte-integrations/connectors/source-zendesk-sunshine/requirements.txt
update requirements.txt and remove extra
Co-authored-by: Eugene Kulak <widowmakerreborn@gmail.com>
* Update airbyte-integrations/connectors/source-zendesk-sunshine/source_zendesk_sunshine/streams.py
change backoff time from int to float (note: the actual type returned in headers is integer)
Co-authored-by: Eugene Kulak <widowmakerreborn@gmail.com>
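The int-to-float backoff change above can be illustrated with a minimal sketch. This is a hypothetical helper, not the connector's actual code: it parses a `Retry-After`-style header value as a float, so fractional waits are supported even though servers typically return whole seconds.

```python
def parse_backoff_time(headers: dict) -> float:
    """Parse a Retry-After-style header value into a float number of seconds.

    Servers usually send an integer, but returning float also handles
    fractional values; malformed or missing headers fall back to 0.0.
    """
    raw = headers.get("Retry-After", "0")
    try:
        return float(raw)
    except (TypeError, ValueError):
        return 0.0
```

For example, `parse_backoff_time({"Retry-After": "30"})` yields `30.0`, while a missing header yields `0.0`.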
* requested changes
* fix missing newline and remove unnecessary temp file
* url_base to property
* rm extra var coming property
* save
* finish updating the documentation
* forgotten definition
* add nullable to pass the test
* fix date in the log
Co-authored-by: Eugene Kulak <widowmakerreborn@gmail.com>
* Add missing files
Add files for publishing the connector.
Update typing in few files.
Add `amazon-seller-partner.md` file.
Add `Amazon Seller Partner` to `builds.md` and to `README.md` files.
* Remove release files
* Comment out tests in `acceptance-test-config.yml`
* Update few files
Remove `!Dockerfile.test` from `.dockerignore` file.
Add `dependencies` to `build.gradle` file.
* Update `amazon-seller-partner.md` file
* Add stream to docs
Add `GET_FLAT_FILE_ALL_ORDERS_DATA_BY_ORDER_DATE_GENERAL` stream.
* Add release info
Add connector to `source_definitions.yaml` file.
Add connector to `STANDARD_SOURCE_DEFINITION` folder.